
“Dude, what am I gonna do with a touchscreen? It’s crazy. I can’t feel anything.”

March 26, 2025

When mobile phones with flat touchscreens started becoming pervasive, it wasn’t clear exactly how people who were blind or low vision would use these devices. Research led by CREATE associate director Jacob O. Wobbrock may have influenced the direction that major technology companies like Apple and Google took to address this issue.

The original iPhone — with a touchscreen but no accessibility features for people who were blind — was released in June 2007. A few years later, Tim Paulding, who is blind, ran his fingers across the smooth glass screen of an iPod Touch for the first time. He was working as a counselor at a summer camp for blind kids outside Grand Rapids, Michigan, when a friend handed him the then top-of-the-line iPod.

“I’m like, ‘Dude, what am I gonna do with a touchscreen?’ It’s crazy. I can’t feel anything,” said Paulding.

Paulding was familiar with Symbian and Windows mobile phones, which featured keyboards with raised buttons. “I loved them — I could type so fast on those keyboards,” he said.

Then his friend showed him an innovation on the new device. “You drag your finger around and it reads what’s under your finger,” Paulding said. “I just thought it was really amazing. Now, I use something like that every day.”

Today, iPhones and Android phones come standard with built-in touch-based screen readers. A screen reader helps people who are blind or visually impaired use the device by reading aloud the text that appears on the screen and letting them interact with apps through gestures. Apple’s version is called VoiceOver and Android’s is called TalkBack.
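The core idea behind touch exploration — the interaction technique these screen readers share — can be sketched in a few lines: map the finger’s coordinates to the on-screen element beneath it, then speak that element’s label. This is a minimal, hypothetical illustration; the names and structures here are not from VoiceOver, TalkBack, or Slide Rule.

```python
# A minimal sketch of touch-exploration hit-testing, the pattern behind
# touch-based screen readers. All names here are illustrative assumptions,
# not any real screen-reader API.

from dataclasses import dataclass

@dataclass
class Element:
    label: str   # text a speech synthesizer would read aloud
    x: int       # top-left corner of the element's bounding box
    y: int
    w: int       # width and height of the bounding box
    h: int

def element_under_finger(elements, tx, ty):
    """Return the element whose bounding box contains the touch point, if any."""
    for e in elements:
        if e.x <= tx < e.x + e.w and e.y <= ty < e.y + e.h:
            return e
    return None

def speak_on_touch(elements, tx, ty):
    """Return what a screen reader would announce as the finger drags to (tx, ty)."""
    hit = element_under_finger(elements, tx, ty)
    return hit.label if hit else None

# A toy "screen" with two buttons side by side:
screen = [
    Element("Play", 0, 0, 100, 50),
    Element("Next track", 100, 0, 100, 50),
]
```

Dragging a finger to a point inside the second button — `speak_on_touch(screen, 130, 25)` — returns `"Next track"`, which the device would then speak aloud. Real systems add gesture recognition on top (flicks, taps, multi-finger swipes) so that hearing a label and activating an element are separate steps.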

“It’s rare for academic projects to have quite this line of sight from their academic origins to widespread industry adoption. This is about as close of a case that ever happens.”

Jacob O. Wobbrock, founding Co-Director of CREATE; professor in the UW iSchool


Slide Rule research: early influencer of ubiquitous touch screen technology?

On Wobbrock’s team, Shaun Kane ― then an iSchool Ph.D. student and now a Google Research scientist and a CREATE Advisory Board member ― and Jeffrey Bigham, CSE Ph.D. ’09, developed one of the first touch-based screen readers, called Slide Rule, in 2007. 

“The iPhone really sparked a conversation about what mobile devices should be like and what are good ways to interact with them,” Kane said. “These smartphones were attractive devices. They were fun. People were excited about them. They certainly had a coolness about them.”

Kane has a disability that limits movement in his hand. When he started on Slide Rule, he didn’t know anyone who was blind, but he felt his research was “disability agnostic.”

When the iPhone arrived, Kane ruminated on how to overcome its accessibility issues. Then he started talking to people. “I was really motivated by this as a problem where there isn’t an obvious solution,” Kane said.

His idea was to create an iPhone app that would read aloud the information on the screen and let visually impaired users interact with it through a series of gestures.

Wobbrock encouraged Kane to work on the idea. At the time, Apple didn’t allow third-party programmers to create apps for the iPhone, but in 2007 Kane figured out how to do it within a couple of months.

“Shaun Kane, brilliant as he is, found a way to essentially jailbreak the phone,” said Wobbrock, joking that doing so voided the warranty.

Most of the people with whom Kane discussed the issue felt he was taking the wrong approach. Some felt that an attachment with buttons would be the correct course. Others argued that they needed to convince Apple to produce a version of the iPhone with tactile buttons.

“Well, it’s much easier to change software than it is to convince people to change hardware, right?” Kane said.

A couple of months later, Kane, Wobbrock and Bigham had developed Slide Rule, a touch-based screen reader prototype that included an email client, a music player, and a contact list.

Kane is proud of Slide Rule’s approach, and he took a lasting lesson from the project, one he has relied on ever since: if someone downplays your idea, keep going until you decide for yourself whether it will work.

“It’s definitely a project that I am known for,” Kane said. “Often, people will say, ‘Oh, we read this paper in class.’ That’s really nice.”

Kane posted a video of the Slide Rule project on YouTube on May 14, 2008. Kane, Wobbrock and Bigham’s paper on the project, “Slide Rule: Making mobile touch screens accessible to blind people using multi-touch interaction,” was published in October 2008.

“Keep going until you decide for yourself whether the idea will work.”

Shaun Kane

VoiceOver and TalkBack come online

Apple introduced VoiceOver on the iPhone in June 2009 with the iPhone 3GS. Wobbrock said he and Kane looked at VoiceOver soon after it was released.

“The resemblance to Slide Rule was striking in terms of the gestures and design,” Wobbrock said. “Of course, VoiceOver was a full-fledged industry product. It had the features, preferences and polish that a commercial product has and a research project doesn’t.”

Wobbrock learned that Apple engineers were familiar with his team’s work. The reason? An email from an Apple engineer.

The engineer wrote to Bigham’s advisor — and CREATE’s Director for Education Emeritus — Richard Ladner: “We definitely read through the existing literature before starting. I can say we were certainly aware of this project. We were quite excited to see the [Slide Rule] video when it popped up.” Ladner has kept the identity of the engineer anonymous all these years.

Did Slide Rule influence VoiceOver and TalkBack? Or was it a case of great minds thinking alike? Kane shies away from that debate.

“You should ask other people about that,” Kane said. “I stay out of that, because I’ve heard differing accounts. I think as soon as you see a device that’s transformative in this way, with a new kind of interface, we can all start to see what the accessibility problems might be.”

A 2019 Impact Award

In 2019, the Slide Rule research team won the SIGACCESS ASSETS Paper Impact Award, which honors a paper at least 10 years old that has had significant impact on information technology that addresses the needs of people with disabilities. 

Kathleen McCoy was the chairperson of the committee that selected the iSchool paper for the award. She said, at the time the award was given, it was perhaps the most influential ASSETS paper ever published. She still uses the paper in her own classes to teach how to write about accessibility. 

“It’s a beautiful paper for the field of human-computer interaction and accessibility, going all the way from a formative study of what would you want to do with this phone for people who are blind to having ideas of how to fix that,” McCoy said.

Was Apple influenced by the iSchool’s research? She points to the timeline: the iSchool team posted its video and published its paper the year before Apple first released VoiceOver on the iPhone.

“Just that it had the same features, the same gestures, I think that’s pretty good evidence that there was some influence going on,” McCoy said. 

She thinks the iSchool research made a lasting difference in the lives of people who are blind. She called it a snowball effect, allowing iPhone users to access other apps that enable even more accessibility.

“What would the world have looked like if someone hadn’t figured this out?” McCoy said. 

From first swipe to empowered independence

For his part, Paulding, whose life changed with touch-based screen readers, recently listened to the Slide Rule video that Kane posted in 2008. 

“It sounded like a screen reader on an iPhone to me,” Paulding said.

Paulding was born with a condition called congenital rubella syndrome after his mother was exposed to German measles when she was pregnant with him. He has no right eye and lost vision in his left eye when he was about 30.

Paulding has taught other people with visual disabilities how to use VoiceOver. Based in Spokane, he works as an orientation and mobility specialist at Lighthouse for the Blind, a nonprofit and CREATE Community Partner.

Technology is near the top of the list, if not the top, for helping people with visual disabilities live an independent life, he said. 

“What Apple has done with accessibility has really revolutionized what you can do as a person who’s blind using a smartphone,” Paulding said.

He points to apps that have improved accessibility. Soundscape offers people who are blind or low vision information about nearby businesses or the street grid. An app called OKO – AI Copilot for the Blind can be used at intersections without an audible pedestrian signal to tell whether the signal says “Walk” or “Don’t Walk.”

Mainstream apps such as Google Maps and Apple Maps help guide people who are blind or low-vision through neighborhoods.

“The power of being able to have apps that do things for you, that read things for you, where you can get information and knowledge, that kind of power promotes independence in a huge way,” Paulding said. “It’s massive.”


This article was excerpted from the UW iSchool article by Jim Davis.