At about this time in 2007, Apple released what some have called the most successful single product in tech history: the iPhone. The tech world swooned and people lined up to get one. But if you had a disability – especially if you were blind – the iPhone’s touch screen was just a cold, unresponsive piece of glass. A lot of people felt locked out of the iPhone tech revolution.
But two years after the first iPhones hit the streets – 10 years ago this week – everything changed because the iPhone learned to talk; that’s when it first became “accessible.”
Texas Standard’s Shelly Brisbin has been exploring that history in her new audio documentary “36 Seconds That Changed Everything: How the iPhone Learned to Talk.”
To be accessible, a device like the iPhone needs to have features that allow a person with a disability to use it, even if that person can’t see or touch the screen the way someone without a disability would.
“If the way you use your phone is to tap and swipe on a piece of glass, you need to be able to see where to do that, or your fingers have to have the motor control to do it,” Brisbin says.
But the original iPhone didn’t have any accessibility features. Brisbin had a personal stake in what was missing.
“I was actually in the hall when Steve Jobs announced the iPhone, covering the Macworld conference,” Brisbin says. “And I sat there wondering if I’d ever be able to use this thing I had to write about. I have low vision, which means I need to make text on screens bigger to see it, and I turn the brightness way down, too.”
Brisbin says that a lot of people who are blind, and who were really unhappy with their phones then, were hoping that Apple, which had added speech software in its computers a couple of years before, would give the phone a spoken interface.
Brisbin talked with a number of people who followed accessible technology at the time the iPhone was announced. One was Jonathan Mosen.
“It was very clear to me that market share was moving quickly to the iPhone, and everybody was talking about the iPhone,” he said. “And I remember picking up a friend’s iPhone and feeling this blank piece of glass, essentially, with just a button on the bottom of the glass, and thinking, ‘Man, we are going to be locked out of this thing and it’s a real concern.'”
To use a cellphone, or a computer for that matter, a blind person needs software called a screen reader. That’s the voice that tells the user where he or she is, onscreen, and reads websites or documents aloud. A blind person could get a smartphone from Samsung or Nokia, and then would have to buy screen reader software for it.
“And that would run you $250!” Brisbin says. “Basically, the phone, plus the screen reader, cost about the same as an iPhone at that time.”
Individuals and blindness organizations let Apple know they wanted an accessible iPhone. There were even threats of lawsuits. But Apple, which usually keeps quiet about its plans, didn’t make any promises.
“In the spring of 2009, there were some leaks, and some people were kind of told on the down-low that something big was coming for accessibility,” Brisbin says. “And at Apple’s Worldwide Developers Conference in June, they finally announced accessibility for the iPhone.”
Brisbin says she took the name for her documentary from the announcement an Apple executive made from the stage that day.
“Yes! 36 seconds is the amount of time it took to announce VoiceOver, the screen reader software, and a couple of features for folks with low vision as well,” she says. “It sounds kind of awkward, and it’s hard to hear as the revolutionary thing it turned out to be.”
For blind people who hated their phones and really wanted to use an iPhone, the announcement was consequential.
Brisbin asked Cara Quinn, who went on to make her living as an iPhone app developer, what she felt that day.
“Shock and awe. Amazement – I was very emotional. That whole day was spent for me being very emotional. It makes me emotional just thinking about it now,” Quinn said.
With VoiceOver installed, any blind person could pick up any iPhone – the same phone anyone else would use – and not have to pay extra or use different software or worry that an app wouldn’t work correctly.
Brisbin says developers started creating apps that did specific things that were useful to people who are blind or have low vision. Quinn described what that made possible.
“You know, when you can bring up an app, and you can show it something in print, and almost immediately it will tell you what it is, or tell you the lights are on or the lights are off, and this is all coming from one device that you hold in your hand. And that, 10 years ago, was not possible,” Quinn said.
Brisbin’s audio documentary, “36 Seconds That Changed Everything” is online now.
Copy edited by Caroline Covington.