
“36 Seconds” Transcript


This post contains all of the material from “36 Seconds That Changed Everything,” an audio documentary by Shelly Brisbin.

“This is one device. And we are calling it iPhone. Today, today, Apple is going to reinvent the phone.” That’s how Steve Jobs introduced the iPhone.

In 2007, the iPhone took the tech world by storm. And I was there for Steve Jobs’ big announcement. Here was a tiny device that combined a cell phone, an iPod and a link to the Internet – and you connected to it all solely via a touch screen. And six months after Jobs announced the phone, thousands of eager buyers got their hands on it – many of them had stood in line for hours to pay hundreds of dollars for a rectangle of glass and plastic. The iPhone has been called the most successful single product in technology history. And it has spawned an economy of app developers, and a lot of would-be imitators, too.

But when those lines formed around Apple stores, and news outlets devoted special coverage to the hottest thing in mobile tech – ever – one group of potential fans remained on the sidelines. They had the five hundred dollar price of admission – or many did. But for them, the iPhone was a disappointment, a step backward, a slap in the face, even. The cold, smooth piece of glass, with its flawless looks and choice of cheery ringtones, locked these enthusiasts out.

If you had a disability, the iPhone parade passed you by.

And no wonder. How could a deaf person understand someone on the other end of a phone call? How could a blind person know where to tap on a touch screen? How could a person with a motor disability press the Home button?

In 2007, even those who were being left out asked these questions. Even people with disabilities wondered how and whether the door to progress would ever open for them.

But this is not a story of exclusion. This is a story of how everything changed, exactly two years later – how a device noted for its visual design learned to talk – how a group of people who were initially shut out achieved not only equality with their nondisabled peers, but a measure of independence that no one, not even the designers at Apple, could have truly anticipated. On June 19, 2009, the iPhone became accessible.

Today, the smartphone has come so far that it’s easy to take it for granted. Most of us carry one, and depend on it to stay in touch, remain productive on the road, and even find our way along that road to our next destination. That’s just as true for those of us who are blind, have a hearing loss, or experience physical disabilities. And in addition to the thousands of apps that an iPhone user with a disability can use just the way anyone else would, a slew of clever tools designed to meet particular needs have made our phones indispensable in unique ways.

We use the camera to identify objects or colors. We read restaurant menus. We navigate safely by foot and transit. We hear more of what’s happening in our environment. We learn in ways that make the most sense, based on our abilities.

We thrive.

And this technology, whether held in one hand, or mounted on a wheelchair, streamed through hearing aids or displayed onscreen in big, bold letters, is the very same technology you use to make your life better.

Before There Was An iPhone

To understand how far we’ve come, we need to travel back to the mid-aughts, before the iPhone. Cell phones – for pretty much everyone – were just phones. Some had Internet access, but many didn’t, especially in the United States, which was slower than other parts of the world to adopt truly smart phones. Early iPhone buyers, then, weren’t just craving something pretty or trendy, though iPhones were certainly both of these. With an iPhone, many held the Internet in their hand for the first time.

And it was the iPhone’s most striking feature – the touch screen – that locked people with a variety of disabilities out.

Jonathan Mosen, a New Zealand native who has spent much of his career reviewing technology specifically designed for blind people, and who’s worked for major companies that designed it, was happy with his phone in 2007. But he saw how the larger world’s quick adoption of the iPhone was likely to leave people like him behind.

“It was very clear to me that market share was moving quickly to the iPhone, and everybody was talking about the iPhone. And I remember picking up a friend’s iPhone and feeling this blank piece of glass, essentially, with just a button on the bottom of the glass, and thinking ‘man, we are going to be locked out of this thing and it’s a real concern,’” Mosen says.

Why were people like Mosen so farsighted – pun intended – about how important the iPhone would become, and about why they needed access to it, too? Some of it came from their own hard experience with technology.

“When I was growing up, I did a lot of my work in Braille, and everyone else did theirs in print. I had lots of devices and technology support, but it wasn’t easy to integrate that with what other kids were doing. And so I was always sort of off by myself, in my own little corner, literally and figuratively, and I always felt really excluded,” says Steve Sawczyn. He’s an accessibility consultant in Minneapolis.

In the early 2000s, Sawczyn was living in Maine, and got worried when he heard that the state was planning to provide a laptop computer to every middle school student. What would it mean for blind kids if the state chose a computer that didn’t have a screen reader – software that speaks what’s on the screen, out loud? And there was reason to worry, because Apple was a leading bidder for the contract, and there was no screen reading software available on Apple’s Mac computers, then. If blind students couldn’t use state-supplied computers in school, Sawczyn feared they would end up facing the kind of classroom isolation he remembers – probably using different devices than their peers, maybe even learning less.

Sawczyn spoke out, even testifying before legislators about the risks of adopting technology blind kids couldn’t use. He says Apple reached out to him to let him know that something new was coming.

“They introduced me to a thing they were developing which became VoiceOver… But they allowed me to start playing with it, and other folks as well,” he says.

Apple won the Maine school contract, and VoiceOver-equipped laptops eventually ended up in classrooms. And Sawczyn got himself a Mac – becoming a fan of the device, and joining a small community of Mac-using blind people who kept in touch online, and even recorded podcasts.

Apple’s approach to accessibility was unique – starting in 2005, the VoiceOver screen reader was included in the software that came with every new Mac. If you were blind and wanted to use Windows, you needed to buy screen reader software.

JAWS was, and is, the leading screen reader software for Windows, and it could cost $1,000 or more.

So while Windows still dominated among blind computer users, as it did in the wider world, the Mac, which had been completely inaccessible a few years before, had a toehold in the blindness community. And that was something of an accident. James Dempsey, who was a software developer at Apple for 15 years, spent part of his career working on software to make the Mac accessible. He says Apple had hoped someone else would make a Mac screen reader.

“They did that thinking a third party would write the screen reader for Mac OS 10, and then when really nobody picked up that mantle to write the screen reader as a third party, Apple stepped in and developed VoiceOver,” Dempsey says.

Mosen says the company Apple hoped would create a Mac screen reader was Freedom Scientific, which made the leading Windows screen reader software, JAWS. Perceiving that JAWS for Mac wouldn’t have much of a market, Freedom Scientific decided against making a Mac screen reader.

But that left Apple with a problem. If the company wanted to sell Macs to schools and colleges, it needed a screen reader.

“Apple didn’t develop VoiceOver for Mac out of the goodness of their hearts. They developed VoiceOver for Mac because if they didn’t they were going to be in serious trouble with their key market, which was education,” Mosen says.

Computers were critical in schools, but in 2007, mobile devices were not. I don’t know whether Apple planned all along for the iPhone to have a screen reader – some day – but my educated guess is that they felt they could afford to wait, even if doing so would be frustrating and hurtful to those who had so recently embraced the Mac and its built-in screen reader.

No iPhone Screen Reader

When the iPhone made its debut with no screen reader or other features for blind users, Steve Sawczyn felt a lot of what he had felt before the Mac got a screen reader.

“I was sad because I felt, ‘Oh here’s another time we’re going to be left out. Eventually, someone’s going to come along and make a special blindness-specific iDevice. It’ll be three versions old. It’ll cost four times as much, and we’ll just keep buying it, cuz it’s the only option that we have,’” Sawczyn says.

Josh de Lioncourt, who lost his sight at age six, had used an Apple II computer as a kid, and was among those who tried a Mac when VoiceOver came along. He more than tried. He became a fan! The iPhone launch left him disappointed.

“I thought that sounds like the most incredible thing I’ve heard of in my life, and I’m sad that I’ll never get to use it,” de Lioncourt says.

And on an episode of the Maccast podcast, Shane Jackson reminded listeners that aesthetics matter, even if you can’t see.

“It was heartbreaking to me. I actually held someone’s iPhone in my hand, and it was the most beautiful, sexy, sleek device… it was just flat. I mean I don’t know how else to describe it as a blind person. It was just flat! And then this beautiful backing and curve. Oh man, if it could just say one thing to me. But it couldn’t,” Jackson said.

I felt that way, too. I had been writing about Apple products as a journalist for years. I’m not a screen reader user, but I do have low vision, which means I increase the size of text, and change the way my screen background looks to make it easier for me to see. But when I witnessed Steve Jobs unveiling the iPhone – from the audience that day – I wondered whether I would be able to use one at all. And when I borrowed a friend’s phone that first week after they went on sale, it was confirmed – the screen was too small, the background too bright, and the text too tiny. For the first time in 20 years, Apple had built a product I couldn’t use. I’m fairly sure I cried about that.

It’s not as if there were no phone options in 2007. You could get phones with a physical keyboard from Samsung, Nokia or Blackberry. And a lot of people still had flip phones, myself included. A blind person could buy a simple phone and stick to a few of its features – the ones that didn’t depend on seeing a menu. Basically, you could make and receive phone calls, and maybe text if your number keypad skills were good. Or, if you wanted more, you could buy a Windows Mobile phone. To that, you needed to add screen reader software from one of two companies.

Cara Quinn used a screen reader made by Code Factory on her Windows Mobile phone.

“It was Mobile Speak… which was fantastic for what it did. But you had to pay more than the price of the phone itself just to get that phone to talk,” Quinn says.

The cost of screen reader software, and a phone to run it on, often approached the $500 price tag of the first iPhone. So skipping the Apple Store line wasn’t necessarily a cost savings.

And screen reader phones, unlike the iPhone with its built-in GPS receiver, couldn’t help you get around as a blind person. For that, you could buy a specialized GPS device. It was heavy and not the least bit cool-looking.

“The original Trekker was this thing – but you had to wear it – it was like a school science project thing,” Sawczyn says. “It was this strap thing that you wore around your neck, and the front of it had a cradle thing that would hold the little Palm Pilot. And then there was the GPS thing on the back of the neck, and cables ran through. But you had to suit up!”

It was better than nothing, for many, but it was also a reminder that living with a disability often means having fewer choices than others have, or doing things in ways that cost extra money, time and back strain, too.

It would be wrong to say that the blind community – even the tech-savviest among them – were clamoring as one for an accessible iPhone in 2007. Some people didn’t believe the Mac’s screen reader was as good as familiar Windows options. And didn’t the lack of accessibility on the iPhone prove that Apple wasn’t interested in people with disabilities?

“You could reach the people that made these screen readers,” Mosen says. “I was on sort of a first name basis with the developers of both Talks and MobileSpeak, and I could write to them and let them know about a bug, or a feature that I thought was lacking, and quite often it would quickly get implemented. So that was a good thing.”

Mosen says the small size of the market for blindness tech meant that the people who made the software were more responsive than he thought a big company like Apple would be.

de Lioncourt got into his share of online arguments with people who didn’t like Apple or its screen reader.

“A lot of these people had spent small fortunes on screen reading software for Windows, and they didn’t like the idea of ‘oh maybe I didn’t have to do that. I can just buy this computer that just talks right out of the box.’ And kind of wanted to justify their purchase,” de Lioncourt says.

In 2008, people who were hoping for more accessibility from Apple had a few reasons to be optimistic. But there was still a lot to be done.

In the spring, the iTunes app on the Mac finally became accessible to blind users – meaning that the VoiceOver screen reader could now speak the names of songs and menus. Wait. Wasn’t the Mac already accessible? Well, even though VoiceOver could navigate the computer and many apps, some older ones, like iTunes, were not – even with VoiceOver. Now, Apple finally fixed that. And the iPod nano, a tiny music player with a scroll wheel, not a touch screen, got its own, much simplified version of VoiceOver.

“When you synced your iPod from your computer, it made up audio files of all the things that it would ever need to say,” says Darcy Burnard.

Burnard had been an early Mac adopter, struggling with iTunes and teaching himself to make the most of VoiceOver. He says that even the iPod’s addition of something like a screen reader gave him hope there would someday be an accessible iPhone. As it happened, Apple launched the iPhone App Store that year too – a move that would have important consequences when and if the iPhone ever got a screen reader.

Something Big Is Coming

Like baby animals and flowering plants, Apple rumors come to life in the spring. In that season of 2009, guesses and some actual leaks about what Apple would announce at its annual developers conference in June began to appear online.

“There was a little bit of a leak. Somebody had said they had seen reference made to VoiceOver in the code,” says Quinn.

She had become a Mac user. And she knew that the name of the Mac screen reader popping up in code for the iPhone’s software meant something big could be in the works.

Mindful of how long it had taken for iTunes to become accessible, de Lioncourt tried to keep things in perspective.

“We were hopeful, but not particularly optimistic that we would have access to the whole thing. But maybe this would be the beginning, right. We’d get some access to what the iPhone had to offer,” says de Lioncourt.

And then came June 8, 2009.

First, I need to tell you a bit about how Apple announces things, and how people tend to respond to those announcements. You may have heard about Apple keynotes. The company’s CEO – Steve Jobs, when the iPhone was launched; Tim Cook, now – leads a live event that includes highly-produced videos, bragging about the company’s sales, and lots of product announcements. In 2009, Steve Jobs was on medical leave, and so it was Senior Marketing VP Phil Schiller who opened the show. We’ll hear from him shortly.

These days, Apple keynotes are live-streamed as they happen, so even if you’re not there, you can watch from your phone or computer. In 2009, there was no official stream, though journalists in the audience broadcast video from their laptops, offering the stream online.

And that’s how the Apple enthusiast community was arrayed on June 8, 2009 – huddled over their own computers, hoping the unofficial stream of the developers’ conference keynote would keep working.

Several people I talked to about this day told me there had been more inklings that something interesting for accessibility was coming to the iPhone. Besides the code leak in the spring, a few people got cryptic emails, days before the keynote, suggesting they might want to find a way to tune in.

The event was long! There were new Mac laptops, new software for the Macs and a new phone called the iPhone 3GS. And at about four minutes before the two-hour mark, in the midst of a long list of new apps to be included on the iPhone 3GS, Phil Schiller switched slides, revealing an iPhone settings screen. The slide remained visible for the next 36 seconds, as he spoke.

“We also care a great deal about accessibility, helping more and more people be able to use this great technology. And some great new accessibility settings in the iPhone 3GS. You go to the Accessibility settings area, and you find there’s VoiceOver. So if somebody needs to hear the, what they’re touching with their finger for an email or web page, it’ll read it to them. If you want to zoom into the display, to have larger icons, you can do that. If you want to invert the colors, if that helps your sight, that’s better. We even can pipe mono audio through both, or either sides of the headphone to help you if that helps with your hearing. So great features for accessibility.”

And it was over. No demo. Not even a pause for applause.

In the pre-official stream era of 2009, the first records of Apple keynotes were the liveblogs, usually kept by media people in the hall. At one hour, fifty-six minutes, a few liveblogs from mainstream tech sites – including Macworld and Engadget – dutifully transcribed the mention of accessibility features like VoiceOver and Mono Audio support. CNET and The Mac Observer skipped accessibility entirely, going straight from the new Compass app, at one hour fifty-five, to the Nike Plus app, at one hour fifty-seven.

But among the people for whom VoiceOver meant the difference between being able to use the phone, and not being able to use it at all, reactions were a little different.

“VoiceOver is on the iPhone. They did it. They did it. They did it.”

“Here in one, one day, in one fell swoop, they’ve changed everything.”

“Like on Twitter I said ‘I have no words, and it’s it’s it’s just huge, like you know, not only is it a completely accessible phone but it’s just a completely new platform. I mean people were saying it was never gonna happen and it did. People were saying yesterday it was never gonna happen.”

Cara Quinn, Josh de Lioncourt, Holly Anderson and Darcy Burnard recorded a podcast on the day accessibility came to the iPhone. They reacted just minutes after the keynote wrapped up.

Quinn remembers that feelings that day ran deep.

“Shock and awe. Amazement. I was very emotional. That whole day was spent for me being very emotional. It makes me emotional just thinking about it now,” she says.

On Twitter, too, people were excitedly debating the meaning of Schiller’s somewhat awkward 36-second feature recitation. Would VoiceOver do more than let you make a phone call? How about reading email or text messages? Or taking pictures? Or using apps not made by Apple? Would there be games maybe? How would zoom and invert colors work?

“VoiceOver being a feature on the 3GS was almost glossed over,” de Lioncourt says. “They didn’t go into any great detail about it. And I remember that same day, they had this huge update on the accessibility section of Apple dot com that basically explained how VoiceOver was going to work on the phone. So we didn’t have to wait long.”

de Lioncourt was refreshing Apple’s site, watching as updates rolled in. It quickly became clear to him that accessibility meant that the phone’s home screen, basic features and – crucially – apps from Apple and others would work with VoiceOver.

I recently had the chance to ask what that day was like inside Apple.

“A lot of excitement. I think, you know, in doing something that was so new, wanting to make sure people understood what the goal was, and that they would embrace it,” says Sarah Herrlinger. She’s director of global accessibility policy and initiatives for the company. She has been at Apple for 13 years. She says positive responses to the iPhone accessibility announcement came quickly.

“We really had wanted to do something extraordinary and we were super thankful that the community really did embrace it,” Herrlinger says.

But there was one more hurdle for those eager to get their hands on an iPhone – it was a long 11 days between the announcement and June 19, 2009, the first day you could buy an accessible iPhone in the United States.

“I spent those two weeks figuring out, ‘How can I do this? How can I make it work? Do I have enough money?’” de Lioncourt says.

People scrambled to put together $500, broke cell phone contracts and, in some cases, switched to AT&T, the only U.S. carrier where you could use the iPhone. At least the waiting gave people time to read up on this new VoiceOver thing.

Quinn burned off her nervous energy by starting an email discussion list for blind iPhone users called v-iphone. It’s still active today.

“Literally, I didn’t even wait to get the phone. I was like, Josh, I want the premier iPhone list,” Quinn says.

When Everything Changed

And then it was Friday the 19th, and people had their phones, just in time for weekend marathons of trying gestures, attempting to type on a piece of glass, and learning how apps worked, all without benefit of vision. Steve Sawczyn hadn’t even planned to get one at first. But curiosity got the better of him.

“I went to the AT&T store and I bought myself an iPhone, and I was so mesmerized,” Sawczyn says. “I was able to do this at the same time as other people were buying their phones. I didn’t have to wait for a new version of software to come out, or an update to be made, or someone sighted to help me. I could just go to the AT&T Store, buy my device, go home, plug it in, and with iTunes, I could start up VoiceOver and the thing just worked great.”

Ah, iTunes, which you needed on your computer in order to set up your iPhone, and which had conveniently become accessible the previous year. The new iPhone owners learned both from the user manual Apple provided online, and from each other. Many I talked to say it wasn’t hard to learn to use taps and flicks and multi-finger gestures. But it wasn’t like anything they’d ever done before.

“It was such a different world,” de Lioncourt says. “We were learning not just a new screen reader, but a device with a type of UI we had never experienced before. So it was trying to get your head around how do touch screens work at all, and how does the screen reader work on top of that?”

de Lioncourt, who has developed iPhone apps, uses terms like UI. It stands for User Interface – the look and, in his case, the sound and feel of using the phone.

Once people learned the basics of making calls, texting and writing emails, they started downloading apps – all kinds of apps. Instapaper, ooTunes, Pocket Yoga, Twitterrific and Purr were just a few that new iPhone users told me they got right away. Purr, in case you’re wondering, did just that when you tapped – or petted – the screen.

Like software on the Mac computer, some apps were accessible, some were not. If an app didn’t work with VoiceOver, you’d hear something like this: [VoiceOver audio clip]

If you bought an app and it turned out not to work with VoiceOver, you were out the money you’d paid. As the Internet has done since the beginning of online time, communities formed to praise good apps and offer warnings about bad ones.

Holly Anderson couldn’t get AT&T cell service at her home in rural Tennessee, so she found another way to be involved in the community of VoiceOver users.

“I was like, ‘I can’t have a phone, so maybe I’ll just start making a list of apps that are accessible.’ Cuz I was thinking, maybe there would be a few here and there. It quickly got way out of hand,” Anderson says.

When the iPod Touch got VoiceOver later that year, Anderson got one.

People with the means to buy an iPhone remember 2009 as an exciting, transformative time – but you wouldn’t have known that if you read most of the mainstream press. Accessibility was rarely mentioned, even in the tech press. New York Times columnist David Pogue, writing on his blog, spared a few words for VoiceOver. And a few tech podcasts, including Mac OS Ken, Maccast, and Macworld’s Chris Breen, who would go on to cover accessibility periodically in the magazine, got plugged in, inviting blind or low vision iPhone owners to explain why this was such a big deal.

In the larger blindness community, there was skepticism – some of which seemed very healthy, given Apple’s accessibility missteps of the past, and the company’s frequent unwillingness to let people in on its plans.

“The fact that VoiceOver was just mentioned in that fleeting way gave me cause for concern,” Jonathan Mosen says. “And one of the things I was really worried about at the time was, is this 3GS version of VoiceOver just going to sit there and kind of languish for years and years and years to come, so they can go to purchasers and government entities and say ‘yeah, we’ve got a screen reader. We introduced it in 2009.’”

Earning Trust

For Apple skeptics like Mosen, and those who simply weren’t interested enough to try VoiceOver, the proof that the company was serious about accessibility came in 2010, when the next phone, and an updated version of its software, iOS 4, were released. It included the ability to connect a Bluetooth keyboard or Braille device to the phone, and a more efficient method of typing onscreen with VoiceOver.

“When they did that, what that said to me was, this kind of feature set must clearly be because they’re listening to what users said they wanted, because these are just really obvious things,” Mosen says. “And at that point, I did think, all right, my skepticism was perhaps not warranted. This is evolving nicely. And it was at that point, when iOS 4 came out, that I made an iPhone my primary device.”

2010 was also the year the iPad made its debut, and Apple began selling electronic books.

“The big thing on the iPad was when iBooks happened. That opened up a whole level of possibilities as far as – these are books we can get. They’re not specifically accessible,” Burnard says.

Because the iBooks app was accessible, a VoiceOver user, or a teacher working with blind students, could get any book Apple offered, and hear it read aloud on the iPad.

By then, the mainstream tech gadget that people with disabilities had once been unable to use at all, was entering the disability mainstream. And people’s expectations increased as the device became more central to their lives and work.

In fact, people were making apps that were specifically designed for VoiceOver users and people with low vision. They ranged from navigation apps that gave detailed walking directions and points of interest, to tools that used the camera to identify objects, speak their colors, and scan text.

“You know, when you can bring up an app, and you can show it something in print, and almost immediately, it will tell you what it is, or tell you the lights are on, or the lights are off. And this is all coming from a device you hold in your hand, that ten years ago was not possible,” Quinn says.

Quinn wrote a GPS app to help her navigate during desert hikes. When a company offered to purchase her app, she began a new career.

“The accessibility of the iPhone changed my life, because now I’m working as a professional software developer,” Quinn says.

As more people adopted the iPhone, expecting to be able to use any app they could download, the problem of inaccessible apps grew more noticeable. The availability of the iBooks app from Apple, for example, shined a bright light on Amazon Kindle’s reading app, which didn’t work with VoiceOver until 2013.

What did it take to make an app VoiceOver friendly? Marco Arment created Instapaper, and currently develops the Overcast podcast player.

“You know, I was working on Instapaper, and I got a report from somebody once that said ‘hey, if you use that under VoiceOver, there are these four buttons that aren’t labeled,’ or something like that. And that’s when I started realizing, ‘oh, it’s this whole different type of using the app that I haven’t considered,’” Arment says.

Instapaper, an app you use to save articles from the web for later reading, was a natural fit for VoiceOver users. de Lioncourt says it’s the first non-Apple app he installed on his new iPhone. It was mine, too.

What Arment had discovered was that, without his doing anything, the app mostly worked with VoiceOver – but that he needed to do a little work to ensure there were no blind spots.
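In practice, that kind of fix is often just a line or two of code. Here’s a minimal sketch in Swift – the view controller, button and label text are hypothetical examples, not Instapaper’s actual code – of how a developer gives VoiceOver something to say for an icon-only button:

    import UIKit

    class ArticleViewController: UIViewController {

        // An icon-only button: obvious visually, but nearly silent to
        // VoiceOver unless the developer supplies a label.
        let archiveButton = UIButton(type: .system)

        override func viewDidLoad() {
            super.viewDidLoad()

            archiveButton.setImage(UIImage(named: "archive-icon"), for: .normal)

            // The name VoiceOver speaks when the user touches the button...
            archiveButton.accessibilityLabel = "Archive article"
            // ...and an optional hint describing what activating it does.
            archiveButton.accessibilityHint = "Moves the article to your archive"

            view.addSubview(archiveButton)
        }
    }

With no label set, VoiceOver has little more than “button” to announce – exactly the kind of blind spot the report Arment received was describing.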

“You could get about sort of 80-85 percent of the way with very little effort. The important thing was realizing that accessibility is important, and getting into the mindset of applying it, rather than it being any kind of technical challenge,” says Matt Gemmell, another software developer who took an accidental interest in VoiceOver.

“I wasn’t particularly conscious of accessibility at all,” Gemmell says. “And then when I was, I think 30 or 31 years old, I had a, a vision-related scare.”

Gemmell’s doctors thought he had the type of macular degeneration that usually impacts people as they age. Afraid of what vision loss could mean for his life and work, Gemmell tried using an iPhone without looking at it – with only VoiceOver to guide him. Learning that he could still navigate the phone helped him process what he thought was going to happen to him. So he started writing accessible apps, and creating components he shared freely with other developers.

“I can’t name apps offhand, but for a number of years, there, whenever you opened the About box or pane in an app, you know, more likely than not, my name was in there someplace,” Gemmell says.

Gemmell no longer makes software – he’s now writing novels, pursuing a lifelong dream. But he pops up online, still interested in VoiceOver, and accessibility generally. And it turned out that he didn’t lose his vision.

Not Just VoiceOver

As time passed, the iPhone, iPad and their software grew and changed. New accessibility features, like Guided Access, which many teachers use to facilitate learning for kids on the autism spectrum, and support for connecting hearing aids to a phone, were added. And in 2013, when iOS 7 was released, features for people with physical disabilities joined the ranks. iOS Accessibility was no longer just about blindness or low vision.

“Switch Control is a feature that allows individuals with very extreme physical motor limitations to be able to use our technology,” says Herrlinger. “So in the same way that VoiceOver became a way that you could use a touch screen without having to see the screen, Switch Control became a way to be able to use the touch screen if you were never actually going to touch the touch screen.”

Herrlinger calls out Switch Control, which allows you to control an iPhone or iPad with multiple external buttons – or switches – as one of the platform’s most important accessibility innovations, along with Made for iPhone hearing aids, and VoiceOver.

There have been stumbles.

iOS 7, which included a major change to the way the iPhone screens looked, was initially a setback for some people with low vision. The highly-stylized, transparent screens, the thin fonts, and animated app icons sent some accessibility users reeling, and irritated a lot of people with standard vision, too. Apple had added new accessibility options in iOS 7, probably expecting that they would represent a step forward. But the company had to beef up those features to address the problems low vision users were having.

“And it wasn’t until iOS 7 that added a lot of these visual preferences that a lot of people actually used out of preference…It started getting on developers’ radar. Oh, there’s this whole section of settings over here, called Accessibility, that change the way my app looks or works, and I need to make sure it doesn’t break under those settings,” Arment says.

The number of inaccessible, or partially-accessible apps that remained available in the App Store suggests that Arment is more conscientious than some software makers when it comes to support for VoiceOver and dynamic type – a feature that lets the user adjust text size onscreen. But he is far from the only one who has worked hard to make accessible software.
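Supporting dynamic type is a similar opt-in. Here is a minimal Swift sketch – again with hypothetical names, not any shipping app’s code – of a label whose text tracks the reader’s preferred size:

    import UIKit

    class ReaderViewController: UIViewController {

        let titleLabel = UILabel()

        override func viewDidLoad() {
            super.viewDidLoad()

            // Ask for a text style rather than a fixed point size, so the
            // font reflects the reader's Dynamic Type setting...
            titleLabel.font = UIFont.preferredFont(forTextStyle: .headline)
            // ...and have the label update live if that setting changes.
            titleLabel.adjustsFontForContentSizeCategory = true
            // Let long titles wrap at large sizes instead of truncating.
            titleLabel.numberOfLines = 0

            view.addSubview(titleLabel)
        }
    }

Arment thinks Apple could help.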

“I think testing for VoiceOver, dynamic type, and the other basic accessibility features, should be part of App Review,” Arment says.

When a developer submits software to be added to the Apple App Store, it must pass App Review before it goes on sale. Arment is not alone in advocating a requirement, or at least a disclosure, about whether an app is VoiceOver friendly. Blindness organizations have advocated for such measures, and even considered suing Apple to force third-party app accessibility.

Accessibility Leadership

By 2016, Apple was secure enough in its place as a leader in accessibility that it led off an important product launch event with a video illustrating the ways people with all kinds of disabilities were using its products. Like much of the theater Apple uses to promote itself, the high profile for accessibility then, and at subsequent events, was designed to instill warm feelings about the company among the largely nondisabled audience. But for people who use the accessibility features of their iPhones and iPads every day, the hype was usually backed up by substance.

“I think that trust and confidence should be earned,” Mosen says. “And over the years, while there have been bugs and glitches where VoiceOver is concerned, they have certainly earned my trust and my confidence. I think they undisputedly provide the best mobile accessibility experience by a long way. Android’s not even close.”

It’s also possible that Apple’s push to build accessibility into its industry-leading tech gadgets had an effect on other companies building devices for a mainstream audience.

“Now I have a talking DirecTV DVR, and I think if it wasn’t for Apple showing people that it was possible for mainstream products to promote accessibility, I don’t know if it would be as prominent as it is,” Holly Anderson says.

It would be difficult to prove that there’s a direct line between the first iPhone that could speak in 2009, and a DVR that talks today. But Apple has clearly had an impact.

This year, Microsoft, Google and Apple have each held conferences for people who develop software for their platforms. During splashy keynote events, each company’s leaders talked about accessibility, both in terms of new features for people who need it, and as examples of how advanced technology is being applied in their products. And though there’s a long way to go before accessibility works for every user at all times on all devices, the topic is no longer relegated to the tail end of a long event. And the people who speak about lowering barriers to all users are a lot less tentative, or imprecise, than Phil Schiller was in 2009.

36 Seconds That Changed Everything was written and produced by Shelly Brisbin, with audio production and mixing by Patrick Perdue, and music by Andre Louis.

Special thanks to: Holly Anderson, Marco Arment, Darcy Burnard, Adam Christianson, Josh de Lioncourt, James Dempsey, Joy Diaz, Anna Dresner, Matt Gemmell, Sarah Herrlinger, Lori Lodes, Jonathan Mosen, David Pogue, Cara Quinn and Steve Sawczyn.

This program is copyright 2019 by Shelly Brisbin. All rights reserved.