Former Apple product engineer on building the iPhone experience
Ken Kocienda was part of the team that patented the iPhone
Many of the smartphone features we now take for granted – touch screens, apps and autocorrect – started with the iPhone, when Steve Jobs asked Apple’s product engineers to investigate the feasibility of developing touch screen-based phone and tablet devices with a proprietary operating system designed for a whole new user experience.
To prototype an unprecedented product from scratch, empathy for the prospective user is paramount.
“When we were designing the iPhone we wanted it to be simple, much simpler than your typical TV remote,” Ken Kocienda, a former product engineer at Apple, said at the Qualtrics X4 Summit in Salt Lake City.
Part of the software team that built the original iPhone, Kocienda was the genius who invented autocorrect, “so you can thank me for all your ducking autocorrections,” he joked, referring to the keyboard’s stubborn censorship feature.
The first iPhone debuts at Macworld 2007 (Photo credit: Wikimedia Commons)
Jobs had a firm vision for the iPhone’s hardware – a pocket-sized, multi-touch computer – but the engineering team needed to design software to support it. They called on the human interface team to help – a group of artists and designers who mostly used Photoshop and Illustrator to make animations and draw graphics – skills which allowed them to think conceptually about products.
“This team thought about people and they came up with two concrete and specific ideas for the iPhone and touchscreen computers,” said Kocienda.
The first innovation was inertial scrolling, a feature we now consider second nature that allows users to scroll by gliding their fingers across the display. “It glides faster when you slide more quickly and at the bottom it bounces playfully to tell you that you’re at the end,” explained Kocienda, author of Creative Selection: Inside Apple’s Design Process During the Golden Age of Steve Jobs.
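The gliding-and-bouncing behavior Kocienda describes can be sketched in a few lines. The model below is purely illustrative, not Apple’s implementation: velocity decays by an assumed friction factor each frame, and overshooting either edge of the content triggers a damped, playful bounce.

```python
# Hypothetical sketch of inertial scrolling (not Apple's code).
# Constants are illustrative assumptions, not real iOS values.
FRICTION = 0.95        # fraction of velocity retained each frame
BOUNCE_DAMPING = 0.5   # how much speed survives hitting an edge

def simulate_fling(offset, velocity, content_end, frames=120):
    """Advance the scroll offset frame by frame until the fling settles."""
    for _ in range(frames):
        offset += velocity
        if offset < 0:                      # overshot the top: bounce back
            offset = -offset * BOUNCE_DAMPING
            velocity = -velocity * BOUNCE_DAMPING
        elif offset > content_end:          # overshot the bottom: bounce back
            overshoot = offset - content_end
            offset = content_end - overshoot * BOUNCE_DAMPING
            velocity = -velocity * BOUNCE_DAMPING
        velocity *= FRICTION                # the glide slows a little each frame
        if abs(velocity) < 0.1:             # close enough to call it settled
            break
    return offset
```

A quicker flick (larger initial velocity) glides farther before settling, which is exactly the “glides faster when you slide more quickly” behavior the quote describes.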
The second bright idea was to design a home screen – “taking this grid of icons, making it your homebase for using the product and that each of these icons represented an app and when you tapped on an icon it filled the screen.”
It’s amusing to hear Kocienda describe these features in such elemental terms, but at the time they were considered an unprecedented feat in software engineering, and the iPhone App Store revolutionized the way we create and distribute software. These apps are what allowed the smartphone to become a stand-in for cameras, flashlights, calculators, clocks, scanners and iPods.
The goal was to design an “intuitive” home screen – but what exactly does that mean? “I know it when I see it” is how many people define an intuitive user interface, but Kocienda said it really boils down to understanding what people already know and then designing an obvious pathway for them to act on that knowledge.
In the early 2000s before the first iPhone launched in 2007, most people understood icons from having used personal computers and early iterations of the Macintosh – they each represented an application and you clicked on them to launch a program.
“Our goal was we needed to bridge this intuitive gap; building on what people already knew and giving them this new experience using touch,” said Kocienda.
Because a home screen had never been built before, the next question was: how big should the icons be? Initially, the team toyed with extremes – starting with tiny icons that fit 15 apps on screen at once, discovering those were too small to tap reliably, then enlarging the icons until only six fit at a time, which looked clunky and rather like a child’s toy computer.
“We had an apprehension in the early stage of the project because as we were going and touching the icons on the screen your finger covered up the touch target, so that if the icon was about as big as your fingertip you’d lose track of it right in that critical moment where you’re about to touch the screen.”
The software engineer in charge of the home screen designed a game not unlike Whack-A-Mole where you tapped a square icon and then another would appear in a different part of the screen. The object of the game was to tap 20 icons in as little time as possible. Through the game, the software engineers discovered the optimal icon size.
“Since the game was fun and we were all playing it we discovered that if we could make this icon 57 pixels square, everybody could tap it comfortably no matter where it was on the screen,” Kocienda explained.
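The logic of that experiment can be approximated in code. The sketch below is hypothetical – the Gaussian tap-error model and every constant in it are assumptions, not Apple’s data – but it shows how simulating noisy taps against square targets of different sizes reveals when a target becomes comfortably tappable.

```python
# Illustrative sketch, in the spirit of the tap-the-square game.
# The error model and numbers are assumptions, not Apple's measurements.
import random

def hit_rate(icon_size_px, tap_stddev_px=12.0, trials=10_000, seed=1):
    """Fraction of simulated taps that land inside a square icon,
    assuming the user aims at its center with Gaussian fingertip error."""
    rng = random.Random(seed)
    half = icon_size_px / 2
    hits = sum(
        1
        for _ in range(trials)
        if abs(rng.gauss(0, tap_stddev_px)) <= half      # horizontal error
        and abs(rng.gauss(0, tap_stddev_px)) <= half     # vertical error
    )
    return hits / trials

for size in (29, 44, 57):
    print(size, round(hit_rate(size), 2))
```

Under these assumed numbers, the hit rate climbs steeply as the target grows toward roughly fingertip size and then flattens out – the same trade-off the derby-style game let the team feel directly.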
Before the iPhone launched at Macworld 2007 to a firestorm of media attention, Apple’s biggest competitor in the smartphone category was the BlackBerry, then the smartphone of choice for businessmen and women who needed to access their emails remotely.
However, from the very beginning, Apple decided that the iPhone would not have a physical keyboard – touchscreen keyboards represented another stretch of uncharted territory.
The BlackBerry Bold (Source: Wikipedia)
“The keyboard was going to need to be in software so that it could get out of the way to create a bigger canvas for apps and experiences other than typing with a more flexible user interface,” said Kocienda.
This feature of the iPhone turned out to be its biggest sticking point in the design process. Weeks passed and the engineers rolled out new iterations for the leadership team to trial, but they made little headway.
“All of the engineers, about 20 of us, were called out into the hallway and we were told to stop what we were doing,” Kocienda recalled. “Stop work on the homescreen, inertial scrolling, the browser, the phone app and every feature of the system...Everybody now is going to be a keyboard engineer because we need to crack this problem or we might not have a product.”
Leadership suggested they reconvene in a few weeks and host a keyboard derby, “sort of like a bake-off,” where Apple exec Scott Forstall would try all the keyboards.
Kocienda went through numerous iterations, including one he called the “snowman keyboard,” where each key was shaped like a snowman, giving it a larger surface area to absorb errant keystrokes. His next version squeezed three letters into each key.
“It was a gesture keyboard with three-way gestures for each of those keys,” he explained. You would tap for certain letters, swipe for others, but it was too hard for users to remember which letters to tap and which to swipe.
Kocienda scrapped all of his keyboards and started afresh.
“If you do get stuck in the midst of a creative or technical project, sometimes you just need to throw away your work,” he advised. “Don’t get caught up in the sunk cost fallacy of, I’ve already spent so much time on this, I don’t want to throw it away.”
He went back to his initial QWERTY prototypes, which replicated the familiar layout of desktop and laptop keyboards – building on what the user already knows. Next, Kocienda installed a dictionary into the software that would run through combinations of possible words as the user tapped the keys. At the time, however, the software was programmed to suggest only words with the same number of letters as had already been typed.
“If you get five letters into typing ‘aluminum’ what the software would suggest to you is ‘slimy’ because that’s the most sensible word for those first five keys,” Kocienda recalled with a laugh.
He iterated on the software to recommend longer words so that it better resembled the keyboards we’re familiar with today. When the user taps a key, the individual letter pops up to show it was selected, but the algorithm also considers the neighboring keys and tailors autocorrect suggestions accordingly, in case the user meant to tap one of the surrounding letters instead.
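Both behaviors – the early same-length-only matching that produced “slimy,” and the later fix that considers neighboring keys and allows longer words – can be sketched with a toy model. Everything below is a hypothetical reconstruction, not Apple’s algorithm: each tap is scored by how far the tapped key sits from the key a candidate word would expect, on a simple QWERTY grid.

```python
# Toy reconstruction of neighbor-aware word matching (not Apple's algorithm).
QWERTY_ROWS = ["qwertyuiop", "asdfghjkl", "zxcvbnm"]
ROW_OFFSET = [0.0, 0.25, 0.75]   # assumed horizontal stagger of each key row

def key_center(ch):
    """Approximate (row, column) position of a letter key on the grid."""
    for row, letters in enumerate(QWERTY_ROWS):
        col = letters.find(ch)
        if col != -1:
            return (float(row), col + ROW_OFFSET[row])
    raise ValueError(f"not a letter key: {ch!r}")

def key_distance(a, b):
    (r1, c1), (r2, c2) = key_center(a), key_center(b)
    return ((r1 - r2) ** 2 + (c1 - c2) ** 2) ** 0.5

def score(typed, word):
    # Sum of key distances between the taps and the word's prefix;
    # 0 means the taps spell the word exactly so far.
    return sum(key_distance(t, w) for t, w in zip(typed, word))

def suggest(typed, dictionary, same_length_only=False):
    if same_length_only:   # the early behavior Kocienda laughed about
        pool = [w for w in dictionary if len(w) == len(typed)]
    else:                  # the fix: longer words can match on their prefix
        pool = [w for w in dictionary if len(w) >= len(typed)]
    return min(pool, key=lambda w: score(typed, w))

words = ["slimy", "aluminum", "album", "almond"]
print(suggest("alumi", words, same_length_only=True))  # prints slimy
print(suggest("alumi", words))                         # prints aluminum
```

With the same-length restriction, “slimy” beats “album” because each of its letters sits on or next to a tapped key; once longer words are allowed to match on their prefix, “aluminum” scores a perfect zero and wins.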
“At Apple we knew what it took to create great work and we focused in on every little detail. Every pixel had to be right, no detail was too small.”