How Smartwatches Are Pushing Science to Make it Easier to Read Screens at a Glance

By Kelsey Campbell-Dollaghan on at

How many times a day would you say you check your smartphone? Be honest. For the average person, it’s 150 times a day. And most of those interactions happen in less than a second.

Today, as the first reviews of the Apple Watch rolled in, we got a look at how Apple plans to banish your phone to your pocket and communicate instead through micro-bursts of text. To succeed, it needs to do this with perfect clarity, brevity, and incredible speed. The success or failure of a whole new generation of devices depends on how easy it is to read at a glance.

Fortunately, a small, vocal contingent of scientists are already studying how design affects legibility; their research is shaping the future of interface design, and building the tech to make glances as useful as possible.

The Type Doctors

Dr. Nadine Chahine is a type designer at the foundry Monotype who focuses on the science of legibility. Dr. Bryan Reimer is a scientist at MIT’s AgeLab who researches distracted driving and the impact of in-car interfaces on drivers.

Together, they’re writing the book on how our eyes read when we’re distracted by the world around us. “There literally is the need to develop a new textbook here,” Reimer told me, after he and Chahine gave a talk in March entitled At a Glance: How Does Type Impact Your Daily Life?. “Companies have to come together and support science-infused design.”

The Split-Second Science of How Your Brain Reads a Screen

Chahine and Reimer say that “glances are the new currency”. They’re talking about the constant, nagging micro-distractions that are the key ingredient in the way we interact with technology these days. Even though the smartphone has been around for a decade, there’s been little research into how these micro-interactions occur, and how design can improve them. “We’re not willing to devote more than a fraction of a second to information anymore,” says Reimer. “So how do we redefine information around what we can capture in a glance or two?”

That’s exactly what they’re trying to test, using technology like eye-trackers and simulators that can recreate the distraction most users feel while reading a message on their phone or watch.

Distraction, Quantified

How do you study how people read when they’re distracted? Well, you distract them and watch what happens.

In a study published in the journal Ergonomics last year, Chahine and Reimer put subjects in a realistic driving simulator with a GPS-style screen on the dashboard. While keeping the car steady, drivers had to select a particular address on the screen. Using an eye-tracker, the researchers could measure how long it took the driver to select the correct address while driving.



But they weren’t just studying how quickly drivers could read. They were studying whether the design of that text matters, too. They displayed the addresses in two different typefaces. One was called Eurostile. It’s a typeface you’d probably think looks vaguely digital, with letters that fit into square shapes. The other was Frutiger, a rounder typeface with more open letters; it was designed in the 1970s to make signage at Charles de Gaulle airport easier to read.


You’d probably expect the “digital-looking” typeface to be easier to read on a screen. But they found something surprising: for male subjects, it took 10.6 per cent less time to “glance” at the Frutiger text than the Eurostile, which meant nearly a half-second less distraction from the road. Oddly enough, the typeface difference didn’t seem to matter as much for women, though Frutiger did help when they were faced with driving in dark conditions.


Beyond showing just how much typeface affects how quickly we read while distracted, the study also proved how little we know about the science of typography. The difference between men and women was totally unexpected: no one had ever thought to look for something like a gender difference.

Maybe in the future, your screen’s UI could make micro-adjustments not just for darkness or distance from the screen, but also for things like physiology.
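To make that idea concrete, here is a purely illustrative sketch of what such micro-adjustment logic might look like. The thresholds, typeface choice, and function names are all invented for illustration; real values would have to come from legibility testing like Chahine and Reimer’s.

```python
def choose_display(ambient_lux, viewer_distance_cm):
    """Pick a typeface and point size for a glanceable screen.

    Hypothetical heuristic: the numbers below are made up for
    illustration, not taken from the research described above.
    """
    typeface = "Frutiger"  # the humanist face that tested faster in the study
    size_pt = 14
    if ambient_lux < 50:          # dark conditions: bump the size
        size_pt += 2
    if viewer_distance_cm > 60:   # screen is farther away: larger text
        size_pt += 2
    return typeface, size_pt

# A wrist raised in a dark room, held far from the eyes:
print(choose_display(ambient_lux=10, viewer_distance_cm=70))
```

The point isn’t the specific numbers; it’s that a device with an ambient-light sensor already has the inputs it would need to tune typography on the fly.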

Tinier Screens, More Problems

Apple reportedly developed its Watch because we all look at our phones way too often. It says it will fix this by delivering only the important info through a series of what Apple calls “Glances,” the exact word that Chahine and Reimer have been using to describe the way users interact with screens for two years. “It’s really exciting that our research push is being validated by what is actually going on in the market right now,” says Chahine.

But devices with smaller, less obtrusive screens are a deceptively simple solution to a problem that’s more complicated. Even if it’s on your wrist, we’re still talking about reading words on a screen. A tiny screen. For milliseconds. While you’re walking. Or biking. Or talking. In a way, it’s even more fraught than looking at your phone.


Smartwatches and heads-up displays are totally new territory when it comes to design; one where a device has a millisecond to communicate a vital couple of words or numbers — or fail completely. “What I think many of the designers may not fully appreciate is that if they don’t get through in that glance or two, the volume, velocity, and viscosity of that information is gone,” says Reimer. “The door opens once on this information. If I don’t get it right then and there, it’s over. How do you convert that glance into two or three?”

Even the language that designers use is vital. Chahine explains that research shows the more letters in a word, the slower the reader understands it. If a word appears frequently, the reader will get it faster. The basic semantics of Apple Watch’s “glances,” or any other smartwatch’s notifications, will have a major impact on whether its design meshes with your life the way it claims it will.
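Those two factors Chahine names, word length and word frequency, could even be scored automatically. Here is a toy heuristic (not the researchers’ method) that ranks candidate notification texts: shorter, more familiar words score better. The word list is a stand-in for a real frequency corpus.

```python
# Stand-in for a real word-frequency list; invented for illustration.
COMMON_WORDS = {"meeting", "now", "in", "min", "call", "from", "new"}

def glance_score(text):
    """Lower is better: short, familiar words read faster at a glance."""
    words = text.lower().split()
    avg_len = sum(len(w) for w in words) / len(words)
    familiar = sum(w in COMMON_WORDS for w in words) / len(words)
    return avg_len - 2 * familiar  # penalise long words, reward familiar ones

candidates = ["Meeting in 5 min",
              "Your scheduled appointment commences shortly"]
print(min(candidates, key=glance_score))  # → Meeting in 5 min
```

A real glanceable interface would need far more than this, but the sketch shows how the semantics of a notification could be tested rather than guessed at.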


On the typography side of things, Chahine explains, the spacing between letters becomes incredibly important, since our brains are just as sensitive to the “ghost letters” created by negative space as they are to the actual letters. Reimer adds that even a millimetre’s difference in font size can have a huge impact on legibility, a detail that wouldn’t be obvious to a designer but that scientific study reveals to be important.

Reimer and Chahine are making all their work available online and they welcome questions and requests from companies that want to apply their discoveries to products. After publishing their initial study last year, they followed up with a second paper, Revealing Differences in Legibility Between Typefaces Using Psychophysical Techniques: Implications for Glance Time and Cognitive Processing. It shows how the same kind of test could be used to test how the typography and graphic design of an interface affect the time it takes a user to read it.

The idea? To allow design studios and companies working on displays to use the same methodology that scientists use to assess how design affects how our brains process information. “It’s a call to action that we need to balance design and science,” Reimer says. “The designers do a wonderful job of presenting information. But can we inform, with a little science, the decisions that they make along the way?”

The Apple Watch and its peers are the very first of what will likely become a new generation of devices that claim to deliver information in split-second glances. It’s a serious design challenge — and one that requires science to be successful.