Touch technology started with the humble button, and now there’s a revolution in the world of touch input every few years.
Take the Samsung GALAXY Note II, for example; it manages to couple its S Pen with 1024 levels of pressure sensitivity. That means professional graphic-design-tablet-style pen input, on a phone!
With Samsung already going beyond touch in its GALAXY S4, using sensors to create Minority Report style gesture interaction, it really does raise the question - is there anywhere left for touch to go, and what’s the future of touch control?
Before we look to the future though, it’s worth going back in time.
Switches, buttons, jog wheels. When mobile phones first hit our hands, it wasn’t a touchscreen we were touching, but clickable, physical buttons. The Motorola DynaTAC for example - also known as the “Saved by the Bell” Zack Morris phone! - had plenty of buttons and, in its original form, no screen to speak of.
Things moved on from physical buttons to touchscreens though, with the IBM Simon in 1993 - the first touch phone and, according to most, the very first smartphone. These early touchscreens weren’t what we’ve come to know on modern smartphones; instead they used something called resistive technology.
Resistive screens require pressure when interacting with them – you have to press, and, sometimes, press pretty hard to register an action. Phones and devices with resistive screens generally came bundled with a stylus – a small, usually plastic pen device.
Why? Because it isn’t particularly comfortable swiping on a resistive screen with a finger, as pressure needs to be applied throughout the swipe. Plus, because resistive screens have to flex slightly, glass can’t be used, so manufacturers had to make do with plastic. While these early examples of the tech were called touchscreens, they weren’t much fun to touch, with the plastic scratching easily and feeling anything but premium.
It’s therefore little wonder that capacitive screens took off.
Capacitive screens use the body’s electrical conductivity to register a finger press. To the uninitiated this may sound like science fiction, but it results in a far more comfortable touchscreen experience – and on glass, no less.
While pleasing 99% of users, capacitive touchscreens still fell behind resistive screens in one area and one area alone – precision.
The finger isn’t mightier than the pen in this respect. With resistive screens working with pressure, they were very, very precise when used with a pointed object like a stylus.
Capacitive screens, in contrast, wouldn’t work with pen input. Why? Because they require a distortion in the screen’s electrostatic field. A finger can do this, but as plastic doesn’t conduct electricity, a standard plastic stylus can’t.
It took some out-of-the-box thinking to take touch control to the next level, and with their original Samsung GALAXY Note, Samsung prevailed.
The GALAXY Note coupled a capacitive screen with a Wacom layer, called a digitizer. It recognised when the accompanying S Pen was touching the screen with pixel-precision, making for an unprecedented pen-input experience on a mobile touch device.
In the Samsung GALAXY Note II, Note 10.1 and new Note 8.0, we can now experience the zenith of touch input to date. With an incredible 1024 levels of pressure sensitivity, Samsung has blossomed its relationship with Wacom to produce an incredible, err, fruit.
Bizarre fruit analogies aside, the result is a truly complete pen-input experience that comes considerably closer to pen and paper than ever before, with the many advantages of being digital.
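To make the “1024 levels” figure concrete, here’s a minimal sketch of what it means in practice: the digitizer reports pen pressure as a 10-bit value (0–1023), which an app can then map to something like brush width. The function names and the stroke-width mapping are invented for illustration - this is not Samsung’s or Wacom’s actual API.

```python
# Illustrative sketch only: hypothetical names, not a real pen-input API.

def quantize_pressure(raw: float, levels: int = 1024) -> int:
    """Map a normalised analog pressure reading (0.0-1.0) to a discrete level."""
    raw = min(max(raw, 0.0), 1.0)           # clamp out-of-range readings
    return min(int(raw * levels), levels - 1)

def stroke_width(level: int, min_px: float = 1.0, max_px: float = 12.0) -> float:
    """A drawing app might scale brush width with pressure like this."""
    return min_px + (max_px - min_px) * level / 1023

light = quantize_pressure(0.1)   # a gentle touch -> a low level, thin stroke
hard = quantize_pressure(1.0)    # full pressure -> level 1023, thickest stroke
```

The point of so many levels is granularity: with 1024 steps between “barely touching” and “pressing hard”, a stroke can thicken smoothly as you press, rather than jumping between a few fixed widths.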
So what’s coming next in the world of touch control?
No touch needed, that’s what. Well, it’s actually here already. What are we even talking about? The Samsung GALAXY S4 of course.
With a total of nine sensors packed into the skinny, 7.9mm smartphone, the GALAXY S4 detects everything from ambient lighting through to air pressure.
These sensors even know when your hand is hovering above your phone and when your eyes are watching it, and with software on board taking full advantage of this hovering and watching, cutting-edge touchscreens have never been so touch-free.
Air View on the GALAXY S4, for example, turns a hover over part of the UI into an interaction, previewing gallery folders and emails quickly and easily. Air Gesture, another touchless feature, means a simple waft over your phone will scroll through your gallery - perfect for those wet-handed moments.
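Conceptually, hover-based features like Air View work by giving the UI two tiers of events: a hover (finger sensed above the glass) triggers a lightweight preview, while an actual touch commits to opening the item. This little sketch shows that idea; the event names and handler are made up for illustration, not Samsung’s software.

```python
# Conceptual sketch of hover-vs-touch dispatch, with invented names.

def handle_event(event_type: str, target: str) -> str:
    """Route a sensed interaction to a UI action."""
    if event_type == "hover":         # finger detected above the screen
        return f"preview:{target}"    # e.g. pop up a gallery thumbnail
    if event_type == "touch":         # finger actually contacts the glass
        return f"open:{target}"
    return "ignore"                   # anything else is discarded

handle_event("hover", "inbox")   # shows a preview without a tap
handle_event("touch", "inbox")   # a real tap opens the item
```

The design win is that hovering is non-committal: you can peek at an email or folder without navigating away from where you are.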
Does all of this mean there’s no room to innovate in the world of actual touch technology? Of course not. Right now and over the next year, this sensor-based interaction will be honed in combination with touch input to create the most immersive, natural experience around.
We’ll also be seeing flexible displays arriving on phones soon enough, once again furthering the scope for touch tech.
How would flexible touchscreens work? Imagine if your touchscreen wrapped around the sides of your phone. You wouldn’t need physical buttons like a volume rocker; instead, a swipe along the edge of your phone could raise and lower the volume.
Your phone could easily create dynamic buttons that are context specific too. When you open your camera app for example, a camera button could appear on the side of your phone, and if the phone is in a case with only the top exposed, a small clock could appear on the exposed area. Using Samsung’s AMOLED technology, such flexible touchscreen use would be incredibly power efficient and undeniably useful.
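The edge-swipe volume idea above can be sketched in a few lines: the wrap-around touch surface reports how far your finger travelled, and the phone converts that distance into a volume change. Everything here is hypothetical - the function, the 2mm-per-step scale, and the 0–100 volume range are all invented to illustrate the concept.

```python
# Hypothetical sketch of edge-swipe volume control; all names/values invented.

def adjust_volume(current: int, swipe_mm: float, mm_per_step: float = 2.0) -> int:
    """Swipe up (positive mm) raises volume, swipe down lowers it; clamped 0-100."""
    new = current + int(swipe_mm / mm_per_step)
    return min(max(new, 0), 100)

adjust_volume(50, 10.0)    # a 10mm upward swipe nudges the volume up
adjust_volume(98, 10.0)    # clamped so it can never overshoot the maximum
```

Because the mapping is continuous rather than a two-state rocker, a long swipe could sweep the whole volume range in one gesture while a short nudge makes a fine adjustment.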
So that Minority Report panning and swiping style interaction we’ve dreamed of for years may not be such a distant dream after all! Trust the movies to give us a sneak peek into the future. That said, one thing not even Tinseltown’s finest future-gazing directors predicted was that we’d be doing it all on the go, on our slender phones and in the palms of our hands.