One of the wonderful things about getting your first iPhone was the sheer self-sufficient simplicity of the thing—here was a device that served as a map, MP3 player, notebook, phone, and anything else you might need, all crunched into a beautiful little package. But if this year's WWDC was any indication, that era of autonomous Apple devices is nearing an end.
A lot has already been said about the graphic design changes made by Jony Ive in iOS 7, from the app icons, to the colors, to the typography itself. But there are plenty of other things to be gleaned from this new iOS. Underneath the cosmetic changes—which look far more radical than they act—we can see the faint outline of a foundation being laid for a new generation of mobile devices.
Parallax looks cool and all, but it’s also just a taste of what’s to come from iOS 7’s dynamic motion effects in UIKit. Connecting animated UI elements to the accelerometer turns them into near-physical objects that respond to the gestures and movements of the user.
According to Matt Webb, the founder of London UX studio Berg, we’re seeing the emergence of a UI hybridised with a physics engine. “As flat as the visual design of iOS 7 is, this is where Apple's skeuomorphs have gone: Into the behavior of the UI itself,” he said over email. “There's a whole new design language here, just waiting to be discovered.”
Once developers start to unpack and explore this new language, we'll begin to see all of the ways in which this physics-based UI can be utilised. Some have called parallax a gimmick, but it's really more of an appetizer for the many ways the accelerometer could make the iOS interface adaptable to the world around it. Imagine if your phone knew to switch into a simplified mode when you're on your bike, or in the car. Or if you could shake away notifications. Or if on-screen objects reacted to light, motion, and touch the same way off-screen objects do.
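To make the physics-engine idea concrete, here’s a minimal sketch, in plain Swift rather than Apple’s actual UIKit Dynamics API, of how a tilt reading from the accelerometer could drive a spring-anchored UI element. Every name below is hypothetical and purely illustrative:

```swift
// Hypothetical sketch: a UI element whose on-screen offset is driven by
// device tilt through a damped spring, mimicking the "physical" feel of
// iOS 7's motion-based UI. These are not Apple API names.
struct TiltSpringElement {
    var offset: Double = 0      // current horizontal offset in points
    var velocity: Double = 0    // points per second
    let stiffness: Double = 40  // spring constant pulling toward the target
    let damping: Double = 10    // friction that settles the motion

    // Advance the simulation by dt seconds, given a tilt reading in the
    // range -1...1 (e.g. the accelerometer's x-axis).
    mutating func step(tilt: Double, dt: Double) {
        let target = tilt * 20                        // tilt maps to a 20-point max shift
        let springForce = stiffness * (target - offset)
        let dragForce = -damping * velocity
        velocity += (springForce + dragForce) * dt
        offset += velocity * dt
    }
}

var element = TiltSpringElement()
for _ in 0..<600 {                                    // one second at 600 Hz, held at half tilt
    element.step(tilt: 0.5, dt: 1.0 / 600.0)
}
print(element.offset)                                 // settles near the 10-point target
```

The point of the spring-and-damping pair is that the element overshoots slightly and settles, rather than snapping into place—the behavioural skeuomorphism Webb describes.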
Image via Business Insider
Most of us reached our functional limit for iOS 7 criticisms sometime last week, but there are a few interesting tidbits that may tell us something about the future iPhone 5S or 6. As Philippe Azimzadeh has pointed out, plenty of the new UI details deal with colour. For example, certain graphic elements will change according to the colour of the background behind them. Even the apps themselves have been given their own distinct “mood” colours.
Image by Martin Hajek
All of this hints at the introduction of multi-coloured iPhone or iPad bodies, with iOS skins that users could customise to complement them. Down the line, the UI colour scheme and hardware colour schemes might be completely interchangeable. In fact, it seems as though Apple is attempting to make every UI element on the homescreen “disappear,” by presenting the control centre text over a semi-transparent lens, and by unifying the colour scheme between navigation and status bar. Only apps themselves will have a consistent palette.
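Background-aware colouring of the kind described above comes down to a simple calculation. Here’s an illustrative Swift sketch that picks a legible foreground based on the luminance of the backdrop, using the standard Rec. 709 luma weights; the type and function names are hypothetical, not Apple’s:

```swift
// Illustrative sketch of background-aware UI colouring: choose light or
// dark foreground text based on the perceived brightness of the backdrop.
struct RGB { let r, g, b: Double }   // components in 0...1

// Rec. 709 luma weighting: green contributes most to perceived brightness.
func luminance(_ c: RGB) -> Double {
    0.2126 * c.r + 0.7152 * c.g + 0.0722 * c.b
}

// Pick a legible foreground for a given background colour.
func foreground(for background: RGB) -> String {
    luminance(background) > 0.5 ? "dark" : "light"
}

print(foreground(for: RGB(r: 0.95, g: 0.95, b: 0.9)))  // "dark" text on a pale backdrop
print(foreground(for: RGB(r: 0.1, g: 0.1, b: 0.3)))    // "light" text on deep blue
```

A real implementation would blend tints rather than flip between two values, but the principle—UI chrome reacting to whatever sits behind it—is the same.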
Apple is built upon a very specific design ethos: each product has its own definite hard- and software language and appearance. But personalisation may be the future.
There's been plenty of speculation that the forthcoming iPhone 5S (or 6) will incorporate new sensors. Those could include a fingerprint sensor, which would tailor the iOS interface to users based on biological data, or an altitude sensor, which would increase the precision of location-based services and apps. The future iPhone UI could adapt your security settings based on your location—whether at home, at work, or out in public.
Nick Bilton hinted at this a while back when he interviewed a sensor engineer for The New York Times. "One way to [increase security and privacy on mobile phones with sensors] is to build software that detects how you hold and interact with the device—almost like a motion fingerprint," he wrote. "After you use a new phone for a short period of time, it will start to learn your patterns and automatically lock or unlock the phone accordingly. This could be used for more secure banking too."
We may be about to see that prediction come true. The swipe-anywhere unlock gesture in iOS 7 could be a step towards introducing fingerprint-based security, for example. And the newfound adaptability of the UI—using the accelerometer—paves the way for developers to incorporate other sensor data into UI behaviors, including security based on location.
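Bilton’s “motion fingerprint” idea can be sketched as a toy classifier: learn the typical spread of a user’s accelerometer readings, then flag readings that fall far outside it. The Swift below is purely illustrative—a statistical toy, not a real security mechanism or Apple API:

```swift
// Toy sketch of a "motion fingerprint": record accelerometer magnitudes
// during a learning period, then treat readings far outside the learned
// distribution as unfamiliar. Illustrative only.
struct MotionFingerprint {
    private var samples: [Double] = []

    mutating func learn(_ magnitude: Double) {
        samples.append(magnitude)
    }

    // A reading "matches" if it falls within three standard deviations
    // of what was seen during learning.
    func matches(_ magnitude: Double) -> Bool {
        let mean = samples.reduce(0, +) / Double(samples.count)
        let variance = samples.map { ($0 - mean) * ($0 - mean) }
                              .reduce(0, +) / Double(samples.count)
        let sigma = max(variance.squareRoot(), 0.01)  // floor avoids zero spread
        return abs(magnitude - mean) <= 3 * sigma
    }
}

var fp = MotionFingerprint()
for m in [1.0, 1.1, 0.9, 1.05, 0.95] { fp.learn(m) }  // the owner's usual motion
print(fp.matches(1.02))   // true: consistent with the learned pattern
print(fp.matches(4.0))    // false: an unfamiliar motion profile
```

A production system would use far richer features than raw magnitudes, but the lock-or-unlock decision Bilton describes reduces to exactly this kind of learned-pattern comparison.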
From Greek gods to modern computers, humans tend to mold their inventions in their own image. Mobile devices are no different: Traditionally they encase a series of systems in a single shell. But that’s quickly changing, since there’s no reason that hardware peripherals need to be sandwiched within the same device. At WWDC, we got a taste of how Apple is opening up iOS to pull data from new hardware platforms—for example, the new iOS 7 car app, which will connect your phone to your in-car dash.
Industrial designer and Gizmodo contributor Don Lehman describes these as “hardware apps,” and explains that this could be where Apple delves into smart watch territory, if it does at all. “One of the biggest points of design for smart watches is battery consumption: the more capabilities you add to a device, the more power it consumes,” he says. “But if the device acted as a peripheral to the smartphone, it could offload some of those capabilities. That also allows you to keep cost and size down.”
We’re already beginning to see the emergence of a UI that would support a series of peripherals. On iOS 7’s pull-down notification centre, for example, we’re given a quick read of the weather, upcoming iCal events, and other disparate apps. That functionality could carry over to display info from hardware platforms.
We can’t talk about hardware apps without talking about what’s linking all these peripherals to our phones. “Just like Apple's ‘Digital Hub’ strategy from the last decade," says Webb, "they're putting the pieces in place to become the ‘Internet of Things’ hub too." iOS 7 will support AirDrop, finally enabling iOS users to exchange files between devices without a shared network or internet connection. On the AirPlay side of things, 9to5 Mac reports that Apple is also planning to open up AirPlay Audio to developers, allowing them to leverage audio on any hardware platform, rather than just certain types of audio devices.
One of the less-discussed features unveiled at WWDC this year was something called the Apple Notification Center Service, which will allow your phone to push notifications to any other Bluetooth-enabled hardware. The ANCS is what will allow your phone to push iCal notifications directly to, say, a Bluetooth-enabled smart watch, or headphones, or wireless speakers. That sounds fairly small, but according to Webb, the ANCS is an early indication that Apple is laying the groundwork for a network of smart devices that are in constant contact.
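The relay pattern ANCS enables is essentially a hub-and-spoke broadcast: the phone holds the notifications, and every paired accessory that registers gets a copy. Here’s a toy Swift model of that pattern—the types and names are hypothetical sketches, not the actual ANCS Bluetooth protocol:

```swift
// Toy model of the ANCS relay pattern: the phone acts as a hub, and each
// registered accessory receives every pushed notification. Hypothetical
// names; the real ANCS is a Bluetooth LE GATT service, not a Swift API.
struct PushNote {
    let app: String
    let message: String
}

final class NotificationHub {
    private var peripherals: [String: (PushNote) -> Void] = [:]

    // An accessory registers a handler to receive forwarded notifications.
    func register(name: String, handler: @escaping (PushNote) -> Void) {
        peripherals[name] = handler
    }

    // The phone pushes one notification to every registered accessory.
    func push(_ note: PushNote) {
        for handler in peripherals.values { handler(note) }
    }
}

var watchLog: [String] = []
var speakerLog: [String] = []

let hub = NotificationHub()
hub.register(name: "watch")   { watchLog.append("\($0.app): \($0.message)") }
hub.register(name: "speaker") { speakerLog.append($0.message) }

hub.push(PushNote(app: "Calendar", message: "Meeting at 3pm"))
print(watchLog)    // ["Calendar: Meeting at 3pm"]
print(speakerLog)  // ["Meeting at 3pm"]
```

Each accessory decides for itself how much of the notification to render—a watch might show app and message, a speaker might only chime—which is what makes a single hub-side push sufficient for a whole ecosystem of peripherals.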
All of these newly connected devices will need a central iOS 7 command centre, too. Days before WWDC even happened, Webb speculated that Apple will eventually introduce an app to collect all of the hardware peripherals in iOS. He named this imaginary feature “Nightstand” (analogous to Newsstand) and described it as a “virtual table for physical things.” It’s easy to imagine iOS 7’s new control centre fulfilling this function, too. Meanwhile, a combination of AirDrop, AirPlay, and Bluetooth will work together to form a “connective tissue of a peripheral ecosystem around smartphones, just as USB was for peripherals around the PC.”
Of course, speculation is just that: speculation. But in many ways, WWDC was the opening salvo to an entirely new kind of Apple product ecology—one that is far more flexible and immersive than anything we’ve seen before. Future iPhones and iPads won’t be phones so much as brains, acting as central processors that suck data from dozens of sensors and hardware apps, communicating through an interface that adapts to the world around it.