There are no practical applications yet for AR—your mum isn’t using it to navigate Tesco, for example—but AR has become wildly popular in the tech community. Slap AR on a pitch and get some VC funding. Or slap AR on the side of a product and bask in the AR buzz from tech publications. Bose, a company known for nice headphones and not for nice user interfaces or computers, is the latest to embrace AR. It recently announced a plan to fund AR startups through the new Bose Ventures, but more importantly, it announced a platform that includes AR glasses and, Bose hopes, a new way to interact with AR content—and thus your world.
You’ll have to forgive me for needing a minute to wrap my head around this. Bose is known for making some of the best active noise-cancelling headphones on the market, and currently, our favourite truly wireless earbuds. It is not known for its work in augmented reality, apps, or user interface design. And those are the three cruxes of its new platform.
The platform will allow app developers to access the guts of Bose’s AR devices and use Bose’s new user interface, which is completely hands-free and relies on your voice and head tilts. Yelp, TuneIn, Strava and TripAdvisor are all partners developing for the new AR platform. Bose has decided to announce the platform and hardware at SXSW, a conference not known for its big hardware news, to drum up interest from other developers.
At this time, Bose is making the platform available on a pair of as-yet-unnamed headphones and on sunglasses that use tiny speakers to pipe sound into your ears without letting the rest of the world listen in. The glasses also connect to your phone via Bluetooth and have motion sensors built in. Sadly, the developer kit glasses will be tinted only; people who wear prescription lenses are SOL.
According to John Gordon, vice president of consumer electronics at Bose, this combination of sensors should enable the glasses (or headphones, or future helmets, goggles, and ski masks) to know precisely where you’re looking without the need for a camera or other visual aid. He points to the example of standing on the Seine in Paris, where you can look in one direction and see Notre Dame, then turn your head a little to see the Louvre. But that is, for the most part, a fairly mundane example. The GPS in your phone already does a good job of working out what you’re looking at when the target is as large as a cathedral or a palace.
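Bose hasn’t published its SDK, so the details of how GPS and head orientation combine are anyone’s guess, but the idea Gordon describes can be sketched in a few lines: take a GPS fix and a compass-style head yaw from the motion sensors, compute the bearing to each nearby landmark, and pick the one the wearer is facing. Every name, coordinate, and parameter here is an invented illustration, not Bose’s API.

```python
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial compass bearing from point 1 to point 2, in degrees 0-360."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return (math.degrees(math.atan2(y, x)) + 360) % 360

def landmark_in_view(user_lat, user_lon, head_yaw_deg, landmarks, tolerance=30):
    """Return the landmark whose bearing best matches the head yaw, or None."""
    best, best_err = None, tolerance
    for name, (lat, lon) in landmarks.items():
        # Smallest signed angle between the bearing and where the head points.
        err = abs((bearing_deg(user_lat, user_lon, lat, lon) - head_yaw_deg + 180) % 360 - 180)
        if err < best_err:
            best, best_err = name, err
    return best

# Gordon's example: standing on the Seine, roughly between the two landmarks.
POIS = {"Notre-Dame": (48.8530, 2.3499), "Louvre": (48.8606, 2.3376)}
print(landmark_in_view(48.8550, 2.3450, 120, POIS))  # facing south-east -> Notre-Dame
print(landmark_in_view(48.8550, 2.3450, 320, POIS))  # facing north-west -> Louvre
```

The hard part, of course, is the part this sketch waves away: getting a head yaw accurate enough that a 30-degree tolerance is meaningful, which is exactly why signs and paintings are still out of reach.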
What about signs or something smaller, like looking at paintings in a gallery? Gordon tells Gizmodo the glasses aren’t quite there yet. “We expect that to keep getting better over time.”
Instead, the platform will allow you to do a workout with Strava while listening to music over Bluetooth—something Bose and Jaybird already do with fitness headphones. But Gordon is quick to say that this isn’t just voice control, which digital assistants have been doing, with varying degrees of success, for a few years now.
“It’s not just the voice side. It’s the voice and the head movements that now enable you to do something as transformative as swiping and scrolling on a smartphone.” As far as Gordon is concerned, “this is a whole new interaction pattern for a different type of interface.”
And he’s not wrong. Amazon is reportedly working on similar AR glasses, and Vuzix showed off its Alexa-powered glasses at CES. Intel announced AR glasses with a similar user interface last month: a combination of head movements and voice control.
As AR becomes more popular, hardware makers are going to need to think long and hard about how we should interact with these new systems. Voice control—itself still in its infancy as far as user interface design is concerned—won’t be sufficient. The world is never going to be filled with billions of people wearing glasses and shouting “skip track” on the subway to move to the next song on their playlist.
It’s unlikely people are going to run around wearing a glove or holding their hands in front of their faces to interact with these new types of computers—no matter what Iron Man 3 and Minority Report have to say on the subject. A new and fundamental change to how we interact with computers will likely be necessary. AR glasses need their own version of a mouse or pinch-and-zoom—the things that radically altered how we interact with desktops and mobile phones respectively.
But I have no idea if Bose has actually succeeded. I haven’t tried out the glasses or headphones, which are currently being demoed in Austin, Texas at SXSW. But we’ll all have a better idea of what Bose is doing—and what it’ll do next—when the developer kit, including a pair of glasses, is available later this year.