The iPhone 8 Will Turn Us All Into Beta Testers For Apple's Next Revolutionary Product

By James O'Malley

On Tuesday evening Apple is expected to officially unveil its next iPhone. You know the drill by now: Tim Cook and a parade of Apple executives will come on stage, declare it the best iPhone ever, and run through the various tweaks and improvements that make this year's phone better than last year's. We're expecting the screen to be bigger, the processor to be faster... blah blah blah.

So why should I care?

This time, however, even if you're fatigued by the incremental upgrades, there's something to get excited about. The release of the iPhone 8 will mean the launch of iOS 11 - the latest iteration of Apple's mobile operating system - and with it a very cool new set of augmented reality tools for apps to use, called ARKit.

Apple first showed ARKit off in June at its Worldwide Developers Conference, where it demonstrated how developers could use these tools to make apps that place virtual objects in real scenes. Using some very clever software, ARKit detects actual physical surfaces and accurately tracks virtual objects against them. This means that if you place a virtual coffee cup on your real table, ARKit is smart enough to keep the cup pinned to the same spot on the table, however you move your phone.
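For those curious what this looks like in code, here's a minimal sketch of an ARKit plane-detection setup in Swift, based on Apple's public ARKit and SceneKit APIs. The cup-sized cylinder is purely illustrative, and the view outlet is assumed to be wired up in a storyboard, as in Apple's own project template:

```swift
import UIKit
import ARKit

class ViewController: UIViewController, ARSCNViewDelegate {
    @IBOutlet var sceneView: ARSCNView!  // assumed connected in a storyboard

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.delegate = self
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        // World tracking is what keeps virtual objects pinned in place
        // as you move the phone around.
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = .horizontal  // detect tables, floors, etc.
        sceneView.session.run(configuration)
    }

    // Called whenever ARKit detects a new horizontal surface.
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard let planeAnchor = anchor as? ARPlaneAnchor else { return }
        // Stand a cup-sized cylinder on the centre of the detected surface.
        let cup = SCNNode(geometry: SCNCylinder(radius: 0.04, height: 0.1))
        cup.position = SCNVector3(planeAnchor.center.x, 0.05, planeAnchor.center.z)
        node.addChildNode(cup)
    }
}
```

The key idea is the anchor: once ARKit hands your code an ARPlaneAnchor for a real surface, anything you attach to its node stays locked to that surface as the phone moves.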

Developers have had a few months to play with the technology now - and have produced some demos that I find truly astonishing. Watch some of these examples from the Twitter account “Made with ARKit”, and try not to be impressed:

Pretty cool, right? Unfortunately, until iOS 11 launches properly, only developers can try apps like this. But once the iPhone 8 arrives and iOS 11 rolls out, this sort of clever tech will suddenly be in the hands of hundreds of millions of people, the moment they hit “Update Software”.

Why is it significant that our phones will be able to do this?

In my view, AR is perhaps the best candidate we have at the moment for the “next big thing” - the next technology that will prove just as transformative as the original iPhone.

You only have to look at the demos above to see hints of how the tech could be useful. Turn-by-turn navigation superimposed on the actual pavements in front of you. Measuring stuff just by pointing your phone in the right place. It's easy to imagine other apps that, for example, turn your running route into a race track with distance markers, or add live bus times to bus stops as you walk past. What about an app that, when it recognises a friend's face, automatically flags up that note you made about the thing you needed to discuss with them?

Okay, so you might be a bit cynical. Will we really want to be constantly holding up our phones to do this stuff, which - on the surface - isn't completely life-changing? For Apple, whether the technology is popular on the iPhone is almost beside the point: the obvious endpoint is a pair of smart glasses.

If this sort of technology was completely seamless, and was simply built onto our faces for most hours of the day, then of course we would use it - and of course we would find it useful.

A virtual tape measure would, indeed, be useful.
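To give a sense of how simple the underlying maths is: using ARKit's public hit-testing API, measuring boils down to computing the distance between two points the user taps on a detected surface. Here's a sketch, assuming two hit-test results are already in hand - the helper name is mine, not Apple's:

```swift
import ARKit

// The core of an AR "tape measure": the distance in metres between two
// points the user has tapped on a detected real-world surface.
// `distanceBetween` is an illustrative helper, not an ARKit API.
func distanceBetween(_ a: ARHitTestResult, _ b: ARHitTestResult) -> Float {
    // The last column of a hit result's transform is its world position.
    let p1 = a.worldTransform.columns.3
    let p2 = b.worldTransform.columns.3
    return simd_length(simd_float3(p1.x - p2.x, p1.y - p2.y, p1.z - p2.z))
}

// The hit results themselves come from the user's taps, e.g.:
// let results = sceneView.hitTest(tapPoint, types: .existingPlaneUsingExtent)
```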

Apple - in a sense - doesn’t care about ARKit on the iPhone 8; what it cares more about is using us as beta testers for when it does finally get around to inventing some smart glasses. By having millions of users, it’ll be able to spot and iron out bugs in the fundamental tracking technologies, developers will be incentivised to come up with new and exciting use cases, and consumers will be given a hands-on demonstration of how the technology works.

Haven’t we been here before?

Way back in 2013, Google released the first credible attempt at a pair of smart glasses. The promise of Google Glass was that you'd be able to do many of the things you would usually use your phone for - messaging, calls, photography - but with the Glass device strapped to your face.

It’s safe to say that it had a rather… mixed… reception. At the time it invited ridicule because, yes, the people who used it looked ridiculous. And perhaps more substantially, it sparked privacy worries as people became understandably concerned that the “Glassholes” (as they became known) could be photographing or filming them at any time.

Since then, Google has pulled the product from general sale and is now reimagining Glass for niche industrial applications rather than as something everyone will use.

The failure of Glass, though, shouldn't be taken as an indication that smart glasses can never work. Not only is Apple much better at the superficial stuff - making products look cool - but Glass had another fatal flaw: it couldn't do AR. Effectively, it was a glorified Apple Watch, good only for showing notifications and messages. It couldn't do anything clever, like the things we're seeing the beginnings of with ARKit.

I think AR has the potential to change the conversation on smart glasses. First, it will give them a purpose: instead of being a glorified notification machine, something capable of AR strapped to our faces will be self-evidently useful when we're navigating and interacting with the world. So there will be a compelling case to wear them.

Second, it will rebalance the privacy equation. Yes, there will still be a camera pointing out from our faces, and yes, it will still be entirely conceivable that wearers could be filming people (in fact, if AR is to function, the camera being switched on is essential). But the utility will make us more willing to make that trade-off, because we get something genuinely useful in return. This tends to be how new technologies encroach on our privacy. You know how some people worry that Facebook knows too much about us? Or that Google has too much of our data? Inevitably, most of us just accept the loss of privacy because staying connected with our friends, or getting highly relevant search results, is just too useful.

So when Apple finally does release its AR glasses, there will be a period of justified hand-wringing, but then everyone will forget about it and put on a pair of Apple Glasses.

So I’m just testing this out for a future product?

Effectively, yes. Let's face it - phones are not the optimal way to consume AR content. Most initial uses are likely to be fairly trivial, or glorified tech demos. But by playing with these apps we'll be helping Apple - and the other device manufacturers - figure out the nuances and the rules of AR. The companies will be able to use the feedback from millions of people to perfect location tracking, so that virtual objects really do appear glued to their real-world spot.

They will also be able to figure out how we will interact with AR. You know how “pull down to refresh” is pretty much the standardised way of refreshing your emails, your tweets, or whatever other content you're looking at? The first generation of iPhone apps didn't use it - they just had a refresh button. It was only once developers started experimenting with the new platform that it became a thing.
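That convention is now so settled that it's baked into Apple's own UIKit framework. As a rough illustration of how standard it has become, wiring it up today takes only a few lines - the FeedViewController below is a made-up example:

```swift
import UIKit

// "Pull to refresh", once a third-party experiment, is now a
// built-in UIKit component: UIRefreshControl.
class FeedViewController: UITableViewController {
    override func viewDidLoad() {
        super.viewDidLoad()
        let control = UIRefreshControl()
        control.addTarget(self, action: #selector(reload), for: .valueChanged)
        tableView.refreshControl = control  // standard property since iOS 10
    }

    @objc func reload() {
        // Fetch new content here, then hide the spinner.
        tableView.refreshControl?.endRefreshing()
    }
}
```

Whatever the AR equivalents of pull-to-refresh turn out to be, they'll be discovered the same way: by developers experimenting on a new platform.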

And finally, Apple will also be using ARKit to train you. Remember the first time you tried out the Nintendo Wii? Chances are that you were pretty cynical - but after your first game of tennis with the motion controls, you suddenly understood something that had previously only been described to you in the abstract.

By the time the Apple Glasses do arrive — whether that’s in 2018 or 2025 — we’ll all be familiar with AR and how it works. It won’t be a weird new technology, and the second Tim Cook pulls his glasses from his shirt pocket and holds them up at an Apple press conference, he won’t need to explain; we’ll already understand.

James O'Malley is Interim Editor of Gizmodo UK and tweets as @Psythor.