Apple got caught up in a game of “me too” during this year’s WWDC keynote address, but if you listen closely, you notice that it’s starting to pitch products not just on their design or their power, but on how they protect a user’s privacy. Meet Apple’s new competitive point of difference.
Go back through the keynote and you notice that after almost every announcement of a feature that analyses a user's data to present new information (smarter Spotlight, Proactive Assistance in Siri, Mail updates, Maps updates and so on), Apple doubles back to assure users that the data being analysed isn't being transmitted back to the mothership.
It’s curious, because it means Apple doesn’t get as much data on its users as it might like. Apple would probably love to know more about what its users are doing on their devices. Any company would. By studying how customers really use their devices and mining that so-called “big data”, companies can figure out heaps of things: how to sell to different people, which features to pursue next, which features need improving, and lots more.
By sacrificing some of that big data potential on the iPhone and iPad, Apple is positioning itself as one of the most secure gadget companies in the world.
It’s a sales journey Apple definitely needed to go on, too. There have been security flaws in Apple’s support channels that cost a journalist his digital life, an iCloud hack targeting Australian users, and a breach of cloud-stored data that saw celebrity photos leaked online. Apple had to do something to tell users it’s safe to store their stuff in Cupertino again.
It’s even interesting to look at how Apple sold security before these major breaches occurred. Take TouchID, for example. When Apple unveiled its fingerprint security system nestled under the home buttons of new gadgets, everyone was a tad wary about lending their biometric data to the fruit stand. Security experts said passwords were still more secure, because if your fingerprint data is stolen, you can’t just go and make up a new fingerprint. Apple carefully explained that TouchID data lived on the device and was never transmitted to the cloud. People nodded their heads, reasonably assured that this was a safe way to secure their devices.
Jump forward to this week’s WWDC keynote, and a slide on the security and privacy functions of Siri’s new Proactive Assistant features drew a round of applause from developers. Apple’s Craig Federighi said that all the information for Proactive Siri lives on a user’s device, not in the cloud, meaning that unlike Google Now, Apple isn’t mining that data for insights on its users. It’s arguably less serious than the biometric security of TouchID, but it still drew a massive round of applause, because Apple said it’s different from “the other guys”, meaning Google and Microsoft.
And that’s not the only privacy feature announced by Apple at WWDC. Apple talked about everything from two-factor authentication for passwords right down to implementing six-digit passcodes on iPhone lock screens.
Now, it’d be naive to think that Apple isn’t collecting any information on how customers use their devices. It’s certainly collecting some, but it wants people to know that the information being collected isn’t personally identifiable or potentially sensitive.
Security talk from Apple used to be a footnote in presentations like this: features mentioned in passing to assure users it was safe to keep their digital lives on an Apple device. Now it’s announced like a headline feature, and it’s a key competitive advantage for Cupertino. Whether it keeps all of our data safe from the black hats in future is a question only time can answer.
Luke Hopewell is the editor of Gizmodo Australia, where this story was originally published. He travelled to WWDC 2015 as a guest of Apple.