At its WWDC keynote last night, Apple announced a host of new features for iOS, including more predictive software like QuickType and an advanced Spotlight search. To make these features work, Apple’s machine-learning systems will need to analyse data from many users at once, so Apple announced it is adopting a technique called differential privacy. But a cryptography expert isn’t sure the experimental technique is ready for primetime.
“One of the important tools in making software more intelligent is to spot patterns on how multiple users are using their devices,” Apple’s Craig Federighi said at the WWDC 2016 keynote. “Differential privacy is a research topic in the area of statistics and data analytics that uses hashing, subsampling, and noise injection to enable this kind of crowdsourced learning while keeping the information of each individual user completely private.”
Basically, Apple will inject noise into the data it collects from its users, making it difficult to identify any single user.
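Apple hasn’t published the details of its mechanism, but a classic textbook example of this idea is “randomized response”: each user sometimes reports the truth and sometimes reports a coin flip, so any individual answer is deniable, yet the population-level rate can still be recovered. The sketch below (our illustration, not Apple’s actual implementation) shows the principle:

```python
import random

def randomized_response(true_answer: bool) -> bool:
    """Report the true answer half the time; otherwise answer with a
    fair coin flip. Any single report is deniable, but the aggregate
    still reveals the population-level rate."""
    if random.random() < 0.5:
        return true_answer           # tell the truth
    return random.random() < 0.5     # answer at random

def estimate_true_rate(reports: list[bool]) -> float:
    """Invert the noise: if p is the observed fraction of 'yes'
    reports, the estimated true rate is 2p - 0.5."""
    p = sum(reports) / len(reports)
    return 2 * p - 0.5

# Simulate 100,000 users, 30% of whom have some sensitive attribute.
random.seed(42)
reports = [randomized_response(random.random() < 0.3) for _ in range(100_000)]
print(round(estimate_true_rate(reports), 2))  # close to the true 30% rate
```

The point is that the analyst never learns whether any particular user’s report was genuine, yet with enough reports the aggregate estimate converges on the truth.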
Note how Federighi refers to differential privacy as “a research topic”. Matthew Green, a cryptography professor at Johns Hopkins University, thinks that differential privacy is not only relatively untested but possibly dangerous. During the keynote, Green posted a series of sceptical tweets about Apple’s use of differential privacy, including this one:
Most people go from theory to practice, then to widespread deployment. With Differential Privacy it seems Apple cut out the middle step.
— Matthew Green (@matthew_d_green) June 13, 2016
Of course, exactly how secure Apple’s version of differential privacy will be depends on how Apple actually implements it, and for now the company has offered no details. While Apple previously kept all of your data on your device, the new iOS features will analyse user data in aggregate.
Apple explained differential privacy in an email to Gizmodo:
Starting with iOS 10, Apple is using Differential Privacy technology to help discover the usage patterns of a large number of users without compromising individual privacy. To obscure an individual’s identity, Differential Privacy adds mathematical noise to a small sample of the individual’s usage pattern. As more people share the same pattern, general patterns begin to emerge, which can inform and enhance the user experience. In iOS 10, this technology will help improve QuickType and emoji suggestions, Spotlight deep link suggestions and Lookup Hints in Notes.
“So the question is, what kind of data, and what kind of measurements are they applying it to, and what are they doing with it,” Green told Gizmodo. “It’s a really neat idea, but I’ve never really seen it deployed. It ends up being a tradeoff between accuracy of the data you are collecting and privacy.
“The accuracy goes down as the privacy goes up, and the tradeoffs I’ve seen have never been all that great,” Green continued. “[Again] I’ve never really heard of anyone deploying it in a real product before. So if Apple is doing this they’ve got a custom implementation, and they made all the decisions themselves.”
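The tradeoff Green describes is quantifiable. In the standard formulation, a privacy parameter (usually written epsilon) controls how much noise is added: smaller epsilon means stronger privacy and noisier answers. A minimal sketch, assuming the textbook Laplace mechanism for a simple counting query (again, not something Apple has confirmed using), makes the tradeoff visible:

```python
import math
import random
import statistics

def laplace_noise(scale: float) -> float:
    """Sample from Laplace(0, scale) via the inverse-CDF method."""
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1 - 2 * abs(u))

def private_count(true_count: int, epsilon: float) -> float:
    """A counting query changes by at most 1 when one user is added or
    removed, so Laplace(1/epsilon) noise gives epsilon-differential
    privacy for the released count."""
    return true_count + laplace_noise(1.0 / epsilon)

# Smaller epsilon (more privacy) means larger average error.
random.seed(0)
for epsilon in (0.01, 0.1, 1.0):
    errors = [abs(private_count(1000, epsilon) - 1000) for _ in range(2000)]
    print(f"epsilon={epsilon}: mean error ~ {statistics.mean(errors):.1f}")
```

Running this shows the mean error shrinking roughly a hundredfold as epsilon grows from 0.01 to 1.0, which is exactly the accuracy-for-privacy exchange Green is talking about.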
After developing its differential privacy technology, Apple showed it to Aaron Roth, an associate professor of computer science at the University of Pennsylvania. Roth offered a complimentary but vague statement, which Apple displayed during the keynote.
But we won’t really know how well this new feature protects your privacy until Apple provides more details on how it’s done. We do know that Apple is opening up its software more than ever, and we can assume that many hackers will try to find new ways to break through its historically strong security.