Apple Promises 'Differential Privacy'
Security, Privacy Upgrades Follow FBI's Attempted Crypto Smackdown
Apple wants to sell you on "differential privacy."
Apple uses its annual Worldwide Developers Conference to preview forthcoming software and hardware for developers, and this year's event included a number of interesting-looking announcements: building Siri into Apple's desktop operating system, extending Apple Pay to websites, an upgraded Home app to control your personal Internet of Things devices, and better facial recognition for photos.
On the crypto front, Apple's all-new Apple File System, a.k.a. APFS - previewed alongside OS X 10.12, code-named Sierra, and due out next year - will feature native encryption capabilities. APFS will run on everything from Apple's watches and televisions to laptops and desktops, and will allow developers to use single-key or multi-key encryption to protect data even if a device's physical security gets compromised.
But among all of the announcements, one of the most puzzling is surely the promise of incorporating "differential privacy" into iOS 10, which Apple watchers expect to be released in September.
Promise: Privacy at Scale
Craig Federighi, Apple's senior vice president of software engineering, told the WWDC audience that on the iOS privacy front, Apple already attempts to do as many computations as possible on the device, and also uses end-to-end encryption for all iMessage and FaceTime messages.
Going forward, however, Apple says it wants to offer what amounts to privacy at scale - for example, spotting patterns in how users are searching online, making better suggestions via the QuickType keyboard, or offering related links.
"We believe you should have great features and great privacy," Federighi told the WWDC crowd. "Differential privacy is a research topic in the areas of statistics and data analytics that uses hashing, subsampling and noise injection to enable this kind of crowdsourced learning while keeping the information of each individual user completely private."
Differential privacy essentially means being able to compare two versions of a data set - one that includes your personal information, and one that has it stripped out - and guaranteeing that the results of any analysis look nearly identical either way, says Johns Hopkins University cryptography professor Matthew Green in a blog post (see Top 10 Data Breach Influencers). He adds that this guarantee is typically achieved by adding random noise to the mix.
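Apple hasn't published the exact mechanism it will use, but the "noise injection" Federighi describes can be illustrated with "randomized response," a classic technique in this family. The sketch below is illustrative only - it is not Apple's algorithm - and shows how each individual's report stays deniable while the aggregate statistic remains recoverable:

```python
import random

def randomized_response(truth):
    """Report the truth half the time; otherwise answer with a coin flip.

    Any single report is plausibly deniable, yet the noise is
    statistically predictable, so aggregates stay accurate.
    """
    if random.random() < 0.5:
        return truth                   # answer honestly
    return random.random() < 0.5       # answer at random

def estimate_true_rate(reports):
    """Invert the noise: observed rate = 0.5 * true rate + 0.25."""
    observed = sum(reports) / len(reports)
    return (observed - 0.25) / 0.5

# Simulate 100,000 users, 30% of whom have some sensitive attribute.
random.seed(1)
reports = [randomized_response(random.random() < 0.30) for _ in range(100_000)]
print(f"estimated rate: {estimate_true_rate(reports):.2f}")  # lands near 0.30
```

No eavesdropper who sees one user's report can tell whether it was honest or random, but across a large population the collector can subtract out the known noise and recover the overall rate.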
The Usual Secret Sauce
What's unclear so far, however, is how Apple will implement this system, and whether it will stand up to security researchers' scrutiny - although Apple says it has vetted its approach with Aaron Roth, a professor in the University of Pennsylvania's computer science department who has literally written a book on differential privacy.
"With a large dataset that consists of records of individuals, you might like to run a machine learning algorithm to derive statistical insights from the database as a whole, but you want to prevent some outside observer or attacker from learning anything specific about some [individual] in the data set," Roth tells Wired. "Differential privacy lets you gain insights from large datasets, but with a mathematical proof that no one can learn about a single individual."
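The "mathematical proof" Roth mentions has a precise form in the research literature (Apple has not said which variant it uses). In the standard formulation, a randomized algorithm M is ε-differentially private if, for any two data sets D and D' that differ in a single individual's record, and any set of possible outputs S:

```latex
\Pr[M(D) \in S] \le e^{\varepsilon} \cdot \Pr[M(D') \in S]
```

The smaller ε is, the less any observer of the output can tell whether a given individual's data was included at all.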
Some other researchers say they are waiting to see full details relating to how Apple has implemented its system before pronouncing it effective or not. "Unfortunately Apple isn't known for being terribly open when it comes to sharing the secret sauce that drives their platform, so we'll just have to hope that at some point they decide to publish more," Johns Hopkins' Green says.
On the upside, however, Apple didn't have to do this, Green says, adding that any moves in this direction offer reason for cautious optimism. "It sure looks like Apple is honestly trying to do something to improve user privacy, and given the alternatives, maybe that's more important than anything else," he says.
Life After the Apple-FBI Crypto Fight
Of course, it's impossible to view Apple's differential privacy move without making reference to one of the potential catalysts: the Snowden revelations. Those began three years ago and have revealed mass surveillance campaigns conducted by U.S. and U.K. intelligence agencies, including programs designed to crack or suborn Apple's operating systems - both OS X and iOS - as well as attempts by intelligence agencies to exploit even minor weaknesses to track or spy on targets.
Earlier this year, meanwhile, the U.S. Justice Department attempted to force Apple to create a version of iOS that would allow the FBI to access the contents of an iPhone that had been issued to San Bernardino shooter Syed Rizwan Farook. Apple CEO Tim Cook vowed that Apple would fight the "dangerous" move, and the FBI ultimately backed off, although experts say related legal battles have only been temporarily deferred (see Silicon Valley: Crypto Debate Continues). In the interim, the world's largest technology company shows no signs of slowing down on its promise to deliver devices that provide greater levels of privacy and security for users.