Differential Privacy Explained

This is an interesting overview by Wired of the methods that Apple says it is using to provide the benefits of big data and machine learning without the attendant privacy compromises.

Differential privacy, translated from Apple-speak, is the statistical science of trying to learn as much as possible about a group while learning as little as possible about any individual in it. With differential privacy, Apple can collect and store its users’ data in a format that lets it glean useful notions about what people do, say, like and want. But it can’t extract anything about a single, specific one of those people that might represent a privacy violation. And neither, in theory, could hackers or intelligence agencies.
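To make that trade a little more concrete, here's a minimal sketch of randomized response, a classic local differential privacy technique. This is an illustration of the general idea, not Apple's actual mechanism (which is considerably more elaborate): each user randomizes their own answer before it's ever collected, so any single report is plausibly deniable, yet the aggregate can be de-biased to estimate the true population rate.

```python
import random

def randomized_response(truth: bool) -> bool:
    """Report the truth half the time; otherwise report a fair coin flip.
    Any single report is plausibly deniable (epsilon = ln 3 here)."""
    if random.random() < 0.5:
        return truth                   # answer honestly
    return random.random() < 0.5       # answer with a random coin flip

def estimate_true_rate(reports: list[bool]) -> float:
    """De-bias the aggregate: if p is the true rate of 'yes' answers,
    E[reported yes rate] = 0.5 * p + 0.25, so p = 2 * (observed - 0.25)."""
    observed = sum(reports) / len(reports)
    return 2 * (observed - 0.25)

# Simulate 100,000 users, 30% of whom truly answer "yes".
population = [random.random() < 0.30 for _ in range(100_000)]
reports = [randomized_response(t) for t in population]
print(f"Estimated rate: {estimate_true_rate(reports):.3f}")  # ~0.300
```

The point is that no single report reveals much (a "yes" is only three times likelier to come from a true yes than a true no), while across a large population the noise averages out and the group-level statistic emerges.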

The idea sounds great, but what strikes me is that it's going to be hard to determine whether differential privacy delivers concrete benefits to end users. On the one hand, if it works well, we may never know: it's hard for people to appreciate the non-event of their personally identifiable information never making its way into the wrong hands. On the other hand, if Apple's services end up feeling less smart or less personalized than competitors', it may be hard to tell whether that's because Apple is still playing catch-up in designing for services overall, because of inherent limits to what differential privacy lets the company learn, or for some other reason entirely. This is murky stuff.

Read the full article at wired.com.
