Today we witnessed an exciting moment for privacy, CS theory, and many friends and contributors of this blog. The definition of differential privacy, first articulated in a TCC paper just 10 short years ago, became a top-level feature of iOS, announced today at the Apple keynote address. Check this out for yourself:
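For readers who have not seen it, the definition from that TCC 2006 paper can be stated as follows (this is the standard formulation, not anything specific to Apple's system): a randomized mechanism $M$ is $\varepsilon$-differentially private if, for all pairs of datasets $D, D'$ differing in a single record and all sets of outputs $S$,

$$\Pr[M(D) \in S] \;\le\; e^{\varepsilon} \, \Pr[M(D') \in S].$$

Informally, no single person's data can change the distribution of the mechanism's output by more than a factor of $e^{\varepsilon}$.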
You may be as intrigued as I am by the bold claim of Prof. Aaron Roth, who said that
Incorporating differential privacy broadly into Apple’s technology is visionary, and positions Apple as the clear privacy leader among technology companies today.
Learning more about the underlying technology would benefit the research community and assure the public of the validity of these statements. (We, at Research at Google, are trying to adhere to the highest standards of transparency by releasing Chrome’s front-end and back-end for differentially private telemetry.)
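Chrome's telemetry system is built on randomized response, the classic mechanism for locally differentially private data collection. As a concrete illustration of the idea, here is a minimal sketch; the function names and the choice of parameter are my own, not Apple's or Chrome's:

```python
import random

def randomized_response(truth: bool, p: float = 0.75) -> bool:
    """Report the true bit with probability p, otherwise report its flip.

    This satisfies epsilon-local differential privacy with
    epsilon = ln(p / (1 - p)); p = 0.75 gives epsilon = ln(3).
    """
    return truth if random.random() < p else not truth

def estimate_true_fraction(reports: list[bool], p: float = 0.75) -> float:
    """Unbiased estimate of the true fraction of 1-bits from noisy reports.

    The expected observed fraction is p*f + (1-p)*(1-f), where f is the
    true fraction; we invert that affine map.
    """
    observed = sum(reports) / len(reports)
    return (observed - (1 - p)) / (2 * p - 1)
```

Each user's report is individually deniable, yet across many users the aggregator can still recover population statistics accurately, which is exactly the trade-off the telemetry setting needs.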
I am confident this moment will come. For now, our heartfelt congratulations to everyone, inside and outside Apple, whose work made today’s announcement possible!
Transparency is of course good, but one of the advantages of differential privacy is that it isn’t strictly necessary. Namely, as long as (i) Apple are clear about the privacy definition (and parameters) they are using and (ii) you trust that they correctly implemented this definition, then you don’t need to know any more details. I’m of course keen to learn more details, but this is great news anyway.
Making an exception in this case would go against standard practice in security and cryptography. In some sense, it’d be even worse than letting Apple get away with claims that they offer encryption without telling us how. In the case of on-device intelligence the adversary is the data aggregator itself, i.e., Apple, which makes “trusting” them all the more problematic. (There are, of course, practical and pragmatic limitations to disclosures, but the default in the space of consumer privacy must be openness, and all exceptions must be well justified and thought through.)
Of course transparency is good! Ideally, every line of code that runs on our devices would be open to inspection. However, that is not how things are and we must trust Apple to do as they say.
The point I’m making is that, assuming you trust them (which must already be the case if you use their products), then differential privacy is a *much* clearer and stronger guarantee than a vague promise to respect privacy, which is currently the standard.
The video is ~2 hrs long. Can you be more specific about what we should “check out”?
Yeah, the video links to an intro to coding app… and it’s not April 1st!
Thank you for flagging it! (They must have re-uploaded the video, or something.) I updated the embedded link, but in case it drifts out of sync again, the relevant part is at the 1h:40m:53s mark.
It starts around 1:41:00. The link is apparently to the end of the segment rather than the beginning.