Abstract
Kobbi Nissim
Georgetown University
Differential privacy is a robust concept of privacy. It brings mathematical rigor to the decades-old problem of privacy-preserving analysis of collections of sensitive personal information. Informally, differential privacy requires that the outcome of an analysis remain stable under any possible change to a single individual's information, and hence protects individuals from attackers who try to learn the information particular to them.
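In its most common formal variant (stated here for reference; the tutorial's own presentation may differ), this stability requirement reads: a randomized mechanism $M$ is $\varepsilon$-differentially private if for every pair of datasets $x, x'$ differing in one individual's record and every set $S$ of possible outcomes,
\[
\Pr[M(x) \in S] \;\le\; e^{\varepsilon} \cdot \Pr[M(x') \in S].
\]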
In this tutorial, we will motivate and interpret differential privacy as a privacy concept; we will mention some of the connections between differential privacy and other research areas, focusing on statistics and learning; and we will review some of the work towards bringing differential privacy to practice and the challenges in doing so. The presentation will be (mostly) self-contained and geared towards providing background for the workshop's breakout discussions.