The growing availability of large-scale personal data has made privacy a major challenge. The EU's GDPR legislation for data protection has raised awareness of this challenge and introduced regulatory constraints to protect citizens' data.
Governments and companies collect citizens' or customers' data to learn about their behavior through statistical analyses, which can be useful both for improving services and for sharing with other parties for further benefit. In such scenarios, GDPR requires legal entities that share data to preserve users' privacy.
In this webinar, we will present the principles behind DPella, a tool that perturbs statistical results with randomized noise to protect citizens' privacy by design, while also providing information about the accuracy of the results. DPella relies on Differential Privacy (DP) techniques, and its novelty lies in how it calculates the accuracy of data aggregations. DPella is envisioned as a tool enabling governments and companies to produce open public data from private data that would otherwise remain closed due to privacy restrictions on how it may be handled.
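DPella's own API is not shown here; as a minimal illustration of the underlying idea, the sketch below applies the standard Laplace mechanism of differential privacy to a counting query, together with the usual tail bound relating the noise scale to an error guarantee. The function names and parameter choices are ours, not DPella's.

```python
import math
import random

def laplace_mechanism(true_value, sensitivity, epsilon):
    """Standard DP mechanism: add Laplace noise with scale sensitivity/epsilon.

    A Laplace(0, b) sample is drawn as the difference of two
    Exponential(1/b) samples, using only the standard library.
    """
    scale = sensitivity / epsilon
    noise = random.expovariate(1 / scale) - random.expovariate(1 / scale)
    return true_value + noise

def accuracy_bound(sensitivity, epsilon, beta):
    """Error alpha such that |noise| <= alpha with probability >= 1 - beta."""
    return (sensitivity / epsilon) * math.log(1 / beta)

# Example: a counting query has sensitivity 1 (one person changes the
# count by at most 1). Release it with epsilon = 0.5.
true_count = 1000
noisy_count = laplace_mechanism(true_count, sensitivity=1.0, epsilon=0.5)
alpha = accuracy_bound(sensitivity=1.0, epsilon=0.5, beta=0.05)
print(f"noisy count: {noisy_count:.1f} "
      f"(within ±{alpha:.1f} of the true count with 95% probability)")
```

The accuracy bound here is the simple tail bound for a single Laplace release; computing tight accuracy estimates for compositions of many aggregations is precisely the harder problem that DPella addresses.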
This talk is based on the article "A Programming Framework for Differential Privacy with Accuracy Concentration Bounds" by Elisabet Lobo-Vesga, Marco Gaboardi, and Alejandro Russo, presented at the flagship IEEE Symposium on Security and Privacy 2020.
Speaker Alejandro Russo
Alejandro Russo is a professor at Chalmers University of Technology working at the intersection of functional languages, security, and systems. His research ranges from foundational aspects of security to practical ones. Prof. Russo has worked at prestigious research institutions such as Stanford University, where he was appointed visiting associate professor from 2013 to 2015.