PPS UMR 7126 – Laboratoire
Preuves, Programmes et Systèmes

A Language for Differential Privacy

Marco Gaboardi, University of Pennsylvania, United States

May 16, 2013, at 10:30 a.m., room 1007, Sophie Germain building

Abstract: Differential privacy offers a way to answer queries about sensitive information while providing strong, provable privacy guarantees. Several tools have been developed for certifying that a given query is differentially private. In this talk, I will present one approach, based on a functional programming language named Fuzz, that we are developing at the University of Pennsylvania. One of the most common mechanisms for turning a (possibly privacy-leaking) query into a differentially private one consists of adding noise to the query's result. To ensure the privacy guarantee, the noise must be proportional to the query's sensitivity: how much the query's result can change in response to changes in its input. Fuzz uses linear indexed types and lightweight dependent types to express a rich sensitivity analysis, and a probability monad to express randomized computation. The soundness of Fuzz guarantees that any program with a certain type is differentially private, so Fuzz can be used to certify a broad class of algorithms as differentially private. Moreover, its type analysis can be used more generally to perform sensitivity checking and sensitivity inference.
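To make the noise-adding mechanism concrete, the following is a minimal Haskell sketch of the standard Laplace mechanism the abstract alludes to: noise drawn from a Laplace distribution, with scale proportional to the query's sensitivity and inversely proportional to the privacy parameter epsilon. It illustrates the general technique only; it is not code from Fuzz, and the names laplace and releasePrivately are hypothetical.

import System.Random (randomRIO)

-- Sample from the Laplace distribution with mean 0 and scale b,
-- via inverse transform sampling from a uniform variate.
laplace :: Double -> IO Double
laplace b = do
  u <- randomRIO (-0.5, 0.5)  -- uniform on (-1/2, 1/2)
  return (negate b * signum u * log (1 - 2 * abs u))

-- Release a numeric query's result with epsilon-differential privacy.
-- The noise scale is sensitivity / epsilon: more sensitive queries,
-- or stronger privacy demands, require more noise.
releasePrivately :: Double -> Double -> (db -> Double) -> db -> IO Double
releasePrivately sensitivity epsilon query database = do
  noise <- laplace (sensitivity / epsilon)
  return (query database + noise)

For example, a counting query has sensitivity 1 (adding or removing one record changes the count by at most 1), so its result could be released as releasePrivately 1.0 epsilon countQuery database. Fuzz's contribution, as described above, is that its type system establishes such sensitivity bounds statically rather than trusting the programmer to supply them.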

Paper presented at POPL 2013