Abstract: Differential privacy offers a way to answer queries about sensitive information while providing strong, provable privacy guarantees. Several tools have been developed for certifying that a given query is differentially private. In this talk, I will present one approach, based on a functional programming language named Fuzz, which we are developing at the University of Pennsylvania. One of the most common mechanisms for turning a (possibly privacy-leaking) query into a differentially private one consists of adding noise to the result of the query. To ensure the privacy guarantee, the noise must be proportional to the query's sensitivity: how much the result of the query can change in response to changes in its input. Fuzz uses linear indexed types and lightweight dependent types to express a rich sensitivity analysis, and a probability monad to express randomized computation. The soundness of Fuzz guarantees that any program with a certain type is differentially private. Fuzz can therefore be used to certify a broad class of algorithms as differentially private. Moreover, its type analysis can be used more generally to perform sensitivity checking and sensitivity inference.
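
To make the noise-adding mechanism concrete, here is a minimal Haskell sketch of the idea described above: a numeric query of known sensitivity is privatized by adding Laplace noise with scale sensitivity/epsilon. This is an illustration of the general mechanism, not Fuzz code or code from the talk; the names laplace and privatize, and the parameters sensitivity and epsilon, are hypothetical.

import System.Random (randomRIO)

-- Sample from the Laplace distribution with mean 0 and the given scale,
-- by inverse transform sampling from a uniform draw in (-1/2, 1/2).
laplace :: Double -> IO Double
laplace scale = do
  u <- randomRIO (-0.5, 0.5)
  pure (negate (scale * signum u * log (1 - 2 * abs u)))

-- Privatize a numeric query (illustrative): release the true result plus
-- noise whose scale is proportional to sensitivity / epsilon, as required
-- for the differential privacy guarantee.
privatize :: (db -> Double)  -- the query
          -> Double          -- its sensitivity
          -> Double          -- the privacy parameter epsilon
          -> db -> IO Double
privatize query sensitivity epsilon input = do
  noise <- laplace (sensitivity / epsilon)
  pure (query input + noise)

What Fuzz adds on top of a sketch like this is static certification: its linear indexed types track the sensitivity of the query, so the type checker, rather than the programmer, guarantees that the noise scale matches the sensitivity.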