Formalization of Differential Privacy in Isabelle/HOL
Differential privacy is a statistical definition of privacy that has attracted the interest of both academia and industry. Its formulations are easy to understand, but whether a database program satisfies differential privacy is difficult to determine. One reason is that small changes to a database program can break its differential privacy. Therefore, formal verification of differential privacy has been studied for over a decade.
In this paper, we propose an Isabelle/HOL library for formalizing differential privacy in a general setting. To our knowledge, it is the first formalization of differential privacy that supports continuous probability distributions. First, we formalize the standard definition of differential privacy and its basic properties. Second, we formalize the Laplace mechanism and its differential privacy. Finally, we formalize the differential privacy of the report noisy max mechanism.
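The abstract refers to the standard definition of differential privacy and to the Laplace mechanism. For reference, these are the usual textbook formulations (stated here for orientation; they are not drawn from the paper's Isabelle development): a randomized mechanism $M$ is $(\varepsilon, \delta)$-differentially private if, for all adjacent databases $x \sim y$ and all measurable sets $S$ of outputs,
\[
  \Pr[M(x) \in S] \;\le\; e^{\varepsilon}\,\Pr[M(y) \in S] + \delta .
\]
The Laplace mechanism releases $f(x) + \mathrm{Lap}(\Delta f / \varepsilon)$, where $\Delta f$ is the sensitivity of the query $f$; it satisfies $(\varepsilon, 0)$-differential privacy. Note that the Laplace distribution is continuous, which is why support for continuous probability distributions matters for this formalization.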
This program is tentative and subject to change.
Tue 21 Jan (displayed time zone: Mountain Time, US & Canada)
16:00 - 17:30

16:00  30m Talk  Formalization of Differential Privacy in Isabelle/HOL (CPP)

16:30  30m Talk  A CHERI C Memory Model for Verified Temporal Safety (CPP)
       Vadim Zaliva (University of Cambridge, UK), Kayvan Memarian (University of Cambridge), Brian Campbell (University of Edinburgh), Ricardo Almeida (University of Edinburgh), Nathaniel Filardo (University of Cambridge), Ian Stark (The University of Edinburgh), Peter Sewell (University of Cambridge)

17:00  30m Talk  Formalizing the One-way to Hiding Theorem (CPP)