We propose a novel programming framework and system, ϵktelo, for implementing both existing and new privacy algorithms. For the task of answering linear counting queries, we show that nearly all existing algorithms can be composed from operators, each conforming to one of a small number of operator classes. While past programming frameworks have helped ensure the privacy of programs, the novelty of our framework is its significant support for authoring accurate and efficient (as well as private) programs. After describing the design and architecture of the ϵktelo system, we show that ϵktelo is expressive, allows for safer implementations through code reuse, and enables both privacy novices and experts to easily design algorithms. We demonstrate the use of ϵktelo by designing several new state-of-the-art algorithms.
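The operator-composition idea can be illustrated with a small, hypothetical pipeline in the spirit of ϵktelo's operator classes (transformation, query selection, measurement, inference). The function names and signatures below are illustrative assumptions, not ϵktelo's actual API:

```python
import numpy as np

# Hypothetical operators, one per class; names are illustrative only.

def transform_vectorize(data, bins):
    # Transformation: reduce raw records to a histogram vector x.
    return np.histogram(data, bins=bins)[0].astype(float)

def select_identity(n):
    # Query selection: choose the identity workload (one count per bin).
    return np.eye(n)

def measure_laplace(W, x, epsilon, rng):
    # Measurement: answer W @ x with Laplace noise scaled to the
    # workload's L1 sensitivity (max absolute column sum).
    sens = np.abs(W).sum(axis=0).max()
    return W @ x + rng.laplace(scale=sens / epsilon, size=W.shape[0])

def infer_least_squares(W, y):
    # Inference: estimate x from the noisy answers by least squares.
    return np.linalg.lstsq(W, y, rcond=None)[0]
```

A plan then simply chains the operators: vectorize the data, select a workload, measure it privately, and reconstruct an estimate by least squares; swapping in a different selection or inference operator yields a different algorithm from the same parts.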
Differential privacy has become a primary standard for protecting individual data while supporting flexible data analysis. Despite the adaptation of differential privacy to a wide variety of applications and tasks, visualizing the output of differentially private algorithms has rarely been considered. Visualization is one of the primary means by which humans understand and explore an unknown dataset, and supporting it is therefore an important goal in advancing the practical adoption of differential privacy. In this work on private data visualization, we explore key challenges and propose solution approaches. We use two-dimensional location data as an example domain and consider the challenges of plotting noisy output, the impact of visual artifacts caused by noise, and the proper way to present known uncertainty about private output.
Differentially private algorithms are becoming increasingly complex, and in particular, the performance of many emerging algorithms is data dependent, meaning the distribution of the noise added to query answers may change depending on the input data. We propose a set of evaluation principles that we argue are essential for sound evaluation. Based on these principles, we propose DPBENCH, a novel framework for the standardized evaluation of privacy algorithms. We then apply our benchmark to evaluate algorithms for answering 1- and 2-dimensional range queries. The result is a thorough empirical study of 15 published algorithms on a total of 27 datasets that offers new insights into algorithm behavior (in particular, the influence of dataset scale and shape) and a more complete characterization of the state of the art. Our methodology is able to resolve inconsistencies in prior empirical studies and place algorithm performance in context through comparison to simple baselines. Finally, we pose open research questions that we hope will guide future algorithm design.
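Concretely, benchmarking range-query algorithms requires a query workload and an error metric that is comparable across datasets. Below is a minimal sketch of both; the all-ranges workload and the scale-normalized error are plausible illustrations, not necessarily the exact definitions used by DPBENCH:

```python
import numpy as np

def range_workload(n):
    # All 1-D range queries [i, j] over n bins, as rows of a 0/1 matrix.
    rows = [np.concatenate([np.zeros(i), np.ones(j - i + 1), np.zeros(n - j - 1)])
            for i in range(n) for j in range(i, n)]
    return np.vstack(rows)

def per_query_error(W, x, x_hat):
    # Average absolute error over the workload, divided by dataset scale
    # (total count) so that errors are comparable across datasets of
    # different size.
    return np.mean(np.abs(W @ (x - x_hat))) / x.sum()
```

With such a metric, a data-dependent algorithm's estimate `x_hat` can be scored against the true histogram `x` and placed in context against a simple Laplace baseline on the same workload.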
Auto wipeout is a library containing a Firebase Cloud Function, triggered by account deletion (an Auth.delete event), that wipes out all data in the Firebase Realtime Database belonging to the deleted user. Compliance with privacy regulations requires developers to ensure that a user's data is deleted when they delete their account. To determine which data should be deleted, the Cloud Function analyzes the app's Security Rules and identifies any data that can only be written by that particular user. (Work done while interning at Google.)
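The rule analysis can be sketched as follows. This is a simplified, hypothetical version of the check, operating on the parsed rules JSON; real Security Rules allow far richer `.write` expressions than the single pattern matched here:

```python
def user_owned_paths(rules, path=()):
    # Recursively scan a (simplified) Security Rules dict for paths whose
    # ".write" rule restricts writes to a single user via a $uid-style
    # wildcard. `rules` is assumed to be the parsed JSON under the
    # top-level "rules" key.
    owned = []
    for key, val in rules.items():
        if not isinstance(val, dict):
            continue  # skip rule strings like ".write": "..."
        write = val.get(".write", "")
        # Simplified check: a wildcard key whose write rule compares
        # that wildcard to auth.uid marks a user-owned subtree.
        if key.startswith("$") and "auth.uid" in write and key in write:
            owned.append("/".join(path + (key,)))
        owned.extend(user_owned_paths(val, path + (key,)))
    return owned
```

For example, under rules like `{"users": {"$uid": {".write": "$uid === auth.uid"}}}`, the scan reports `users/$uid` as user-owned, and the Cloud Function would delete that subtree (with the wildcard bound to the deleted user's uid) when the Auth.delete event fires.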