Differential Privacy for String Algorithms and Data Structures
Project description
The more data we collect every day, the more pressing the need for tools that allow us to analyze those data efficiently and ethically. This includes protecting the privacy of the individuals whose data are being analyzed. However, it is not always obvious what it means to protect someone’s privacy: even if names and other clearly identifying attributes are removed from a database, a resourceful adversary can still find ways of uncovering personal data. The rapidly growing research field of differential privacy tackles this challenge with a rigorous, quantifiable privacy definition, which requires that any information released about a database may not depend too much on any single individual’s data.
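To make the guarantee above concrete, a minimal sketch of the classic Laplace mechanism applied to a simple string query (the function names and the example query are illustrative, not part of the project): counting how many strings in a dataset contain a given pattern has sensitivity 1 when each individual contributes one string, so adding Laplace noise with scale 1/ε to the true count satisfies ε-differential privacy.

```python
import math
import random

def laplace_noise(scale, rng):
    """Sample Laplace(0, scale) noise via the inverse-CDF transform."""
    u = rng.random() - 0.5  # uniform on [-0.5, 0.5)
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def private_pattern_count(strings, pattern, epsilon, rng=None):
    """Release an epsilon-DP estimate of how many strings contain `pattern`.

    Assumes each individual contributes exactly one string, so changing one
    individual's data changes the true count by at most 1 (sensitivity 1),
    and Laplace(1/epsilon) noise suffices for epsilon-DP.
    """
    rng = rng or random.Random()
    true_count = sum(pattern in s for s in strings)
    return true_count + laplace_noise(1.0 / epsilon, rng)
```

Smaller ε means stronger privacy but noisier answers; this accuracy/privacy trade-off is exactly the kind of tension the project studies for richer string queries.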
The aim of this project is to develop tools for privately analyzing strings (sequential data) and to explore what is and is not theoretically possible in this setting.