Differential Privacy for User Data
The Problem:
Our client is a technology company holding a large amount of user data. With cybersecurity incidents increasingly common, the client had put anonymization procedures in place as a precaution against a breach, such as suppression of personally identifiable information (PII). Because many techniques exist for de-anonymizing user data, we were engaged to assess the effectiveness of these procedures, identify any vulnerabilities, and propose improvements.
The Data:
The client's database included information on over 300 million users around the world: demographics and other fixed attributes, as well as behavioral data correlated with real-world travel. We were given temporary access to both the anonymized data and the associated PII.
The Solution:
We identified several attack vectors that would allow a malicious actor to de-anonymize users from the anonymized records, and we suggested additional privacy techniques that mitigated the risks to acceptable levels. Due to the sensitive nature of this problem and its solutions, we are unable to discuss the work in detail, but we are happy to share more about differential privacy theory and its applications.
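While the client-specific work is confidential, the core idea behind differential privacy is public and simple to illustrate. The sketch below shows the classic Laplace mechanism applied to a count query: because adding or removing a single user changes a count by at most 1 (sensitivity 1), adding Laplace noise with scale 1/ε makes the released answer ε-differentially private. This is a minimal illustration, not the client's implementation; the function names and parameters are our own.

```python
import math
import random

def laplace_noise(scale: float) -> float:
    """Sample from Laplace(0, scale) via the inverse-CDF transform."""
    u = random.random() - 0.5  # uniform on [-0.5, 0.5)
    sign = 1.0 if u >= 0 else -1.0
    return -scale * sign * math.log(1.0 - 2.0 * abs(u))

def dp_count(records, predicate, epsilon: float) -> float:
    """Answer a count query with epsilon-differential privacy.

    A count query has sensitivity 1, so Laplace noise with
    scale 1/epsilon is sufficient to satisfy the guarantee.
    """
    true_count = sum(1 for r in records if predicate(r))
    return true_count + laplace_noise(1.0 / epsilon)
```

Smaller values of ε give stronger privacy at the cost of noisier answers; repeated queries consume privacy budget additively, which is why real deployments track cumulative ε across all released statistics.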
The Impact:
The client was able to quickly implement our suggested improvements, mitigating the risk of user de-anonymization in the event of a data breach.