The more we think about data privacy, the more we realize how complex it truly is. Most of the changes we’re making to our systems and processes to comply with GDPR, and now the CCPA, revolve around building in features that protect the consumer rights established within these privacy regulation frameworks.
From both the technical and the legal side, exciting new capabilities are emerging that open up use cases we haven’t even considered. Several topics discussed at the RSA 2020 Conference focused on these new and emerging services, and on how they must be viewed through the lens of personal privacy.
What is the purpose of privacy?
For a person walking down a public street, why is facial recognition technology considered a privacy concern? Or for a person posting public pictures, why is their profile picture considered private?
For one, the collection of images by street cameras identifies a person at a specific street corner at a specific moment in time. Why is that a sensitive issue when we walk around with our faces exposed all the time?
One central theme of privacy is the issue of correlation. In other words, who should have access to data that tells them I drove past a street camera at one o’clock in the morning? Or better yet, who should know that I was tagged in a social media photo drinking alcohol at a particular party mere minutes before I was driving that car?
These data points can clearly be extremely useful for law enforcement for catching drunk drivers—or even for recovering abduction victims!—but they can also be a treasure trove of individual behaviors that can ultimately be correlated and reconstructed to essentially invade our right to reasonable assurance of privacy.
Promising technology or privacy test?
Technology providers and governments are engaged in a land-grab, using advances in machine learning and artificial intelligence to correlate as much information as they can about the citizenry—all with the best intentions, but not always with the best privacy results. With so much at stake, data privacy cannot continue to be taken lightly by any organization. It’s therefore imperative that we approach our use of personal data first with a level head about why we are collecting it, and second with a keen eye for how technology can best address these privacy challenges.
Recent breaches of sensitive client information at Clearview AI exposed inadequate security controls at an organization that provides facial recognition services to law enforcement worldwide, all while making questionable use of more than three billion images scraped from our own social media posts and feeds.
Similarly, without proper legal guardrails, genomic research intended to help us learn a bit about our ancestry or even solve murders could easily lead to employment, healthcare, or cultural discrimination against individuals, or even entire populations, based on their shared genetic markers.
One exciting technology—some have even called it “transformative”—is homomorphic encryption, and I think it is certainly worthy of further research. Homomorphic encryption is still relatively new, but for research applications it lets us take private data, process it through an external system for statistical correlation and other CPU-intensive evaluation functions, and return meaningful results without ever exposing the underlying PII outside the original system. Because the computations operate directly on ciphertext, the data remains encrypted end to end, enabling statistically sound analysis while protecting the underlying records.
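To make the idea concrete, here is a minimal sketch of the principle using the Paillier cryptosystem, which is additively (not fully) homomorphic: multiplying two ciphertexts yields a ciphertext of the sum of the plaintexts, so a third party can add values it never sees in the clear. This is a toy illustration with deliberately tiny hard-coded primes; real deployments use vetted libraries and 2048-bit or larger keys.

```python
import random
from math import gcd

# Toy Paillier cryptosystem (additively homomorphic).
# DEMO ONLY: tiny hard-coded primes; never use key sizes like this in practice.
p, q = 293, 433            # small primes for illustration
n = p * q                  # public modulus
n_sq = n * n
g = n + 1                  # standard simple choice of generator
lam = (p - 1) * (q - 1)    # phi(n); works as the private exponent when g = n + 1
mu = pow(lam, -1, n)       # modular inverse of lam mod n (Python 3.8+)

def encrypt(m: int) -> int:
    """Encrypt plaintext m (0 <= m < n) with fresh randomness r."""
    while True:
        r = random.randrange(1, n)
        if gcd(r, n) == 1:
            break
    return (pow(g, m, n_sq) * pow(r, n, n_sq)) % n_sq

def decrypt(c: int) -> int:
    """Recover the plaintext: L(c^lam mod n^2) * mu mod n, where L(x) = (x-1)/n."""
    u = pow(c, lam, n_sq)
    return (((u - 1) // n) * mu) % n

# Homomorphic property: multiplying ciphertexts adds the plaintexts.
c1, c2 = encrypt(17), encrypt(25)
total = decrypt((c1 * c2) % n_sq)   # 17 + 25 computed without decrypting either input
print(total)                        # -> 42
```

The key point is the last two lines: the party holding `c1` and `c2` can compute an encryption of their sum without the private key, which is exactly the property that lets an external system run aggregate statistics on data it can never read.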
What types of data privacy concerns are you looking at in your organization? If you’re considering different encryption technologies, ControlScan can help you think through the best solution to meet your goals. Simply complete the “Request Information” form on this page and one of our experts will reach out to follow up.