Lessons for India on Data Protection
The glaring misuse of data in the Facebook-Cambridge Analytica case is unfortunate, but there is a lot to learn from the revelations. The lessons are also timely, given the ongoing deliberations of the Expert Committee on Data Protection recently constituted by the Government of India.
Here, I outline four questions that the Expert Committee should consider in light of the incident:
What types of data were misused?
According to a data set accessed by the New York Times, Dr. Aleksandr Kogan shared large volumes of data with Cambridge Analytica, including users’ identities, friend networks and “likes”. But it is still not clear exactly what types of data were compromised.
To ensure that all types of data are covered under the proposed law, the Expert Committee will have to look beyond narrow categories and suggest a comprehensive ‘data classification’ framework — a taxonomy based on specific parameters such as the nature, source and sensitivity of data. This will not be an easy task.
The first challenge is the sheer volume and complexity of the data available today, which makes it difficult to classify into neat categories. In theory, the law could designate certain types of data as ‘sensitive’ (e.g. bank details and passwords). But, as I learned when I downloaded a copy of my entire Facebook history, sensitivity is contextual. My ‘Facebook history’ contained private conversations with friends and family, photos sent to romantic partners, online communities I had participated in, events I had attended, books and movies that had influenced my life… The list was endless. What could this reveal about me? And who decides what is ‘sensitive’?
The second is the ‘Big Data’ challenge. With new data analytics and aggregation techniques, it is not data in isolation, but the combining and repurposing of data, that creates risk and anxiety. In fact, it was access to large data sets and powerful analytics engines that enabled Dr. Kogan and Cambridge Analytica to build detailed ‘psychographic profiles’ of Facebook users and infer intimate attributes about them (e.g. their political opinions).
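The aggregation risk can be illustrated with a toy sketch. All names and data below are hypothetical: neither dataset is especially revealing on its own, but joining them yields an inferred profile of a user who never disclosed these attributes directly.

```python
# Hypothetical, minimal illustration of the aggregation risk:
# combining two innocuous datasets supports sensitive inferences.

# Dataset 1: page "likes" collected by a quiz app (invented data).
likes = {
    "user_42": {"PageA_Politics", "PageB_Outdoors"},
    "user_77": {"PageC_Cooking"},
}

# Dataset 2: a separately compiled lookup mapping pages to inferred
# traits (also invented; stands in for a trained trait model).
page_traits = {
    "PageA_Politics": "political leaning: X",
    "PageB_Outdoors": "lifestyle: hiker",
    "PageC_Cooking": "hobby: cooking",
}

def profile(user_id: str) -> list[str]:
    """Join the two datasets to build a crude 'psychographic' profile."""
    pages = likes.get(user_id, set())
    return sorted(page_traits[p] for p in pages if p in page_traits)
```

Here `profile("user_42")` returns both a lifestyle tag and a political leaning, even though the user only ever ‘liked’ two pages — which is the essence of why repurposed, combined data demands stronger protection than either dataset would in isolation.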
The Expert Committee will also have to consider entirely new types of data sets, which are being generated by machines on behalf of individuals. With Facebook’s foray into hardware devices and the rise of the Internet of Things, data ownership and privacy issues involving wearable devices, remote sensors and smart homes will need to be addressed. This data may not always be personally identifiable, but could reveal trends and patterns on a much larger scale. Unless adequate provisions are made in the law, the next data misuse incident could impact not only individuals but entire communities, and even nation states.
Were users aware of what was happening?
The backlash from Facebook’s users makes it clear that the ‘consent framework’ is broken. Even though users had ‘accepted’ the permissions sought by Dr. Kogan’s app, it appears that consent was not freely obtained, given the information asymmetry and lack of real choices for users (Facebook introduced new privacy controls after the incident).
The Expert Committee will have to recommend a threshold for ‘informed consent’ — the specific requirements that must be satisfied for consent to constitute a valid agreement. While existing models such as the EU’s GDPR can be drawn upon, the law should also focus on design principles that safeguard individual autonomy. Adequate incentives should be built in so that companies like Facebook place user privacy at the centre of all product development cycles. After all, ‘code is law’.
Another issue that must be addressed is what I call ‘vicarious consent’ — a situation in which an individual grants consent on behalf of someone else to use their data. Vicarious consent is what allows Truecaller to store your mobile number even though you haven’t installed the app, what lets Tinder display your photo to strangers because you are a ‘mutual friend’, and what enabled Dr. Kogan’s app to harvest the data of more than half a million Facebook users in India, even though only 335 of them actually installed the app.
Worryingly, Facebook did not inform affected users even after it became aware of the incident in 2015. The Expert Committee will have to recommend mechanisms for notifying users about data breaches, including incidents such as this one, so that they can make informed choices about how and when their personal data is used.
What rights and remedies should be available to users?
Under existing law, users in India would have little recourse against Facebook or Cambridge Analytica. Fortunately, the government has stepped in to ask the tough questions on behalf of its citizens. Looking forward, however, the Expert Committee must ensure that users are afforded a bundle of rights that can be exercised at will.
Users should have the right to seek information about how and why their data is being used. Reports suggest that Cambridge Analytica used the Facebook data to develop targeted advertisements designed to influence users’ preferences and behaviours. To address such situations, the Expert Committee must recommend a comprehensive ‘right to an explanation’ and strong rules on algorithmic transparency.
Users should also have more control over their data, including the right to restrict its processing, transfer it to another platform, or delete it entirely. This will raise a host of issues, including free speech restrictions and interoperability standards. But granting these rights will provide the backbone for a nation of digitally empowered citizens.
The Expert Committee should also look beyond individual ‘users’, and focus on community and sovereign rights. Today, communities large and small use digital platforms to engage publicly and privately. The law should ensure that data generated by such communities is not misused. Similarly, as India’s digital population surpasses that of most countries, it is imperative that the Government of India has the requisite powers to safeguard its security and integrity and tackle such incidents in the future.
Who should be held responsible?
If anything, this incident proves just how difficult it is to hold parties responsible for their actions. Between Dr. Kogan, Cambridge Analytica and Facebook, there is a lot of blame to go around. But if we are to bring such incidents to closure, the Expert Committee will have to develop new rules on allocation of liability between parties.
Some countries place the primary responsibility on the ‘data controller’ — that is, the party that determines the purposes and means of data processing. But in a situation where both Dr. Kogan and Cambridge Analytica could be considered independent data controllers, not to mention Facebook itself, the Expert Committee will need to provide additional guidance on whom to hold liable.
Since most arrangements for data processing are entered into contractually, often involving a multitude of downstream entities and a complex web of relationships, regulators may find it difficult to establish a clear chain of responsibility. In such situations, there is a need for greater transparency, along with tight controls on the use of APIs and data access rights.
Besides this, the Expert Committee will have to suggest methods to promote internal accountability and enforcement capabilities. Evidence suggests that Facebook did not conduct the necessary data audits or evaluate the potential harm to users, so the role of Privacy Impact Assessments should be studied carefully. External supervision will also be necessary, in the form of strong regulatory institutions that enforce the principle of accountability and prevent future mishaps.
If not, we will just have to make do with apologies.
(This article was originally published on The Wire on May 31, 2018)