Privacy in the age of big data

This article was originally published in The Asian Age.

Personal data is freely accessible, shared and even sold, and those to whom this information belongs have little control over its flow.

In 2011, it was estimated that the quantity of data produced globally had surpassed 1.8 zettabytes. By 2013, it had increased to 4 zettabytes. This growth is driven by digital services that leave constant data trails behind everyday human activity. The expansion in the volume, velocity and variety of data available, together with the development of innovative forms of statistical analytics on the data collected, is generally referred to as “Big Data”. Despite the significant (though largely unrealised) promises made on behalf of Big Data, ranging from improved decision-making, increased efficiency and productivity to greater personalisation of services, concerns remain about the impact of such datafication of all human activity on an individual’s privacy.

Privacy has evolved into a sweeping concept, including within its scope matters pertaining to control over one’s body, physical space in one’s home, protection from surveillance and from search and seizure, and the protection of one’s reputation as well as one’s thoughts. This generalised and vague conception of privacy not only invites great judicial discretion, it also thwarts a clear understanding of the subject. Robert Post called privacy a concept so complex and “entangled in competing and contradictory dimensions, so engorged with various and distinct meanings”, that he sometimes “despairs whether it can be usefully addressed at all”.

This also leaves the idea of privacy vulnerable to considerable suspicion and ridicule. However, while there is a lack of clarity over the exact contours of what constitutes privacy, there is general agreement over its fundamental importance to our ability to lead whole lives. In order to understand the impact of datafied societies on privacy, it is important to first delve into the manner in which we exercise it. The prevalent ideas of privacy and data management can be traced to the Fair Information Practice Principles (FIPPs). These principles are the forerunners of most privacy regimes internationally, such as the OECD Privacy Guidelines, the APEC Privacy Framework and the nine National Privacy Principles articulated by the Justice A.P. Shah Committee report. All of these frameworks rest on rights of notice, consent and correction, and on limits on how data may be used. This makes the data subject the decision-making agent who determines where and when her/his personal data may be used, by whom, and in what way. The individual needs to be notified and his consent obtained before his personal data is used. If the scope of usage extends beyond what he has agreed to, fresh consent will be required for the expanded scope.

In theory, this system sounds fair. Privacy is a value tied to the personal liberty and dignity of an individual, so it is only appropriate that the individual should be the one holding the reins and taking the major decisions about the use of his personal data. This empowers the individual and allows him to weigh his own interests in exercising his consent. The allure of this paradigm is that in one elegant stroke, it seeks to ensure that consent is informed and free while also implementing an acceptable trade-off between privacy and competing concerns. This approach worked well when data collectors were fewer and the uses of data were narrower and better defined. Today’s infinitely complex and labyrinthine data ecosystem, however, is beyond the comprehension of most ordinary users. Despite a growing willingness to share information online, most people have no understanding of what happens to their data.

The quantity of data being generated is expanding at an exponential rate. From smartphones and televisions, trains and airplanes, sensor-equipped buildings and even the infrastructure of our cities, data now streams constantly from almost every sector and function of daily life, “creating countless new digital puddles, lakes, tributaries and oceans of information”. The inadequacy of existing regulatory approaches and the absence of a comprehensive data protection regulation are exacerbated by the emergence of data-driven business models in the private sector and the adoption of a data-driven governance approach by the government. The Aadhaar project, with over a billion registrants, is intended to act as a platform for a number of digital services, all of which produce enormous troves of data. The original press release by the Central Government announcing the Cabinet’s approval of the Digital India programme speaks of a “cradle to grave” digital identity as one of its vision areas.

While the very idea of the government wanting to track its citizens’ lives from cradle to grave is creepy enough in itself, let us examine for a minute what this form of datafied surveillance will entail. A host of schemes under Digital India will collect and store information through the life cycle of an individual. The result is a set of databases on individuals which, when combined, will provide a 360-degree view into their lives. Alongside this, India Stack, a set of APIs built on top of Aadhaar, conceptualised by iSPIRT, a consortium of select Indian IT companies, and to be deployed and managed by several agencies, including the National Payments Corporation of India, promises to provide a platform over which different private players can build their applications.

The sum of these interconnected parts will lead to a complete loss of anonymity and greater surveillance, and will impact free speech and individual choice. The move towards a cashless economy, with sharp nudges from the government, could lead to a loss of financial agency in the event of technological failure, as has been the case in experiments with digital payments in Africa. The lack of regulation in emerging data-driven sectors such as fintech can enable predatory practices, where the right to remotely deny financial services is effectively handed to private sector companies. An architecture such as India Stack datafies financial transactions, producing linked and structured data that allows continued use of the transaction records collected. It is important to recognise that at the stage of giving consent, there are too many unknowns for us to make informed decisions about the future uses of our personal data. Even where blanket approvals for any kind of use are granted contractually through terms of use and privacy policies, there should be legal obligations that override such consent and require renewed consent for certain kinds of uses.

 
