Article

BIG Data: A Challenge to Data Protection?

Dr. Subhajit Basu and Rhyea Malik assess the capacity of the existing Indian regulatory framework to withstand the challenges posed by pervasive data collection and big data processing.

INTRODUCTION

‘Big data’ in India is set to get ‘bigger’ with the recent launch of Reliance Jio. Through Jio, Reliance is targeting mid-to-low end customers and is striving to digitize millions in rural India by providing them with data connectivity at low prices. As customers queue up to join Jio, the rise in digital adoption is expected to result in greater content consumption and more digital transactions. This upsurge in data usage would yield an exponential increase in the quantity of big data generated, and Jio intends to maximize growth and competitiveness by exploiting that big data.

There is no consensus on the definition of ‘big data’; however, it is largely accepted that the significance of big data lies in its sheer volume as it ubiquitously flows from a vast array of sources such as Facebook posts, tweets, clickstreams, online transactions, emails, uploaded images, cookies, and the internet of things, including smart watches, smart gear, smart lighting, and the like. Transcending mere records of internet usage, big data encompasses data created in real space, collected in real time, and pertaining to highly personal, sensitive behavioural patterns such as habits, likes, and dislikes, as well as travel, movements, and health statistics, among others.

Accessing the previously untapped big data of India is ostensibly Reliance’s first step toward future wide-scale data mining. Data mining of big data is ‘big business’ because it allows businesses to uncover patterns and trends within the data. The purpose of data mining is to obtain a greater understanding of consumer behaviour, which enables businesses to accurately predict the purchasing habits of their consumers and streamline their operations. Beyond market predictions and informed business strategies, big data mining also empowers businesses to produce targeted advertising reflecting customer preferences.

Data mining of big data does facilitate more efficient use of resources and streamlined services, but it sometimes comes at the cost of people’s privacy. We argue that the benefits to be garnered from big data cannot simply be traded against privacy rights. Consequently, there is a demand for stronger regulatory norms to govern the collection, storage, transmission, and usage of big data. We posit that there are three essential elements for the sustainable development of big data: transparency in personal data processing; robust user control over how their data is used; and the establishment of a comprehensive data protection framework.

DATA PROTECTION FRAMEWORK

India has no explicit statute relating solely to data protection and privacy. However, in the context of digital data processing, certain aspects of data protection are covered under the Information Technology Act, 2000 (‘IT Act’) and the Information Technology (Reasonable Security Practices and Procedures and Sensitive Personal Data or Information) Rules, 2011 (‘Data Protection Rules’).

Scope

Government bodies and individuals engaged in data processing are not covered under the Data Protection Rules. Only companies, firms, sole proprietorships, and other associations of individuals engaged in commercial or professional activities (collectively, ‘body corporate’) fall under the purview of these Rules.1

In the context of data protection measures for data processing, the Rules distinguish between ‘sensitive personal data or information’ and other ‘personal information.’ ‘Personal information’ comprises information relating to a natural person which, in combination with other information, is directly or indirectly capable of identifying that person,2 and within its ambit lies the narrower subset of ‘sensitive personal data or information,’ which exhaustively covers passwords, financial information, health conditions, sexual orientation, medical records and history, and biometric information.3

The key limitation of the Data Protection Rules is that ‘personal information’ is confined to information capable of identifying a particular person. Information of a personal nature pertaining to other persons that is knowingly or unknowingly captured in the background, as in the case of the internet of things, is not covered by these Rules. The limited scope of ‘sensitive personal data or information’ is an additional limitation. As technology such as the internet of things facilitates ubiquitous data collection, other sensitive personal information, such as location, habits, and activity, should also be brought within the purview of these Rules.

1. Explanation (i) to Section 43A, Information Technology Act, 2000
2. Rule 2(1)(i), Data Protection Rules
3. Rule 3, Data Protection Rules

With respect to the anonymization of data, it can be argued that if the collected data is de-identified, for instance by stripping or encrypting identifiers so that it is no longer capable of revealing identities, then the de-identified data would not qualify as personal information and, as such, would not fall under the Data Protection Rules. However, the Data Protection Rules encompass such ‘personal information’ which, although not directly capable of identifying a person, may do so in combination with other information available or likely to be available with a body corporate.4 In the context of big data, this implies that even if data collectors anonymize individual datasets obtained from disparate sources so that the person to whom the data corresponds cannot be identified, but the identity of the person can be revealed upon aggregation of this data, then both the anonymized individual datasets and the aggregated data would be classified as personal information under the Data Protection Rules.
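
To make this re-identification risk concrete, the following minimal sketch, written in Python with entirely invented data, shows how two datasets that are each ‘anonymized’ (neither links a name to the sensitive attribute) can still identify individuals once they are joined on shared quasi-identifiers such as PIN code, birth year, and gender. The datasets, column names, and values are illustrative assumptions only.

```python
# Hypothetical illustration: joining two "anonymized" datasets on quasi-identifiers
# re-attaches names to sensitive attributes, so the aggregated data identifies people
# even though neither input did on its own. All data below is invented.

import pandas as pd

# Dataset A: a de-identified health dataset released by one data collector
health = pd.DataFrame({
    "pin_code":   ["110001", "110001", "560034"],
    "birth_year": [1984, 1991, 1984],
    "gender":     ["F", "M", "F"],
    "diagnosis":  ["diabetes", "asthma", "hypertension"],
})

# Dataset B: a separate dataset (e.g. a customer list) holding names alongside
# the same quasi-identifiers
customers = pd.DataFrame({
    "name":       ["A. Sharma", "R. Gupta", "S. Iyer"],
    "pin_code":   ["110001", "110001", "560034"],
    "birth_year": [1984, 1991, 1984],
    "gender":     ["F", "M", "F"],
})

# Aggregation: the join on quasi-identifiers re-identifies each person
reidentified = health.merge(customers, on=["pin_code", "birth_year", "gender"])
print(reidentified[["name", "diagnosis"]])
```

On the reading of Rule 2(1)(i) set out above, both the individual datasets and the joined result would amount to ‘personal information,’ since the combination is capable of identifying the persons concerned.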

Privacy Policy

Rule 4 of the Data Protection Rules requires every body corporate across the chain of data processing that collects, stores, or otherwise deals with or handles ‘personal information’ to publish a privacy policy on its website. The privacy policy must clearly delineate the body corporate’s data processing practices, the type of personal information collected, the purpose of its collection and usage, the details of any disclosure to third parties, and the reasonable security practices and procedures adopted.5

Despite creating additional transparency, a privacy policy does little to actually prevent misuse of data. Still, it is critical to note that Rule 4 creates a special obligation upon the body corporate to ensure that its privacy policy is available for view to individuals who have provided information in furtherance of a lawful contract. The distinction between information collected under a contract and information otherwise obtained seems to ascribe a higher threshold of protection to data that is contractually obtained. This interpretation is supported by Section 72A of the IT Act, which establishes penal liability for persons, including intermediaries, for any wrongful disclosure of personal information secured while providing services under the terms of a lawful contract. By contrast, Section 43A of the IT Act provides merely compensatory liability in case of any wrongful loss or wrongful gain arising from the negligence of a body corporate in implementing and maintaining reasonable security practices and procedures for sensitive personal data or information.

4. Rule 2(1)(i), Data Protection Rules
5. Rule 4, Data Protection Rules

Consent & Notice

Rules 5, 6, and 7 of the Data Protection Rules mandate that the body corporate obtain consent from data providers prior to any collection, disclosure, or transfer of data. Rule 5 also requires a body corporate to disclose the purpose of collection, the intended recipients of the information, the particulars of the collecting agency, where the collected information will be stored, and the details of its intended use.6 These stipulations are limited in their applicability to sensitive personal data or information. Given how narrow the scope of sensitive personal data or information is, a large amount of data remains open to processing without prior consent or adequate disclosure to data providers.

Consent and notice do not fare well in a world of ubiquitous data exchanges. Where information is continuously collected through sensors on a real-time basis, it would be practically impossible to obtain prior written consent before each instance of data collection. Instead, services such as WhatsApp and Facebook have incorporated a perpetual consent on the part of their users within their terms of service.

Further, it is rather difficult for data collectors to provide the particulars of the purpose and usage of information collected in real time, and it would be impossible for them to identify the multiple hands through which the data may pass in the future, since newer uses of big data continue to be uncovered as data processing analytics improve. This is further complicated by the requirement to notify the data provider of the intended use of the data, which is directly tied to the actual collection and usage of the data. Data collectors are barred from collecting data beyond what is necessary for the function or activity of the body corporate7 and from using the collected data in any manner that is not disclosed.8 Data collectors also cannot retain the collected data for longer than is required to meet those purposes.9 The only option left for data collectors is to make broad disclosures in their terms of service of the potential uses of all existing and future data, which ends up being completely unfruitful in terms of data protection.

6. Rule 5(1) and 5(3), Data Protection Rules
7. Rule 5(2), Data Protection Rules
8. Rule 5(5), Data Protection Rules
9. Rule 5(4), Data Protection Rules

The effectiveness of prior consent and notice itself remains doubtful. Individuals usually ignore such notices or face difficulty in understanding their scope given the complexity of data flows. Sometimes individuals have no choice but to agree to the terms of service to avail themselves of the desired service or product. As such, obtaining prior consent is increasingly becoming a symbolic exercise.

Opting-out

While availing of a service or product, or otherwise, individuals may at any time withdraw their consent to share their data with the body corporate. Such withdrawal must be communicated to the body corporate in writing. Once a person has opted out, the body corporate has the option to cease provision of the service or product for which the data had been sought.10 Yet no provision has been made to allow data providers access to their past data stored by data collectors so that they may switch service providers. Further, the concept of data deletion has also not been recognized.

Reasonable Security Practices and Procedures

Section 43A of the IT Act requires a body corporate possessing or handling sensitive personal data or information in a computer resource to implement ‘reasonable security practices and procedures’ to protect such information from ‘unauthorized access, damage, use, modification, disclosure or impairment.’ The Explanation to Section 43A clarifies that these security practices and procedures may be specified in an agreement between the parties or in any law for the time being in force. Consequently, it remains open to data processors to forge agreements with data providers regarding the adoption of security measures for the protection of data. Such a discretionary stance favors data collectors over data providers because, in actual practice, individuals usually fail to grasp the finer points of such terms of service and trade off their personal information to access or acquire the service or product.

Rule 8 of the Data Protection Rules also lacks specificity, in that it gives the body corporate the discretion to formulate its security control measures so long as its security practices and standards for the protection of information assets are commensurate with the nature of its business. However, there is no clarity as to how this data security threshold shall be determined, leaving the extent of security measures to be put in place for data protection to the discretion of data collectors.

10. Rule 5(7), Data Protection Rules

Any negligence in implementing or maintaining these reasonable security practices and procedures that results in wrongful loss or wrongful gain to any person attracts liability, inasmuch as a body corporate may have to compensate or pay damages to the affected persons.11 However, this liability has been narrowly defined to accrue only in respect of ‘sensitive personal data or information’ and only where there is ‘wrongful loss or wrongful gain’, not mere loss of privacy. Furthermore, excluding the minimal residuary liability arising under Section 45 of the IT Act, the unauthorized or negligent divulgence of other personal information, other than information obtained under the terms of a lawful contract, has not been penalized under the IT Act or the Data Protection Rules.

THE WHATSAPP CASE

Data misuse concerns recently manifested in the matter of Karmanya Singh Sareen v. Union of India12 before the High Court of Delhi. In this case, privacy activists filed a writ petition challenging the new terms of service of WhatsApp, by virtue of which the application can share its users’ data with Facebook. When WhatsApp was launched in the year 2010, it had promised complete privacy protection to its users and had assured them that their data / details would not be shared in any manner. However, with the change in ownership following the acquisition of WhatsApp by Facebook for $19 billion, WhatsApp’s privacy policy has undergone a drastic change. Today, the account information of all users who have not opted out of the new terms of service is being shared with Facebook and other group companies and is being subjected to Facebook’s deep and sophisticated data mining for the purpose of targeted commercial advertising and marketing. The petitioners claimed that this unilateral action of revising the terms of service and taking away the privacy protection of users contradicts the most valuable, basic, and essential feature of WhatsApp: complete security and protection of privacy.

Counsel for WhatsApp countered that WhatsApp values the privacy of its users, as is evident from the fact that (i) it does not ordinarily retain messages of its users and (ii) it offers full end-to-end encryption for its services, such that neither WhatsApp nor third parties can read user messages. Moreover, users who are unwilling to share their account information with Facebook or other group companies are free to delete their WhatsApp account using WhatsApp’s in-app ‘delete my account’ feature. Upon deletion, such information of prior users as WhatsApp no longer needs for the operation of its services would automatically stand deleted. Specifically in respect of the revision of its terms of service, counsel averred that WhatsApp had provided advance notice to its users, and only those users who have chosen to continue with the service are bound by the revised terms, including the terms relating to data collection and usage.

11. Section 43A, Information Technology Act, 2000
12. Karmanya Singh Sareen v. Union of India, W.P. (C) 7663/2016, High Court of Delhi. Available at http://lobis.nic.in/ddir/dhc/GRO/judgement/24-09-2016/GRO23092016CW76632016.pdf (Last visited October 16, 2016)

In response, the petitioners contended that the consent obtained by WhatsApp for its new privacy policy under the revised terms of service is only a facade, because not everyone who uses WhatsApp in India is equipped to read, much less comprehend, its terms of service. They further argued that the change in the privacy policy is contrary to the principles of estoppel and is against the right to privacy guaranteed under the Constitution of India.

Finding that the terms of service of WhatsApp are not traceable to any statute, the Court, at the outset, ruled that the petition, being in respect of a contractual dispute, is not amenable to the writ jurisdiction of the High Court. Yet, surprisingly, it still went on to consider the submissions of both parties.

The Court held that users cannot now compel WhatsApp to continue with its original terms of service, since the original terms entitled WhatsApp to unilaterally change its privacy policy and stipulated that continued use of the WhatsApp service after amendment of the privacy policy would be treated as “deemed consent” to the revised terms. Additionally, the Court observed that no relief could be granted under the Constitution of India because the legal position with respect to the “right to privacy” is, as yet, undecided. The Constitution does not specifically guarantee a right to privacy, and the judicial interpretation that the Constitution does provide a (limited) right to privacy, primarily through Article 21, is under challenge before the Supreme Court of India in the pending case K.S. Puttaswamy v. Union of India.13 As such, further to WhatsApp’s terms of service, the Court (i) directed WhatsApp to completely delete the information / data / details of those users who have chosen to delete their WhatsApp account and (ii) restrained WhatsApp from disclosing to Facebook or any of its group companies the information / data / details of users who have opted to continue using the WhatsApp service, insofar as that data was collected under the original terms of service.

13. K.S. Puttaswamy v. Union of India, (2015) 8 SCC 735

One glaring shortcoming of this case has been the Court’s failure to consider the sanctity of user “consent” as regards the disclosure of their information / data / details to Facebook and/or other group companies and the usage of such collected data for the purposes of data mining, as well as the scope and limitations of that consent. The petitioners’ contention regarding the inability of a large number of Indian WhatsApp users to understand what it is that they are actually consenting to under the new terms of service remained unanswered in the decision.

We believe that this case highlights a lack of critical understanding of the concept of data protection14 in India, and in particular of the concept of “consent.” In our view, instead of focusing on the users being uninformed and incapable of understanding the terms and conditions, the petitioners should have argued that consent must be freely given, specific, and “informed.” While consent may be the condition for processing data under the Data Protection Rules, it does not mean that consenting individuals must surrender their right to privacy. In the context of big data, we argue that the complexity of big data analytics cannot be an excuse for failing to seek appropriate consent, particularly when there is a change of “purpose.” This lack of clarity regarding the scope of “consent” under the existing laws underscores the need for a stronger regulatory framework to govern data collection and processing, over and above mere self-regulation, to meet the challenges and risks of big data.

MEETING THE ‘BIG’ CHALLENGE

Big data and the lack of progress regarding the “right to privacy” have created the need for urgent government action to develop a comprehensive regulatory framework for data protection. We argue that, while setting out such a regulatory framework, the government should revisit the issues of the sensitivity of collected data and its corresponding degree of protection; the right of data providers to access their data at any point in time; the right to transfer data to a different service provider; the right to have data erased (the right to be forgotten) from servers even when such data was collected with consent; the obligation of data collectors/processors to keep the collected data secure; the obligation to inform data providers when data has been breached; the standard of accountability of data collectors/processors; sanctions and remedies in case of data breach; and reasonable means for data providers to exercise their rights.

14. Data protection legislation requires a body corporate to process personal data fairly and lawfully. The main purpose of these principles is to protect the interests of individuals, which creates a perception of privacy.

The EU data protection law is increasingly promoted as the gold standard for data protection laws around the world, as it strives to balance the commercial concerns of data processors with privacy protection. While it stipulates strict accountability obligations upon businesses engaged in data controlling and processing, and lays down steep fines in case of non-compliance, it also ensures a smoother flow of data across data exchange channels by establishing uniformity in the regulation of data processing. The EU law has an expanded territorial reach, as it covers data processors based outside the EU whose data processing activities relate to EU data subjects. Further, the law provides a wider definition of ‘personal data’ by specifically including information relating to location data, online identifiers, and the physical, physiological, genetic, mental, economic, cultural, and social identity of a person, as long as such information is capable of identifying the person, either directly or indirectly. The law also shifts its focus from merely mandating notification of data collection to requiring data controllers engaged in high-risk operations to set up effective procedures and mechanisms for data protection and to carry out impact assessments ascertaining the likelihood and severity of a data breach. As a result, more countries have not only adopted its uniform approach to regulating the processing of personal data but have also accepted its concept and definition of personal data.

While treating the EU law as the model standard for legislating on data protection, the government can also examine other aspects of data processing captured in it, including the right of data providers to object to their data being processed for direct marketing purposes, the regulation of profiling based upon collected or collated data, the deletion of data upon completion of processing, data protection impact assessments to ascertain the risk involved, privacy by design, and data lifecycle management.

The government can also seek to eliminate data protection measures that have become obsolete. The quantitative limitation upon the collection of data could be removed as the ubiquity of big data is embraced, data usage consents could be disentangled based upon the essentiality or non-essentiality of the data for the provision of goods or services, and the qualitative limitation upon data collection could be expanded.

The government should also establish stricter laws against data misuse. As is evident from the WhatsApp case, the current approach of requiring the purpose and usage of collected data to be bound by contractual terms has proven insufficient for protecting data providers. Therefore, certain regulatory limits must be placed upon the manner in which collected data may lawfully be used. One possible limitation upon data usage could address the use of data for purposes that are incompatible with the original intent of data collection. However, this does not necessarily imply that data collectors should explicitly and narrowly specify the purposes for which data is collected, as any such requirement would effectively end the inherently explorative aspect of big data mining. Yet, to keep data usage in check, data collectors and processors could be obligated to ensure the transparency and traceability of data usage, as sketched below. Herein, achieving a balance between the data protection principle of purpose limitation and the object of big data processing is the key to facilitating data mining.
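
As one illustration of what the ‘transparency and traceability of data usage’ suggested above could look like in practice, the following minimal sketch, written in Python with hypothetical record structures and purpose names of our own choosing (nothing here is prescribed by the IT Act or the Data Protection Rules), logs every use of a collected dataset against the purposes originally disclosed to the data provider and flags uses that fall outside those purposes.

```python
# A hypothetical sketch of a data-usage audit trail: every use of collected data is
# recorded against the purposes disclosed at the time of collection, so that uses
# incompatible with the original purpose can be detected and reviewed.
# The field names and purposes are illustrative assumptions only.

from dataclasses import dataclass, field
from datetime import datetime, timezone


@dataclass
class UsageRecord:
    dataset_id: str
    used_by: str          # the processor or group company using the data
    purpose: str          # the purpose for which the data was actually used
    timestamp: datetime = field(default_factory=lambda: datetime.now(timezone.utc))


@dataclass
class CollectedDataset:
    dataset_id: str
    disclosed_purposes: set[str]              # purposes notified to the data provider
    usage_log: list[UsageRecord] = field(default_factory=list)

    def record_use(self, used_by: str, purpose: str) -> UsageRecord:
        """Append a usage record; the log itself is what provides traceability."""
        record = UsageRecord(self.dataset_id, used_by, purpose)
        self.usage_log.append(record)
        return record

    def incompatible_uses(self) -> list[UsageRecord]:
        """Return uses whose purpose was never disclosed to the data provider."""
        return [r for r in self.usage_log if r.purpose not in self.disclosed_purposes]


# Example: data collected for service provisioning is later used for ad targeting.
dataset = CollectedDataset("user-activity-2016-10", {"service_provision", "billing"})
dataset.record_use("telecom_operator", "billing")
dataset.record_use("group_company", "targeted_advertising")

for r in dataset.incompatible_uses():
    print(f"Undisclosed use of {r.dataset_id} by {r.used_by} for '{r.purpose}'")
```

Such a log would not by itself restrict the explorative use of data, but it would make a later, incompatible use visible and reviewable, which is the balance between purpose limitation and big data processing argued for above.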

As newer purposes of big data processing continue to emerge, the government, even as it establishes a regulatory framework for data protection in India, should ensure that this framework is fluid enough to keep pace with evolving technologies. The government should also develop more coherent policy outside the remit of data protection, including in the area of consumer protection, for the holistic protection of data providers.

TO CONCLUDE

Despite the many prospects and benefits of big data analytics, big data processing poses serious risks to privacy. The question here is not whether to apply data protection laws to big data, but how to apply them innovatively. In the absence of a specific data protection framework, and with the growing ubiquity of data collection, the limited protections of the Information Technology Act and the Data Protection Rules make it increasingly difficult to protect data privacy. These challenges are particularly evident in the traditional strategies of privacy policy adoption, the requirement of consent and notice, and the provision for opting out. Thus, it is imperative that the issues of privacy and data protection in the context of big data be immediately taken into account and that a comprehensive data protection framework, in tune with the latest technological advancements in big data processing, be set out by the government.

DR. SUBHAJIT BASU is an Associate Professor at the School of Law, University of Leeds, and RHYEA MALIK is an Advocate based in New Delhi.