Interview with Professor Jason M. Schultz
In our vast and complex society, there are concepts and challenges that require a nuanced understanding, and conversations with experts in the field greatly enhance our ability to grasp them. One of the defining challenges of the 21st century is the emergence of data collection and analysis techniques at a massive scale. "Data is the new oil", an oft-quoted refrain, captures the multi-dimensional complexity of the subject. Companies such as Google, Facebook and Amazon have devoted considerable resources to building business models premised on advertising revenue generated from meta-data analysis of their users' online habits. While users are unlikely to agree to a paradigm where they pay for services such as those offered by Google and Facebook, the ability of these companies to profile their users in deeply personal ways, ranging from eating habits to sexual orientation, raises far-reaching concerns of privacy and corporate power. We sat down with Professor Jason M. Schultz, Professor of Clinical Law at New York University, to discuss these issues.
Professor Jason M. Schultz teaches law at New York University in the United States. From 2016 to 2017, he served at the White House Office of Science and Technology Policy as Senior Advisor on Innovation and Intellectual Property to U.S. Chief Technology Officer Megan Smith. He is also the Director of NYU's Technology Law and Policy Clinic and Co-Director of the Engelberg Center on Innovation Law & Policy. With Aaron Perzanowski, he recently co-authored a book on the changing nature of technology purchases in the digital world, titled 'The End of Ownership: Personal Property in the Digital Economy'.
His research centers on the legal aspects of innovative technologies and intellectual property. Professor Schultz co-authored a research paper that has been cited by the Justice B.N. Srikrishna Committee, set up by the Government of India to examine the various aspects of a data protection framework for India. He also recently co-authored an article in The Indian Express on the twin frameworks of big data regulation, arguing strongly in favor of a consent-based framework for data collection.
Q. Let us begin with India's Aadhaar project. You have followed the project as an academic. How do you weigh its efficiency considerations, given the scale of India's social security apparatus and the need to weed out corruption, against its privacy concerns?
Ans. When I worked for President Obama’s Office of Science and Technology Policy, we championed technology as a means of accelerating government efficiency. There’s nothing wrong with that. But before recommending any new innovations, we undertook extensive cost-benefit studies to assess the pros and cons, including impacts on information privacy and security. The proponents of Aadhaar clearly have good intentions, but I worry that the system wasn’t subjected to sufficient scrutiny, and had it been, additional protections might be in place. For example, there are advanced methods of using computer encryption that would allow verification of identity without needing to record and surveil each person using the system. The Touch ID feature on Apple’s iPhone is designed to work this way. It verifies your identity to open the phone but does not need to tell Apple every time you use it. If governments assess, test, and explore such approaches, we can have innovations that are both better for government and better for the people.
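The privacy property Professor Schultz describes can be illustrated with a minimal sketch (an illustrative simplification of the general idea, not Apple's or UIDAI's actual implementation): the device stores only a salted hash of the enrolled credential, and verification produces a local yes/no answer, so no central server needs to log each authentication. Real biometric matching is approximate and relies on secure hardware rather than exact hashes; the sketch only demonstrates that verification can happen entirely on-device.

```python
import hashlib
import hmac
import os

class LocalVerifier:
    """Stores only a salted hash of the enrolled secret, on-device.
    Verification happens locally; nothing is transmitted to a server."""

    def __init__(self, enrolled_secret: bytes):
        # A random salt ensures the stored digest reveals nothing reusable.
        self._salt = os.urandom(16)
        self._digest = hashlib.pbkdf2_hmac(
            "sha256", enrolled_secret, self._salt, 100_000
        )

    def verify(self, candidate: bytes) -> bool:
        candidate_digest = hashlib.pbkdf2_hmac(
            "sha256", candidate, self._salt, 100_000
        )
        # Constant-time comparison; only a boolean leaves this function.
        return hmac.compare_digest(self._digest, candidate_digest)

verifier = LocalVerifier(b"enrolled-credential")
print(verifier.verify(b"enrolled-credential"))  # True: confirmed locally
print(verifier.verify(b"someone-else"))         # False: rejected, no remote log
```

The design choice to compare digests on-device, rather than send the credential to a server, is what removes the need for a central record of every authentication event.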
Q. Your co-authored research paper, “Big Data and Due Process: Toward a Framework to Redress Predictive Privacy Harms”, has been cited by the Srikrishna Committee constituted by the Government of India to look into questions of data protection in India. Could you explain to our readers the subject and conclusions of your research?
Ans. The paper addresses a shift in privacy concerns for big data computer systems. Traditional privacy concerns have focused on websites that collect our information and then exploit it or disclose it. Big data systems have those issues too, but they also have the ability to classify and predict personal information about us, such as the case where Target Stores used big data to try to predict which of its regular customers were pregnant, and ended up shocking a teenage girl and her parents by mailing them coupons for baby diapers. The paper recommends that to prevent these and other serious “predictive privacy” harms, individuals should have due process rights when big data is being used to judge them, especially in areas such as law enforcement, housing, food, education, and other basic societal needs.
Q. In your paper, you refrain from precisely defining the term ‘Big Data’. From a legal standpoint, what are the issues with giving the term a precise definition?
Ans. Laws always require some definitions, but it’s dangerous to get too caught up in a particular battle over definitions, especially when technological developments are moving fast. The field of data science has an evolving definition of big data and predictive analytics that can inform the debate, so we happily deferred to those experts to speak to the current boundaries. Either way, our framework works with the current approach.
Q. What exactly is ‘Big Data Exceptionalism’ and why should an average technology consumer be concerned about it?
Ans. Big Data Exceptionalism is the idea that somehow traditional privacy norms – such as the need for informed consent and minimizing the amount of data you collect and use – no longer apply when we have complex data systems. That’s simply not true. There are new challenges to address, but they can be addressed with additional protections. There is no need to strip away the traditional protections. Both can live in harmony with each other.
Q. Most of the applications that consumers use, such as Facebook and Google, are available for free primarily because their business models are premised on the collection of meta-data about their users’ online activities. Given this business model, do you think it is possible to effectively implement a consent-based framework for the collection of user data?
Ans. Absolutely. Most people are quite comfortable trading some of their data for services, but as Professor Helen Nissenbaum has written, privacy is contextual. I may be fine with my friends knowing where I go for vacation, but not my boss. The key to controlling context is informed consent. If Facebook only shares my vacation photos with my friends, great – they make money and I’m happy. If it shares them with my boss, I want the power to delete those photos and my data from their servers.
Q. The UK and the EU are harmonising their data protection regimes, with the UK planning to implement a new Data Protection Act to replace its 1998 Act. In addition, the EU has introduced the General Data Protection Regulation (GDPR). What is your opinion on the viability and future-readiness of these frameworks?
Ans. Both the UK and EU GDPR frameworks are complex, but fundamentally, they are about making sure companies like Facebook and Google are responsible for how they use our data and to make sure we have insight and oversight of what they do with it. I’m optimistic that they will help find a good equilibrium where the companies continue to innovate and thrive at the same time that consumers feel protected and well-informed as to how their data is used.
Q. How effective do you think the existing US legal framework is in ensuring data protection? Are there any particular provisions of American law that you think could be imported into other jurisdictions?
Ans. Honestly, American privacy law is lagging behind. We have focused on specific sectors – health, financial, education – without a comprehensive data privacy law. In today’s networked world, this makes no sense. Data is transmitted and traded across all sectors of the economy; thus, we need a comprehensive approach.
Q. A case recently emerged in Arkansas where Amazon contested requests from local law enforcement for recordings from a murder suspect’s Amazon Echo. Amazon challenged the request by citing First Amendment protections, arguing that the recordings should be considered free speech. The accused ultimately agreed to release the recordings voluntarily, and the petition was withdrawn without a decision. Nonetheless, do you think this is a viable challenge to such requests?
Ans. It was a fascinating case, for sure, and I think a First Amendment/privacy argument could succeed, but Amazon can’t have it both ways. If they want to support user privacy, they need to make sure they have explicit user consent to use those recordings for internal business purposes. A simple click-through may not be enough, so I’d urge Amazon and other companies to do better.
Q. This was possible because of the exceptionally strong protections for free speech in the US. How should other democracies, which do not have such broad protections for free speech, proceed in order to ensure privacy is maintained as potential recording devices are voluntarily brought into homes?
Ans. Each country is different, of course, but India is an example of a democracy that has many lawyers and judges who understand the connection between human rights, privacy, and technology. Just as human rights law protects individuals against unlawful government surveillance or invasions of the home by police, it can do the same for digital privacy, especially situations involving devices that we have bought to help us modernize our own homes and lives.
Q. You have recently co-authored a book, 'The End of Ownership: Personal Property in the Digital Economy'. Could you explain to our readers what the book is about?
Ans. The book discusses a massive shift in the digital economy away from personal property rights to fragile and temporary rights in the objects we buy. From ebooks to smartphones to children’s toys, more and more devices are being sold to us in stores with hidden fine print claiming that we no longer own these objects – instead, the companies that make them (and the software inside them) do. This has serious implications for our rights as consumers. If the companies own the things we buy, they can take them away arbitrarily, they can invade our privacy by using the object to spy on us, and they can prevent us from repairing them, reselling them or even gifting them to our families. The book documents this shift and recommends ways to restore consumer rights in the age of digital goods.
Q. Do you think that this new model of license-based purchases – where a product can be taken away from the consumer even after the purchase – has the potential to interfere with fundamental freedoms of choice? What can consumers do about it?
Ans. There are several options available to consumers. They can push companies and lawmakers to clarify what rights consumers do and do not have when they buy a particular item. They can also push for laws ensuring we always retain the standard rights of property ownership – the right to resell, the right to repair, the right to privacy, etc. Through these mechanisms, we can ensure a harmonious relationship between evolving technologies and consumer protection.
About the Author
Prashant Khurana is a student of law at the Faculty of Law, Delhi University. He holds a Bachelor’s degree in History from Hansraj College, Delhi University. Prashant is an accomplished debater and an active participant in and organiser of Model United Nations conferences, and was recently invited to serve as a Chairperson at the University of Kent, United Kingdom, for its MUN conference. He has appeared as a guest panellist on the Headlines Today (now India Today) news channel and has interviewed personalities such as Mr. Mani Shankar Aiyar, Dr. Sambit Patra and the Ambassador of Canada to India, among others.