Guilty until proven otherwise in the era of ‘perpetual digital search warrants’

In the age of ‘digital search warrants’, the principles of natural justice do not apply and there are no defined thresholds whose breach would justify an inquiry into an individual’s data. The warrant becomes active the moment an individual establishes a digital footprint, and it is open-ended and perpetual, with no point of termination.

by Shikha Mehra - September 10, 2020, 1:39 pm

In this series, my aim is to question assumptions, examine perspectives and challenge ideas. In my third piece, which I hope you will all read and enjoy, I will examine privacy and freedom through the lens of a “Perpetual Digital Search Warrant” and Surveillance Capitalism.

Let’s start with a peculiar story from Minnesota, USA, where the father of a 16-year-old girl learned of his daughter’s pregnancy through coupons for baby products that arrived in the mail. The coupons were sent by Target, a leading retailer that records every purchase along with the details of the purchaser. This data is then processed by a third party that analyses customers’ purchasing habits and forecasts potential pregnancies and delivery dates. Shoshana Zuboff, in her book The Age of Surveillance Capitalism, calls such forecasts ‘prediction products’, which are later traded in what she describes as “behavioural futures markets”. Because these prediction products are sold to advertisers, the applications on my smartphone magically show advertisements for products I had only thought of buying. Unfortunately, thanks to this ever-present surveillance, the algorithms designed to predict behaviour from our data are accurate, as in the case of the 16-year-old girl in the US, whose father knew less about his daughter than an algorithm did.


Google spied on its customers through a home security device containing an undisclosed microphone, Facebook was found by a Belgian court to have illegally collected identifiable personal information of Belgian nationals through the use of cookies, and Amazon Echo constantly collects data from personal conversations in one’s living room. This profitable but privacy-infringing modus operandi of big-tech companies is possible primarily because of the monetary value attached to personal data. If data were viewed as a public good that can only be utilised for purposes beneficial to society, private companies would have far less incentive to gather and process personal data. Yet even if personal data continues to be viewed through the prism of economic value, a monetary value must be attributed to the act of consensual sharing of data by its true owner, the user. Moreover, the extraction of personal data through overarching surveillance, without the consent of the user, has become the norm for private entities. The result is a society in which individuals make decisions not through the exercise of free will but through the predictions of algorithms, fed by metadata that is available in abundance and whose volume grows exponentially. Here are just a few key numbers, scaled to a monthly basis (2018), for fun:

• 42,033,600,000 Facebook logins

• 159,840,000,000 Google searches

• 1,641,600,000,000 WhatsApp messages sent

• 8,078,400,000,000 emails sent

Algorithms get smarter and more accurate in their predictions as the dataset available to them grows in breadth and depth. It is also worth noting that in 2019, we humans spent, on average, 42% of our waking time online, and that figure is likely to be much higher for 2020 and beyond, given the ongoing migration of activities from physical to digital media.

What are the repercussions of such unauthorised and unreasonable surveillance when it is carried out by governments or political parties? Cambridge Analytica used psychometric profiling to influence the electoral process in the United States. It processed information entered by users in a personality test on Facebook to categorise voters into various profiles. This information was used to predict the political preferences of potential voters, who were then targeted with micro-messaging and tailored campaigning designed to sway them towards a particular candidate. Similarly, in the 2017 Kenyan elections, political parties used real-time bidding to target specific demographics of voters. Real-time bidding is the automated auctioning of advertisement space on websites; through the use of cookies, websites profile individuals on characteristics such as age group, sexual orientation, political leanings and geographical location. When these advertisement spaces are sold to political parties, they can be used to influence particular demographics and to tarnish the image of opposition parties through false news and propaganda. In India, during the 2019 general elections, mobile phones provided free of cost by the state were used to collect personal information, enabling the profiling of voters; through targeted campaigning, voters likely to vote against the ruling party were then nudged away from their natural political leanings.

The process of voting is anonymous by the design of the secret ballot, which is meant to ensure the individual’s complete control over their choice. Yet through surveillance of an individual’s activity online, algorithms predict voting behaviour, which is then used to push voters against their own free will. This unwarranted intrusion into an individual’s private sphere results in the abrogation of the most fundamental aspect of democracy: the freedom to make our own choices.

This intrusion is not limited to buying choices or the electoral process. Governments are adopting new methods to curtail an individual’s sphere of privacy under the guise of protection against terrorism, money laundering and other illegal activities, relying on outdated privacy doctrines that are incompatible with the digital world. In the United States, the decades-old third-party doctrine holds that an individual has no reasonable expectation of privacy in data voluntarily shared with a third party, allowing law enforcement agencies to access such digitally shared data without a warrant. A notification issued by the Home Ministry of India in 2018 permits certain investigative authorities to intercept, monitor or decrypt any information generated, transmitted, received or stored in any computer resource in India. This notification circumvents the principles of due process: personal details stored digitally can be accessed by law enforcement agencies even though physical copies of the same documents would require a court-issued warrant.

Astonishingly, despite the push to digitalise the economy through demonetisation and to store personal records digitally through Aadhaar and the National Digital Health Mission, India has not enacted any law establishing the fundamentals of privacy. In the absence of any statutory privacy infrastructure, surveillance is still considered the norm, with privacy treated as an unwanted hindrance. In Kashgar, the Chinese government employs military-level surveillance of a civilian population through facial recognition and the monitoring of every aspect of an individual’s being, including religious donations, foreign travel, sleep cycles, economic activities and DNA. In most jurisdictions around the world, a citizen’s freedom can only be curtailed by the due process of law, and a search of private property or documents can only be conducted by law enforcement agencies on the grounds of reasonable suspicion and probable cause; evidently, these protections do not extend to the digital world.

In the age of ‘digital search warrants’, the principles of natural justice do not apply and there are no defined thresholds whose breach would justify an inquiry into an individual’s data. The warrant becomes active the moment an individual establishes a digital footprint, and it is open-ended and perpetual, with no point of termination. In this way, the State has granted itself access to all personal and sensitive information of every individual. Even without probable cause, every individual is under surveillance and is presumed guilty until proven otherwise.

Our understanding of the reasonable expectation of privacy has to evolve if we are to reclaim the rapidly deteriorating principles of freedom in a digital world. Alan F. Westin, in his book Privacy and Freedom, identified ‘limited and protected communication’ among the crucial functions of personal privacy. Westin’s book was published in 1967, yet his understanding of privacy could significantly reduce the invasion of privacy in 2020 through the use of distributed ledger technologies (DLT). Put simply, a DLT stack generally comprises a decentralised network of nodes, each holding information that is verified cryptographically, ensuring transparency, integrity, security and privacy. Information stored on public blockchain networks can be accessed by any user of the network (in a permissionless manner), whereas private blockchain networks restrict access to ‘permissioned’ parties only. Increasingly, hybrid blockchain networks are being designed and implemented within the private sector, and the same model can be replicated within the regulatory sphere. Blockchain-based data management systems can be designed to implement the principles of natural justice in the digital realm: smart contracts can be written to release personal information to the relevant law enforcement authorities only when a defined threshold of reasonable suspicion is met.
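To make the idea concrete, here is a minimal, purely illustrative sketch in Python (a real system would encode this logic in an on-chain smart contract with cryptographic attestation of the issuing authority). The names used here, DigitalWarrant, DataVault and request_access, are hypothetical and do not refer to any existing platform; the point is simply that access is scoped to one individual, time-bound and audited, which inverts the default of a perpetual, open-ended search warrant.

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta

@dataclass
class DigitalWarrant:
    """Hypothetical 'digital warrant': explicitly granted, scoped and time-limited."""
    subject_id: str            # the individual whose data may be examined
    issuing_authority: str     # e.g. a court, recorded on the ledger
    grounds: str               # the stated reasonable suspicion / probable cause
    issued_at: datetime
    valid_for: timedelta       # the warrant expires; access is never perpetual

    def is_active(self, now: datetime) -> bool:
        return self.issued_at <= now < self.issued_at + self.valid_for

@dataclass
class DataVault:
    """Holds personal records and releases them only against an active warrant."""
    records: dict = field(default_factory=dict)    # subject_id -> personal data
    audit_log: list = field(default_factory=list)  # every access attempt is logged

    def request_access(self, requester: str, warrant: DigitalWarrant, now: datetime):
        granted = warrant.is_active(now) and warrant.subject_id in self.records
        self.audit_log.append((now, requester, warrant.subject_id, granted))
        if not granted:
            raise PermissionError("No active, scoped warrant for this subject.")
        return self.records[warrant.subject_id]

if __name__ == "__main__":
    vault = DataVault(records={"citizen-42": {"messages": ["..."], "location": "..."}})
    warrant = DigitalWarrant(
        subject_id="citizen-42",
        issuing_authority="district-court-7",
        grounds="probable cause recorded in case no. 123 of 2020",
        issued_at=datetime(2020, 9, 1),
        valid_for=timedelta(days=30),
    )
    # Access within the warrant window succeeds and is recorded in the audit log.
    print(vault.request_access("agency-x", warrant, datetime(2020, 9, 10)))
    # Access after expiry fails: the warrant is not perpetual.
    try:
        vault.request_access("agency-x", warrant, datetime(2020, 12, 1))
    except PermissionError as error:
        print(error)
```

In a production design, the audit log and the warrant itself would live on the distributed ledger described above, so that neither the requesting agency nor the data holder could alter or conceal the record of who accessed whose data, on what grounds and for how long.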

The spirit of the underlying social contract between the State and the individual in civilised societies provides for a right to freedom and privacy, unless there is reason to believe, under the prevailing laws of the country, that a particular individual poses a menace to society. This social contract can be programmed and autonomously executed in the digital world so that the “due process of law” is followed, rather than the prevalent system of excessive and universal surveillance. Finally, distributed and decentralised technology stacks that grant users conditional control over their data need to be adopted globally to regain freedom and self-sovereignty in the digital space, while balancing the competing demands of individual privacy and surveillance by the State.

The writer is the co-founder of MainChain Research & Consulting Pvt Ltd.