Since at least 2012, tech firms have consistently topped the list of companies in terms of market capitalisation, indicating the enormous financial power that the business of data wields. At the same time, governments around the world – including in South Asia – have been harvesting their citizens’ personal data and perfecting methods of digital surveillance.

Himal Southasian’s Shubhanga Pandey and Raisa Wickrematunge spoke to Anita Gurumurthy about the states’ and corporations’ perennial hunger for data, new forms of labour control, and possibilities of a Southern framework for data sovereignty.

You can listen to the full interview here:

This is an unedited transcript of the podcast. Please listen to the interview before quoting from it.

 

Shubhanga Pandey: This is Shubhanga Pandey with Himal Southasian and I’m joined by my colleague Raisa Wickrematunge. Today we’re speaking to Anita Gurumurthy, who has been working on issues of internet governance, data and surveillance, and is the executive director at IT for Change based in Bangalore. In this episode of the podcast, we’ll be talking about how our personal data has become an important tool for the state, and an important commodity for large corporations – and why that should be a serious concern for all of us. Welcome to Himal podcast Anita.

Anita: It’s a pleasure to join you.

SP: So, what brings you to Colombo?

AG: I’m here as part of a resource exchange to have a conversation with activists, researchers and social movements in this country, and exchange ideas about digital rights. And this has been organised by the Neelan Tiruchelvam Trust so that we can actually build out a kind of concerted strategy across countries, across borders, in order to think through civil-society strategies to address digital concerns.

SP: So, I’ll begin with a question on how governments in South Asia have been increasingly interested in centralising our information, from citizenship to biometrics to health and financial data. How true is the claim they make that this is all for governance and service delivery? And is it just the state or are there private parties that are also interested in expanding this trend?

AG:For a long time governance has always been a highly technicalised area. This is part of the new management theories and administration. And as a managerial endeavour, governance can be seen something that can be bettered, and bettered through efficient tools, and this is where digital technologies enter the picture. Computerisation and informatisation is a necessary part of the governance framework of the modern state or contemporary government. And usually citizen rights are put into this framework without really having an overarching political ideology. Having a database with information is seen as a necessary part of this approach or this ideology. And so at some level, you could say that the centralisation is a continuity of the older frame. But, at a new level, when digital technologies enter the picture, the centralisation happens at an unprecedented scale.

Databases become very important to governments, and the government actually works through several siloed databases, in different social sector departments, different industrial departments. But today, when you actually are able to centralise all of this through a digital ID program that enables the state to really be a police state. That is what activists have been calling out. And this also enables a nexus between the state and the private sector, state and big capital. In India, for instance, a lot of civil0society activists have been protesting the integration of the Aadhar or digital ID for opening bank accounts or procuring mobile phones. In fact, the Supreme Court held up that this was in violation of the privacy judgment we had last year.

Platform companies would not make the revenue they make, but for the ways by which they are able to surveil workers constantly through algorithms.

Even then, the state has hastened, in some ways, to pass an ordinance to allow the banking sector to have access to this kind of data. In fact, through the ‘know your customer’ feature, it has enabled – the ordinance has enabled – banks to directly have access to Aadhar information. This is again being resisted by civil society, but this is an ongoing battle, and what we see is that the powers that want this behind the state are very big entities, including domestic capital in India, particularly the emerging fintech sector.

SP: So, moving on to the private players, you’ve written about how different platform models, from Uber to Amazon, have really intensified this extraction of data and of value of profit, both from their users but also from workers, who are usually called ‘gig workers’ because they’re not employed by these organisations. Can you explain that extraction and why it’s a problem?

AG: There are two, three very significant areas of exploitation that have come to light in the gig economy. The first is that database control is actually a new way of disciplining workers. You know, it’s a kind of surveillance of workers 24 by 7. And this has become very, very central and core to the accumulation strategies of platform companies. They would not make the revenue they make, but for the ways by which they are able to surveil workers constantly through algorithms.

The second important issue is that workers in this kind of arrangement are denied the formal employment status that they ought to have. So, there is a fiction of the independent contractor. We all know that drivers are called driver-partners. So, this kind of vocabulary obfuscates the actual exploitation. Also, we see that there are new kinds of outsourcing chains, where Amazon uses contractors – in India it uses a contractor called Shadowfax to subcontract workers. So, this is a kind of passing on of liability so that it’s actually operating in a way, although it employs these drivers, it employs these logistics workers – whether it’s Uber or Amazon – they’re able to completely stay immune from any kinds of local obligations or liabilities in the law.

The good news about this is that workers are standing up against all of this. All over the world workers unions are trying to see how they can have a modicum of control over the data that they’re generating for companies. There are court cases where workers have demanded, particularly in the UK, that the company returns back some of the data it actually has and tells them exactly what kinds of data about workers it holds illegally.

SP: Some of this practice is also seen, I think, in textile industry, where this value chain allows for this kind of immunity to producers. I wanted to talk more about the data; you’ve talked about data and how it can be seen as a public good. Can you elaborate on that?

AG: In an era where everything is ‘datafied’, data in and of itself cannot be seen as the villain in the piece. There’s something about data as a resource that – I mean, it’s not oil, it’s really not uranium, it’s something else. So, data as a resource needs to really be given a certain normative anchor and a kind of value-based peg that allows societies to use it for what we really believe should happen for social justice and equity. So to take an example, if you want to help small farmers in any country, it should be possible for the state to step in and say that with existing and legacy data sets about, let’s say, the weather or about soil or about seeds, and with real time data from small farms, how can the state or how can public universities and how can farming communities collectively benefit from this data? That would be essentially a framing of public data. And as you can see, you know, there will be applications across different sectors, across different kinds of development challenges and questions that we’re facing.

So, a public data framework essentially is to reclaim and appropriate data from this kind of surveillance purposes, for the very purposes that will promote social advancement that’s rooted in equity and social justice. This should extend also to what’s broadly the artificial-intelligence paradigm. Everyone is talking about development and AI, and without a public paradigm for AI, we are not going to be able to get there.

Raisa Wickrematunge: Right now we have many companies who have a significant presence in South Asia, but they are operated and owned in the West. You mentioned some of them earlier: Uber, Facebook and Google. At the same time, we’re seeing how these platforms are being misused. The misuse of these platforms is impacting our region, particularly women and minority communities. So, given the fact that there’s this gap between the companies and end users, how do you think this gap should be bridged?

AG: What is normally understood is the kind of impacts that social media companies have on public discourse, and the contamination of the public sphere on account of this kind of free-for-all laissez faire social-media paradigm. While that is really important to check, we also have to understand that the world is getting platformised, and all of corporate activity is now based on some form of platform-based interfaces. In e-commerce you’ll see that it is between the business and the customer. But what you don’t see is also at the B2B layer, which is businesses interact with businesses through platform and data.

Within this framework, we should actually be looking at how the digitalisation of all of commerce and all of industrial activity can actually be regulated and governed from global to local scales. The binding treaty on transnational corporations is one very important move that has been supported by Ecuador and it’s backed by many social movements. Bringing digital companies to book is a very important activity and national regulation and policies are very important. We have a range of tools: policy tools at the national level, from taxation to competition, antitrust, etc.

In an era where everything is ‘datafied’, data in and of itself cannot be seen as the villain in the piece.

RW: In the context of political ads on platforms like Facebook, which is quite topical right now, you’ve argued the need for a regime of content governance. Just wanted to unpack that and ask what that would look like, and who would be the custodians, given that self-regulation of the platforms isn’t good enough, and also because there does seem to be a misunderstanding of what the media means on the part of the state. So, in Nepal for instance, there’s a law which requires social-media websites to be registered. In Sri Lanka we’re seeing social-media platforms blocked when there’s hate speech and misinformation spreading during crises. So how would a contents governance regime – what would it look like?

AG: There are many pieces that have to come together in order for us to have a social-media paradigm in which rights of users, rights of citizens are safeguarded. One important thing is that there is a traditional unease amongst activists who are rightly concerned about free speech and freedom of expression and undue interference from the state. But at the same time, we have to be very cognisant that, in the kind of virality that we are experiencing today, the nature of speech itself is changing.

A concrete example of that would be that we are not only talking about textual speech, we are also talking about memes, we’re taking about GIFs, we’re talking about forms of self-representation that go beyond traditional laws that used to point to, let’s say, indecent representation or something that was causing public harm. So, the ideals of public harm will need to be revisited. Once they are revisited, it’s important for the legislative framework to identify what’s desirable communication, what’s undesirable communication, and come up with a framework that then the courts can use to adjudicate properly.

However, every single case of aberration or uncivil speech cannot go up to the court, you know, you would simply have too many cases, and it’s really not pragmatic. This is where social media companies come in and there is a case for community standards and self-regulation. But community standards does not mean US speech standards. We really have to have engagement through activism with all these social-media companies at the local level.

A recent example of, you know, some kind of soul searching that they have done is TikTok. WhatsApp did the same soul searching a few years ago when messages were going viral, and then they said, we will have a techno-design fix and we will say, group messages, you know, you can’t forward messages to more than five people at a time. So that is a kind of corporate responsibility that we seek. Corporate liability also is very, very vital. And that really goes back to national laws, which have to adhere to human-rights principles.

RW: Building on from that, should the state be custodians of data, considering that in regions like ours the state has targeted and surveilled those who critique the government or the state? How could this be safeguarded in a context where there is no political or economic incentive to enforce both data privacy and transparency?

AG: So this is, I think, the worst of all paradoxes, right? We have big tech on the one side and we have big brother on the other side. It’s the kind of dilemma we are all living through and we’re grappling with what kinds of solutions, right. Part of the solutions are in the cultural realm, we need to dialogue with our neighbours, we need to dialogue with our relatives who send those awful WhatsApp messages. But at the same time, this game is not only to be seen as highly micro-cultural or micro-local. There is a big power game there and we need new kinds of democratic institutions to both check state power and to check corporate power. Solutions have to be both multilateral, international as well as local, and always, I think, going through processes of participatory democracy and consultation.

A couple of ideas come to mind, I think, at the national level, in addition to data protection laws, which are cognisant of privacy, we need a new institutional system for data ombudsmanship, to govern data as an economic resource, so that we don’t go to town breaking all kinds of laws, ending up with ecological disasters because of data extractivism. Data extractivism and extractivism of natural resources are but two sides of the same coin and allowing the way companies are actually extracting data is just going to lead to hyper consumption and a destruction of resources. So, we do need a kind of institution that is free of state interference, looking at data as an economic resource. It’s almost like setting up a judiciary that’s an independent body.

We also need to articulate a sovereignty framework for data, and today the European idea of sovereignty is that the user is the sovereign person. But sovereignty also for us coming from Southasia means many things. It means sovereignty of communities, indigenous groups, of groups that have been managing forests, preserving seeds. Therefore, building upon these ideas, we really need to articulate ideas of community data, collective data, data for cities, etc. And we have enough experiences to anchor these futuristic perspectives considering that we have a long history of communitarianism.

RW: Thank you, Anita Gurumurthy for joining us and speaking to us about the commodification of data and the possibility of reclaiming it as a public good.

This interview was originally published on Himal Southasian.