With a sense of urgency to lift Covid-related restrictions, open up our communities, and get economies ticking over again, more than 28 countries have hurried to launch some form of contact-tracing app, and a further 11 will soon launch theirs. Essentially, these apps are surveillance tools that use GPS data or Bluetooth to report your movements and/or contact points to you and/or the authorities. In addition to the country-led apps, leading groups of scholars and companies are developing others, the rare Google-Apple cooperation being the main corporate effort.

The purpose of these apps is to help track the spread of Covid-19 and break the chain of contagion. Alerted to the fact that you might be at risk of infection, you would self-isolate for the requisite two weeks before popping your head out into the world again, or seek medical attention for possible testing. In addition, some of the apps allow health authorities to trace back and warn people who have been in contact with someone who has tested positive.
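The Bluetooth-based variants generally work by exchanging anonymous rotating identifiers between nearby phones and later matching them against identifiers published by users who test positive. The sketch below is a deliberately simplified toy version, loosely modeled on the decentralized Google-Apple design; every class, method, and protocol detail here is an illustrative assumption, not the real specification.

```python
import secrets

def new_rolling_id() -> bytes:
    """Each phone periodically broadcasts a fresh random identifier,
    so observers cannot link broadcasts back to a person."""
    return secrets.token_bytes(16)

class Phone:
    def __init__(self):
        self.broadcast_ids = []   # identifiers this phone has sent out
        self.heard_ids = set()    # identifiers received from nearby phones

    def broadcast(self) -> bytes:
        rid = new_rolling_id()
        self.broadcast_ids.append(rid)
        return rid

    def hear(self, rid: bytes):
        self.heard_ids.add(rid)

    def check_exposure(self, published_positive_ids) -> bool:
        """Compare locally heard identifiers against those published by
        users who tested positive; a match means possible exposure."""
        return any(rid in self.heard_ids for rid in published_positive_ids)

# Usage: Alice and Bob are near each other; Bob later tests positive.
alice, bob = Phone(), Phone()
alice.hear(bob.broadcast())           # Bluetooth exchange during contact
published = bob.broadcast_ids         # Bob uploads his IDs after a positive test
print(alice.check_exposure(published))  # True -> Alice would be alerted
```

Note the key design choice this models: the matching happens on each user's own phone, so the authority never needs a central map of who met whom.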

There are many problems with these apps, even though the most promising among them, at least in theory, have data privacy firmly at their core (though, essentially, everything can be hacked). Firstly, they have never been beta-tested on a small group: their intended purpose is a theoretical construct being rolled out with no real proof of efficacy or value. This despite the fact that most commentators agree that the boundaries of what is generally perceived as acceptable surveillance are constantly shifting, perhaps irreversibly, for the worse.

Secondly, and just as fundamentally, these apps are being developed without the necessary groundwork: a) swift implementation of verifiable and repeated testing of many citizens—not once, not twice, but several times over; and b) well-functioning infrastructure, institutions, or systems to measure, monitor, or correct the failures that undoubtedly will arise, not least because of point a.

Currently, the percentage of the population tested is strikingly low in most countries.

Covid-19 Testing League Table (As of April 17, 2020)

Country   Tests Conducted   Percentage of Population Tested
1. Iceland 39,536 11.59%
2. UAE 767,000 7.75%
3. Norway 136,236 2.53%
4. Switzerland 206,400 2.39%
5. Portugal 208,314 2.17%
6. Israel 187,250 2.09%
7. Germany 1,728,357 2.06%
8. Italy 1,224,108 2.06%
9. Spain 930,230 1.99%
10. Austria 169,272 1.88%
11. Ireland 90,646 1.84%
12. Denmark 87,024 1.55%
13. Australia 391,530 1.51%
14. Canada 503,003 1.34%
15. Belgium 139,387 1.20%
16. Russia 1,718,019 1.11%
17. Korea 546,463 1.07%
18. USA 3,448,157 1.03%
19. France 380,000 0.58%
20. UK 341,511 0.50%
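The table's last column is simply tests conducted divided by population. The helper below shows the arithmetic; the population figure is not part of the table, and the roughly 341,000 used for Iceland is an approximate 2020 figure included purely for illustration.

```python
def percent_tested(tests: int, population: int) -> float:
    """Share of the population tested, as a percentage rounded to 2 dp."""
    return round(tests / population * 100, 2)

# Iceland's row: 39,536 tests against an assumed ~341,000 inhabitants.
print(percent_tested(39_536, 341_000))  # 11.59, matching the table
```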


So What’s the Problem?

  1. The apps will very quickly create a false sense of security, and potentially haphazard behavior, if these testing figures are not vastly increased at the same time. With so few people being tested, you could easily be in contact with several, if not many, infected persons without receiving the app’s alarm warning, and could yourself become contagious. You could also wrongly be flagged as exposed.
  2. Scientists and/or public authorities can only trace back truthfully and meaningfully if many, many more citizens get tested. The only way around this would be for the world’s leading data analysts to work round the clock, analyzing every single person’s whereabouts to disaggregate the many contact points and estimate who the spreaders are; this is neither a secure nor a timely solution. At the same time, we must avoid penalizing people who do not feel comfortable using the app. Italy, for example, is currently discussing ways to reach the critical mass of contact-tracing app users it needs. However these apps are developed and operationalized, they will materially impinge on our privacy, and making their use mandatory would not only violate individual freedoms but also invite legal challenges that could render the whole process uncertain.
  3. None of these apps consider surface-touch contamination, which is a typical phenomenon in large shared spaces, for example supermarkets.
  4. Your privacy rights are under threat.
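Point 1 above can be made concrete with a toy calculation. An app can only flag a contact whose infection has been confirmed by a test, so if only a fraction of infected people ever get tested, many exposures go unreported. The probability model and the numbers below are illustrative assumptions, not epidemiological data.

```python
def alert_probability(p_tested: float, k_contacts: int) -> float:
    """Chance that at least one of k infected contacts has been tested
    (and can therefore trigger an alert), assuming each contact is
    tested independently with probability p_tested."""
    return 1 - (1 - p_tested) ** k_contacts

# With 5 infected contacts and a 10% testing rate, the chance of
# receiving NO warning at all is still about 59%.
print(round(1 - alert_probability(0.10, 5), 2))  # 0.59
```

Even this generous toy model ignores false negatives in the tests themselves, which would push the miss rate higher still.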

Whilst many of us are longing for something to give us enough security to venture out of our lockdown, launching these apps prematurely, relative to testing scope, could be outright dangerous. In addition, without the necessary systemic mechanisms in place to check, monitor, and rectify unintended consequences of the use of these apps, the risk of unethical outcomes will be too high. This blog on lessons learnt from the use of technology in the 2014 Ebola outbreak names some of the most important: bias, illegality, privacy invasion, and ineffectiveness, all of which come at the cost of human life.

What to Do?

What then could be a viable, safer path forward?

  1. Governments must invest in the infrastructure and mechanisms needed to verifiably test and retest a much larger percentage of citizens. This includes financially supporting the production of proven tests, or the import of said tests.
  2. App developers must be very clear in their messaging and risk warnings: download now but don’t necessarily trust the app. Ideally, they would hold back the launch of their apps until a certain testing threshold has been met.
  3. We will need to realize that physical distancing and/or self-isolation is not over and that precautions will need to continue for a very long time. Indeed, Harvard researchers predict that some physical distancing will be required well into 2022.
  4. The use of these apps must be voluntary and never entail sanctions against people unwilling to use them.
  5. We need to hold app developers and deployers, as well as our governments, accountable, and push for transparent and ethical adoption and implementation of these new apps. We must demand that the data is ephemeral (transitory, existing only briefly), that the apps themselves are temporary, and that the systems are governed by a broad group of representatives from all walks of life.
  6. Even with the above measures in place, we need to consider that many citizens need to download and use the app for it to offer sound and reliable results. Many vulnerable groups (for example, the elderly), however, do not use mobile phones. How do we protect these groups without confining them to their homes in isolation for months to come?
  7. Who controls and has access to the data is of primary importance. Put differently, organizations/authorities who have control over and access to the data also have control over the narrative, and can tell the rest of us what the state of affairs is. They could hide certain truths, exaggerate others. All data interpretation relies on who is doing the interpreting. This is why we need governance mechanisms but also data access rights for different stakeholders.
  8. We must have a centralized mechanism whereby we can safely report privacy breaches and adverse uses of the apps.

If This Is About the Economy, Involve Businesses and Unions/Workers

It is easy to assume that governments are supporting and hastily launching contact-tracing apps as a means to open up businesses again and get workers back to work. If this assumption holds true, it will be vital to include employers and unions in the planning, rollout, and deployment of these apps. In this regard, the following conditions, in addition to the ones above, should immediately be considered:

  1. That governments establish a tripartite body aimed at evaluating the apps’ risks and implications on return to work, including on workers’ physical and mental health.
  2. That any aggregated data be made available to employers and unions for review and interpretation.
  3. That whistleblowing systems are established through which persons can safely report misuse or abuse of the apps’ intentions.
  4. That sanctions are established for breaches of these conditions.

In workplaces, the following conditions must be applied:

  1. To keep the use of location-tracing apps voluntary, employers must be prohibited from demanding that a worker download and use the app as a precondition of return to work. This prohibition includes seeking “informed consent” from workers individually and/or collectively.
  2. That no worker should be forced to hand over app data to their employer as a form of monitoring.
  3. That workers have a right to share app data with their union for evaluation and risk assessment.
  4. That all existing health and safety rules and agreements are followed.

Whilst time is of the essence, so is caution. This crisis must not lead to the watering down of human rights and workers’ rights in favor of a quick-fix, app-based solution built on a very weak foundation. We need proper sunset clauses, transparency and auditability arrangements, and the institutional frameworks and infrastructure to deal with all of this. But first and foremost, we need much more testing.


This article reflects the personal points of view of the authors. This is part of our ongoing series on the coronavirus and its impact.