Privacy norms and the pandemic

Will things like digital contact tracing leave a legacy of better privacy norms, or worse ones?

The conversation about privacy and the pandemic — and about the idea of digital contact tracing in particular — has shifted a great deal in the last few weeks. It’s moved from an understandable ‘we don’t want to live in this dystopian science fiction novel’ gut response (my initial gut reaction) to a vigorous debate about whether privacy-by-design and good data governance make it possible to trace COVID contacts in a way that we can all trust (I’m still trying to sort through all this). A recent tweet from @hackylawyER summed up the state of the conversation nicely:

I’m not happy things are headed this way but if they are, we’d damn well better have safeguards.

Watching all of this, I’ve tried to step back and ask myself: what exactly are we worried about? And, amidst the rush to tech solutions, is this all downside for privacy and data governance, or is there an upside? Could we actually use this moment to set new and better norms for privacy?

What are we worried about?

The gut reaction worries are obvious: tracking everyone with a smartphone (or at least the bulk of people who have one) to monitor COVID exposure could go wrong in myriad ways if done poorly or if the data falls into the wrong hands. On top of that, many have called into question whether this kind of tracking is even effective at stopping the spread of the virus. There is a legitimate worry that we could quickly find ourselves inside a huge mass surveillance experiment with limited or no return in terms of public health and safety. Questions about efficacy and privacy are step one in considering whether or not to roll out contact tracing. While it’s far from universal, a fair number of governments are digging into these questions in earnest.

There is also a longer-term, potentially more serious worry that seems to be getting less consideration: that governments, tech platforms and telcos working together to track citizens at scale becomes normalized in democracies, and that this kind of surveillance gets used for reasons other than tackling the pandemic. Earlier this week, an open letter on digital contact tracing by scientists and researchers from 26 countries noted that:

… some “solutions” to the crisis may, via mission creep, result in systems which would allow unprecedented surveillance of society at large.

Centralized approaches like BlueTrace in Singapore include the collection of significant information about citizens and their contacts. And there are indications that some governments are pressuring Google and Apple to modify their decentralized approach to provide health officials with more information. As we learned from the Snowden revelations, the in-the-moment desire to collect information about citizens during a crisis can lead to a systemic invasion of privacy that lasts decades.

A few weeks ago, I was worried that this was where we were headed: increased government and tech company surveillance as the new norm. The rallying of the privacy and data governance communities has me cautiously hopeful that we could go in the opposite direction. There may be a chance to use this moment to set new and better norms for privacy.

Embracing privacy-by-design

One source of this hope has been the rapid momentum that has grown behind decentralized, Bluetooth-based approaches to contact tracing. This general approach was initially proposed by academic groups like DP3T in Europe and PACT in the US, and was picked up by Apple and Google as something they could roll out across all their smartphones. The idea is to use Bluetooth to collect contact data locally on phones, leaving it there (and private) unless a person tests positive for COVID. In that case, a set of ‘beacons’ informs possible contacts that they may want to self-isolate and get tested. Governments don’t get access to any of the contact data, striking a balance between public health and privacy. This comic explains the concept better than I can.
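For the more code-minded, the flow above can also be sketched in a few lines. This is a toy model only, and everything in it — the class, the method names, the use of plain random identifiers — is illustrative rather than drawn from any real protocol; the actual schemes (DP3T, the Apple/Google exposure notification system) derive rolling identifiers cryptographically, rotate them every few minutes, and account for distance and duration of contact.

```python
import secrets

class Phone:
    """Toy sketch of a phone in a decentralized, Bluetooth-based scheme."""

    def __init__(self):
        self.my_beacons = []      # random identifiers this phone has broadcast
        self.heard_beacons = []   # identifiers heard nearby, stored only locally

    def broadcast(self):
        # Periodically emit a fresh random identifier, unlinkable to the owner.
        beacon = secrets.token_hex(16)
        self.my_beacons.append(beacon)
        return beacon

    def record_contact(self, beacon):
        # Contact data never leaves the device; no server sees who met whom.
        self.heard_beacons.append(beacon)

    def report_positive(self):
        # On a positive test, only the phone's *own* beacons are published.
        return list(self.my_beacons)

    def check_exposure(self, published_beacons):
        # Matching happens on-device, so the health authority learns nothing
        # about anyone's contact graph.
        published = set(published_beacons)
        return any(b in published for b in self.heard_beacons)


# Alice and Bob cross paths; Carol never meets either of them.
alice, bob, carol = Phone(), Phone(), Phone()
bob.record_contact(alice.broadcast())   # Bob's phone hears Alice's beacon
published = alice.report_positive()     # Alice publishes only her own beacons
print(bob.check_exposure(published))    # True  -- Bob gets an exposure alert
print(carol.check_exposure(published))  # False -- Carol does not
```

The key property the sketch illustrates is the one the post describes: the only thing ever uploaded is a positive person’s own random identifiers, and all matching against recorded contacts happens locally on each phone.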

The hopeful piece here is not just the decentralized approach itself — it has pros and cons — but more importantly the quick embrace of privacy-by-design by governments, tech platforms and academics. As a Chaos Computer Club blog post notes:

In principle, the concept of a “Corona App” involves an enormous risk due to the contact and health data that may be collected. At the same time, there is a chance for “privacy-by-design” concepts and technologies that have been developed by the crypto and privacy community over the last decades. With the help of these technologies, it is possible to unfold the epidemiological potential of contact tracing without creating a privacy disaster.

As the post notes, the idea that privacy should be a foundational part of any digital product and service design has been around for decades — but it has been an uphill battle to make this way of thinking mainstream. The current setting may offer a chance for this way of thinking to make a leap forward, and to nudge governments and tech companies towards the idea that privacy-by-design should be the norm. 

Good ideas for governing tech and data

While receiving less attention, there has also been a wave of constructive work on how to govern contact tracing efforts. As a leading proponent of decentralized contact tracing said in a recent tweet:

Contact tracing apps, even private ones like #DP3T, need more than technical safeguards. @lilianedwards has been leading on our effort to draft a Coronavirus (Safeguards) Bill for the UK Parliament — which limits what these apps can be used for in practice.

Ensuring digital privacy is not only a matter of technology, but also a matter of rules, policy, oversight and stewardship. Getting privacy right — and making sure we don’t slide into the ‘unprecedented surveillance of society at large’ that the recent open letter from scientists warns of — will require us to develop smart approaches to governing any technology we put in place to tackle the pandemic. Unfortunately, smart data governance is even less commonplace than privacy-by-design.

The good news is that thoughtful and practical data governance proposals are quickly emerging. For example, the draft Coronavirus (Safeguards) Bill mentioned above would place strict purpose, access and time constraints on any technology that was rolled out to manage the pandemic. It also addresses topics related to inclusion, ensuring that no one is penalized for not having a phone. Others have called for the creation of independent ‘data trusts’ or trust-like mechanisms to ensure the interests of citizens are represented in the design of tracking technology and the handling of data. Like the decentralized technology approaches outlined above, these data governance proposals would allow us to meet both privacy and public health goals if governments are motivated to listen. 

We can make good decisions now, or bad ones

It’s heartening to see how engineers, lawyers and activists who have long championed privacy have stepped up in creative and constructive ways to answer the question: if we end up building this stuff, how do we make sure it has the right guardrails? 

Much of the thinking and evidence from this work has been summarized in the Exit Through the App Store report that the Ada Lovelace Institute released earlier this week. The report shows that we have the technical and policy tools we need to make good decisions about technologies like digital contact tracing (e.g. use a decentralized tech approach). It also points out that we could easily rush into this and make bad decisions (e.g. use a centralized approach).

The design decisions we make will have a huge impact on whether we move into an era where privacy-by-design and good data governance are the norm, or end up laying the data-gathering foundations for the dystopian science fiction future that many of us imagined when we first heard the term ‘contact tracing’ a few short weeks back.

It’s up to us — and, in particular, our governments — to decide which way we go.
