The UK’s contact tracing app fiasco is a master class in mismanagement
There are advantages to being one of the world’s largest single-payer health-care systems. For the UK’s National Health Service, the NHS, big data is increasingly one of them.
Its Recovery Trial, launched early in the coronavirus outbreak to collect information from across the system, has led to the discovery of dexamethasone as one of the most promising life-saving treatments for the novel coronavirus. In other areas of medicine, its cancer data store, now nearly a decade old, is one of the world’s richest sources of clinical data for cancer research.
So it was hardly surprising that when UK ministers proposed a contact tracing smartphone app, NHS officials saw an opportunity to create a world-leading piece of technology.
But on Thursday the British government announced that it was ditching its original plan in favor of a much simpler backup option—drawing criticism and anger, and leaving many concerned about the prospect of contact tracing technology in general. What happened?
Big data, big ideas
Digital contact tracing—phone-to-phone notifications that can alert users to potential exposure to disease—is a new technology, and the usefulness of such apps in assisting track-and-trace efforts is largely untested. But perhaps if the app could also collect information to help track the virus in other ways—looking for patterns in the way the disease spreads, identifying clusters, finding outbreaks early, or even adding demographic and other data—then its potential could be dramatically increased.
This is what motivated officials and developers within the NHS to advocate a centralized model for their app. They believed the app could gather the contact information it collected into a protected data store, which could be de-anonymized so that people could be alerted if they had come into contact with someone who had presented coronavirus symptoms or received a positive test result.
The centralized approach would allow much more data analysis than decentralized models, which give users exposure notifications but grant officials far less access to data. Those models, such as the one proposed by Google and Apple that the NHS is now adopting, are far less invasive of privacy. The hope is that those privacy protections increase trust in the app, leading more people to use it.
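The core of the decentralized design is that exposure matching happens on the phone itself, not on a government server. The sketch below illustrates that flow in Python; it is a toy model, not the actual Google-Apple protocol, which derives rotating identifiers cryptographically from daily keys rather than using plain random tokens as here.

```python
import secrets

class Phone:
    """Toy model of a decentralized exposure-notification client.

    Real implementations derive rotating identifiers from daily keys;
    here each phone simply broadcasts random tokens.
    """

    def __init__(self):
        self.own_tokens = []    # identifiers this phone has broadcast
        self.heard_tokens = []  # identifiers heard from nearby phones

    def broadcast(self):
        # Emit a fresh random identifier (rotated frequently in practice).
        token = secrets.token_hex(16)
        self.own_tokens.append(token)
        return token

    def listen(self, token):
        # Identifiers observed over Bluetooth are stored locally only.
        self.heard_tokens.append(token)

    def check_exposure(self, published_infected_tokens):
        # Matching happens on-device: the server only distributes tokens
        # from users who reported positive; it never sees who met whom.
        infected = set(published_infected_tokens)
        return any(t in infected for t in self.heard_tokens)

# Two phones pass near each other.
alice, bob = Phone(), Phone()
bob.listen(alice.broadcast())

# Alice later tests positive and publishes her broadcast tokens.
infected_tokens = list(alice.own_tokens)

print(bob.check_exposure(infected_tokens))  # True: Bob was near Alice
print(alice.check_exposure([]))             # False: no published matches
```

In a centralized model, by contrast, the contents of `heard_tokens` would be uploaded to a central data store, which is what gives health authorities analytical power at the cost of privacy.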
There were other factors that led the UK toward developing a centralized app: its limited testing apparatus and relatively small number of human contact tracers meant that the system might be quickly overwhelmed if it was alerted to every notification of a potential positive case—while a centralized model based on confirmed cases rather than suspected ones was more in line with capacity.
Meanwhile, officials were looking for glory (and even knighthoods), and ministers were focused on rolling out a “world-beating” app, rather than just a successful one, so that they could claim victory on the world stage. The momentum toward a centralized system became unstoppable—and the challenges of building one were largely brushed aside.
Technical trouble—and organizational chaos
Among the many technical obstacles has been the performance of Bluetooth. Nearly all contact tracing apps rely on a phone’s Bluetooth function to track who has been in proximity to whom. In theory, if it is running constantly, this can be very accurate, providing reliable results without flooding the health-care system with false positives that could undermine confidence, necessitate thousands of extra tests, and force people to self-isolate needlessly. But in practice, getting accurate results is difficult, and improving their quality has required substantial extra work from app designers around the world.
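Part of the difficulty is physics: apps infer distance from received signal strength (RSSI), which varies with phone model, antenna orientation, and obstacles. A toy illustration using the standard log-distance path-loss model shows how the same reading maps to very different distances depending on assumptions; the constants here are illustrative, not measured values from any real app.

```python
def estimate_distance(rssi_dbm, tx_power_dbm=-59, path_loss_exponent=2.0):
    """Estimate distance (meters) from a Bluetooth RSSI reading.

    tx_power_dbm is the expected RSSI at 1 meter; path_loss_exponent
    describes how quickly the signal decays (about 2.0 in free space,
    higher indoors). Both values vary by device and environment, which
    is why raw RSSI is such a noisy proxy for proximity.
    """
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10 * path_loss_exponent))

# The same reading yields different estimates under different assumptions:
print(round(estimate_distance(-59), 2))   # 1.0 m at the calibration point
print(round(estimate_distance(-79), 2))   # 10.0 m assuming free space
print(round(estimate_distance(-79, path_loss_exponent=3.0), 2))  # 4.64 m indoors
```

A 20 dB drop could mean a contact 10 meters away in open air or under 5 meters away indoors, which is exactly the ambiguity that makes false positives, and false negatives, hard to avoid.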
These systems are being honed and improved, but the UK’s early approach also ignored another important fact: Apple and Google had an existing policy to protect users’ privacy by specifically blocking apps from constantly running Bluetooth scans and sending the data somewhere else—and they were refusing to change the policy for coronavirus apps. Instead, the tech giants were creating their own toolkit to help decentralized apps do something similar, without handing over user data to a central authority.
That left the UK trying to persuade the world’s most powerful technology companies to make it the exception, or else building an app that deliberately circumvented the protections Apple and Google had designed and simply hoping the companies didn’t close whatever backdoors its developers exploited.
Progress on the UK app actually went better than some skeptics thought it would: developers found tricks that helped it sort of work, at least on Android phones. But “sort of” isn’t good enough for a tool intended for widespread deployment during a global health crisis.
So more than a month ago, the UK government quietly commissioned a team to start developing a second app that used the decentralized model. The two competing systems were developed in tandem, at substantial cost.
This coincided with a chaotic series of reorganizations in top management of the UK’s broader track-and-trace efforts. New bosses came in, and the agencies responsible for different parts of the effort were swapped around, all of which left the broader tracing program confused and disconnected: at various points the scripts given to contact tracers didn’t even match those in the apps.
This week, the government made public what was already apparent behind the scenes: the UK would give up on its centralized app in favor of the decentralized backup. It had quietly awarded the operational contract a week earlier.
The UK is no longer trying to be “world-beating”: the aim now is to produce an app with similar functionality to those attempted by other countries.
Given its late arrival and the fact that other countries have had mixed success with their own decentralized tracing apps, it is not clear whether the new system will ever play anything more than a peripheral role in the UK response to the coronavirus.
Costly, confusing mistakes
So what can we learn from all this?
First, that media coverage of the UK’s efforts has often been confused, which in turn muddies attempts to understand what went wrong. Multiple reports have said the UK will now use an “app” developed by Google and Apple, confusing a toolkit for developers with a fully formed app, and therefore missing that the UK already has a decentralized app well under way.
Social media has been more preoccupied with the role of Dominic Cummings, the prime minister’s deeply divisive chief advisor, who has taken a strong interest in the use of data in politics since deploying it to engineer the successful pro-Brexit Vote Leave campaign.
Multiple viral tweets claim the UK approach was a corporate bid to grab data, whereas those close to the app’s actual development say it was a sincere attempt to use the NHS’s strong track record in data to make the app more useful.
Other critics see the centralized app’s failure in terms of a clash between the UK government and the tech giants—which the tech giants won. Such showdowns are coming, but it’s not clear this was one of them. The UK made no effort to legally compel action from Google or Apple: it asked them to voluntarily weaken their privacy protections. The tech giants, seeing numerous other countries happy to take a decentralized approach, decided to hold their ground. The UK eventually decided the fight wasn’t worth the effort.
These problems are mainly specific to the UK’s situation, which makes it difficult to draw larger conclusions from the precise failures of the NHS approach. But there are still lessons in this failure, even if they are more mundane.
The team focused on the potential upsides of a centralized app and initially disregarded all the extra challenges it involved. Outside concerns, many aired publicly, were ignored. The project was then managed chaotically and became the subject of bureaucratic tussles. The result was overspending, wasted effort, and, worse, wasted time.
The stakes for indecision and error are extremely high, especially given that Britain is one of the world’s worst-hit countries, with more than 40,000 confirmed deaths from covid-19 so far. Whether or not the fate of the original plan counts as a strike against digital contact tracing in general, it is clear that the lack of careful, clear communication from the UK authorities has damaged the potential of whatever technologies are now put in place.
The only consolation is that there is an alternative, which means the situation is not quite the fiasco it could have been. The government could have decided to continue with its problematic, partially usable prototype, pushing it out to the entire nation despite the many obstacles and concerns. Instead, the NHS saw where things were heading and started developing a plan B; it didn’t try to roll out the centralized app nationally when it underperformed in its trial.
The headlines today, predictably and deservedly, are terrible for the UK government. It could still have been much worse.
James Ball is global editor at the Bureau of Investigative Journalism and author of Post-Truth and Bluffocracy. His next book, The System: Who Owns the Internet and How It Owns Us, will be published in August 2020.