The online vigilantes solving local crimes themselves
Online groups of neighbors love gossiping about local misdeeds. But are they helping or hurting?
One evening last summer, my family was enjoying a picnic in the park near our house in London when two dogs attacked our blind 15-year-old Jack Russell terrier, Zoey. They pounced on her, locking their jaws. As my husband threw himself on the dogs, I begged the owner to intervene. He refused—until he realized I was calling the police. Only then did he restrain his animals, one of which had started to chase my four-year-old daughter. A few hours later Zoey was dead, leaving us devastated.
We felt even worse when the police didn’t attempt to track down the owner of the killer dogs, despite having images from my phone to go on. In the eyes of the UK justice system, Zoey’s killing was a low-level crime because an animal, rather than a human, had died. The realization galvanized us: if the police wouldn’t find the culprits, then we would.
Communities are increasingly turning to technology to help solve problems that the police are unable—or unwilling—to attend to. So that’s what I did: I went online, joining the growing number of people who use local networks to solve crimes that have affected them, such as robberies, reckless driving, and even plant theft.
One in 10 posts on the neighbor-networking site Nextdoor is related to crime and policing matters. I had nearly 800 neighbors on that platform and was also in several neighborhood groups on Facebook, whose members totaled 74,000. In all, my description of the attack on Zoey was shared hundreds of times. By circulating information about it, my neighbors and I were participating in a ritual that is modern only in terms of the technology it now relies on.
In the UK, as in other places, collective action is filling a gap left by a diminishing police presence. A significant reason is that budget cuts have forced a decline of nearly 23% in the police workforce, according to Unison, the country’s largest union. London’s Metropolitan Police has been the worst affected, with over 3,000 jobs lost between 2012 and 2016. Among the casualties nationwide are 3,350 jobs for community support officers—a role created specifically to make the police more visible. These officers had been brought in to work with the community, says Menaal Munshey, a criminologist with the United Nations. “But because of the cuts, that link has been broken. And the community feels like it’s on its own.”
That frustration is likely why anonymous tipsters opted to reach out to me, a complete stranger, rather than go to the police. Previous appeals to the police had apparently fallen on deaf ears.
Of course, such information sharing isn’t always a good thing. A study published last year by Dutch academics Ronald van Steden and Shanna Mehlbaum confirmed what many had already observed: neighborhood groups have “undesirable social and moral by-products” such as discrimination, stigmatization, exclusion of strangers, and excessive social control. “If people are constantly encouraged to be aware of anything and anyone ‘out-of-the-ordinary’, such a process may slowly but surely open the doors for harsh surveillance practices to creep into people’s normal lives. This, in turn, stimulates the erection of a digital pillory, a witch-hunt for (assumed) paedophiles, exclusive forms of ‘stranger danger’ and other potential for voyeuristic mob activism. It is not difficult to recognise that democratic values of openness, tolerance and mutual respect are at stake here.”
The problems, when they do arise, aren’t confined to any one neighborhood, city, or country, and they are often in response to cultural fissures that were already present. In India, for example, unfounded rumors that circulated on local WhatsApp groups in 2018 fed old fears of a specific kind of bogeyman that Indians have grown up hearing about: a bacha chor, a person who kidnaps children to harvest their organs. The rumors led to the murder of at least two dozen people in different parts of the country and forced WhatsApp to limit the number of times that users in India could forward a message. The challenge for Nextdoor in the United States, meanwhile, is that it has become a magnet for racial profiling. In 2015, Nextdoor addressed the issue with changes that included asking users who mentioned race in their posts to provide additional details.
The reality is that the root causes of these crimes continue to go unaddressed. It isn’t just police resources that need to be reassessed, but social welfare programs such as gang mediation, drug and alcohol treatment, and children’s services, all of which have also fallen victim to governmental service cuts. And while neighborhood groups can have a positive impact on social cohesion, there is no proof that they actually reduce crime. Danielle Pyke, a police community support officer with the Met, says it’s rare for Nextdoor users to provide information that leads to patrols, arrests, or drug busts.
When online groups do work, it is by mobilizing people to share information. Thanks to the information I gathered with the help of my community, my family and I were able to submit a dossier to the police, forcing them to act. The owner of the animals that mauled Zoey was charged with two counts of owning dangerously out-of-control dogs that had caused injuries, and he was ordered to appear in court in April.
Unfortunately, on the evening before the case went to trial, we received a call from the prosecuting lawyer informing us that she had no choice but to close the case because the police had failed to submit the required paperwork. The Met’s failure to do basic admin, despite having had several months, denied my family our day in court and Zoey the justice she deserved.