Ann Reardon is probably the last person whose content you’d expect to be banned from YouTube. A former Australian youth worker and a mother of three, she has her own cookbook, has baked for the BBC, and once made a coin-size apple pie for two baby chicks. Since 2011 she’s been using her YouTube channel to show millions of loyal subscribers how to bake and decorate elaborate cakes.
But on July 1, Reardon woke to an email from YouTube that said her latest video had been removed. “Our team has reviewed your content and unfortunately we think it violates our harmful and dangerous policy,” it read.
The removal email was referring to a video that was not Reardon’s typical sugar-paste fare.
Instead, “Debunking DEADLIEST craft hack, 34 dead,” was the latest in an offshoot series on Reardon’s channel: since 2018, she has used her platform to warn viewers about dangerous new “craft hacks” that are sweeping YouTube. The baker has uploaded 28 videos tackling unsafe activities such as poaching eggs in a microwave, bleaching strawberries, and using a Coke can and a flame to pop popcorn.
On this occasion, she had been caught up in the inconsistent and messy moderation policies that have long plagued YouTube. Reardon’s video exposed a failing in the system: How can a warning about harmful hacks be deemed dangerous when the hack videos themselves are not? On paper, the platform bans videos that encourage “dangerous or illegal activities that risk serious physical harm or death.” But does it do so in practice?
In her 14-minute video, Reardon warned her viewers against a crafting technique that can be deadly if it goes wrong. “Fractal wood burning” involves shooting a high-voltage electrical current across dampened wood to burn a twisting, turning branch-like pattern in its surface. On its website, the American Association of Woodturners says it poses “significant hidden risk of electrocution.” In the video Reardon complained that multiple tutorials on fractal wood burning were available on YouTube, “including how-to videos that show you how to make your own fractal wood burning device using parts from an old microwave.”
Reardon doesn’t just want to raise awareness—she wants YouTube to change. “If I had my way, YouTube would make a policy against dangerous hacks and dangerous how-to videos,” she says at the end of one recent video. “They’ve got one against dangerous pranks and dangerous challenges—why isn’t there one for dangerous hacks?”
YouTube told MIT Technology Review that it re-reviewed and reinstated Reardon’s video soon after she appealed the ban; the video was back up by July 2. The company said it frequently reinstates videos that are mistakenly removed. Yet it is unclear why Reardon’s video fell foul of YouTube’s dangerous-content policies while the wood-burning videos she warned against remained available to watch.
When MIT Technology Review first approached YouTube about the issue, there were more than 3,000 Google search results for fractal wood burning videos on YouTube. Now there are just over 1,000. A YouTube spokesperson said: “Under our Harmful or Dangerous Content policies, we prohibit content that encourages dangerous or illegal activities that risk serious physical harm or death. Upon review, we removed a number of videos and applied appropriate age restrictions to content that is not suitable for all viewers.”
The egg was bigger than before. On July 25, 2019, a Twitter user clipped and shared an unusual video watermarked with the words “5-Minute Crafts.” In the 55-second clip, an egg was placed in a wine glass full of vinegar, and a caption instructed: “Wait one day.” The egg emerged yellow and bouncy, and a caption declared: “Bigger than before.”
The bouncy egg was placed in a glass of maple syrup. Then it was placed in water that had been dyed blue. Once again the captions said, “Wait one day”—followed by: “Bigger than before.”
The baffling clip went viral, earning 72,000 likes on Twitter and coverage in New York magazine. With that, the wider world was alerted to the existence of 5-Minute Crafts, a six-year-old YouTube channel that has now accumulated 24 billion total views. Almost every 5-Minute Crafts video is as bizarre and nonsensical as the egg that was bigger than before. The channel shows people putting contact lenses in with cotton buds, peeling apples with a drill, crafting makeshift soldering irons out of lighters, and applying toothpaste to burns (Colgate’s official website warns against the practice).
Reardon had first learned about 5-Minute Crafts a year earlier, when her viewing figures fell sharply in the wake of an algorithm change at YouTube. Popular YouTubers are often allocated a “partner manager” at the company who offers one-on-one support; Reardon reached out to hers to express her concern at her declining numbers.
“He suggested having a look at some of the channels that were doing well under the new algorithm,” Reardon says. “And that’s when I realized: hang on a minute, you can’t do some of these recipes. They’re not real recipes; they’re fake.” In December 2018, Reardon uploaded a video testing out baking hacks from the food hack channel So Yummy and demonstrated that despite the channel’s claims, you cannot whip ice cream and sugar into cake frosting or melt gummy bears into jelly. In July 2019, she criticized the YouTube channel Blossom for posting similar misinformation.
“I got comments from young kids going, ‘I thought that I couldn’t cook. I tried that video and it didn’t work and Mum said I can’t cook now because I’ve wasted ingredients,’” Reardon says. She decided to use her undergraduate degree in food science and her postgraduate degree in dietetics to begin debunking more clips—but she quickly realized that many hacks weren’t just fake but actually dangerous.
In May 2019, Reardon released a video about 5-Minute Crafts. She gasped at an actor putting hot glue on a toothbrush and tested out a recipe for “gritty” activated charcoal ice cream, but she dedicated a large chunk of video to a clip in which strawberries were added to bleach. “If some kids actually make this at home and eat these white strawberries, that’s going to poison them,” she said, before asking her viewers to report the video (the clip has since been removed from the 5-Minute Crafts channel).
Blossom and So Yummy did not respond to a request for comment. Technology Review sent TheSoul Publishing, the company behind 5-Minute Crafts, a list of concerning videos on its channel, including a tutorial on spinning molten sugar into cotton candy with an electric drill; a tutorial on making a glue gun out of a sliced soda can and a lighter; and a video in which a mysterious hand lights antibacterial gel on fire before swiping fingers through it.
Patrik Wilkens, VP of operations at TheSoul Publishing, said the company “produces enjoyable, positive, and original content that is not intended to be a resource for fact-finding, but rather a source of entertainment.” YouTube said it would review the 5-Minute Craft videos flagged by Technology Review.
A warning in the description of every 5-Minute Crafts upload reads: “The following video might feature activity performed by our actors within controlled [sic] environment—please use judgment, care, and precaution if you plan to replicate.”
Wilkens said TheSoul Publishing has a “quality assurance” team who review every video throughout its production, “and we adhere to the policies of the platforms where our videos appear.” He added, “Additionally, on a daily basis, we monitor and collect feedback from audiences and partners, making necessary changes and improvements.”
On September 5, 2019, a Chinese teenager died after allegedly attempting to copy a viral hack video. The video, uploaded by cooking influencer Ms Yeah, taught viewers how to pop popcorn inside a soda can placed above an alcohol lamp. The family of a 14-year-old identified only as Zhezhe said she and her 12-year-old friend Xiaoyu were trying to follow the video instructions when the can exploded. Both girls were severely burned, and Zhezhe died from her injuries.
Ms Yeah, whose real name is Zhou Xiao Hui, paid the families an undisclosed amount of compensation but denied that the girls were copying her video, as they had reportedly heated up alcohol directly inside two cans. “I used only one tin can and an alcohol lamp, which is safer,” she wrote on Weibo. She added that her videos are not meant to be instructional. The Ms Yeah YouTube channel has 11.7 million subscribers who watch Zhou cook in unusual ways, often with office equipment. She has barbecued meat on a filing cabinet, spun cotton candy on an electric drill, and fried food inside an oil-filled coffee pot. Ms Yeah did not respond to a request for comment.
Beyond that incident, Reardon has also shed light on egg-poaching hacks that have injured a number of people. There are tens of thousands of YouTube videos about poaching eggs in the microwave, many of which are user generated. Microwaving eggs can cause them to explode, and researchers have found that microwaved yolks are an average of 22 °F hotter than microwaved water. In the last three years, multiple people in the UK have burned themselves attempting the hack.
Deaths or serious injuries from craft and cooking hacks are still relatively rare. But fractal wood burning is different.
Reardon first became aware of fractal wood burning after a Wisconsin couple died attempting the craft this April. But the practice has been popular for a number of years. The American Association of Woodturners has counted 33 US deaths from fractal wood burning since 2016, but the total is likely higher, because the organization only counts deaths that make the news. A 2020 paper by doctors from a burn hospital in Oregon found a 71% mortality rate after accidents involving fractal wood burning; the paper’s authors called this rate “stunningly high.”
In May 2020, Matt Schmidt, a construction worker, was electrocuted trying fractal wood burning in his garage. His wife, Caitlin Schmidt, then a nurse, was at work, and her oldest son was the one to find his father’s body.
“The problem is that literally anybody can watch these videos—kids, adults, it doesn’t matter,” she says. Matt first saw a fractal wood burning video shared by a friend on Facebook and was so intrigued that “he started watching YouTube videos on it—and they’re endless.”
Matt was electrocuted when a piece of the casing around the jumper cables he was using came loose and his palm touched metal. “I truly believe if my husband had been fully aware [of the dangers], he wouldn’t have been doing it,” Schmidt says. Her plea is simple: “When you’re dealing with something that has the capability of killing somebody, there should always be a warning … YouTube needs to do a better job, and I know that they can, because they censor all types of people.”
After Matt’s death, medical professionals from the University of Wisconsin wrote a paper entitled “Shocked Through the Heart and YouTube Is to Blame.” Citing Matt’s death and four fractal wood burning injuries they’d personally treated, they asked that “a warning label be inserted before users can access video content” on the crafting technique. “While it is not possible, or even desirable, to flag every video depicting a potentially risky activity,” they wrote, “it seems practical to apply a warning label to videos that could lead to instantaneous death when imitated.”
Matt and Caitlin Schmidt had been best friends since they were 12 years old. He leaves behind three children. Schmidt says that her family has suffered “pain, loss and devastation” and will carry lifelong grief. “We are now the cautionary tale,” she says, “and I wish on everything in my life that we weren’t.”
YouTube told MIT Technology Review its community guidelines prohibit content that’s intended to encourage dangerous activities or has an inherent risk of physical harm. Warnings and age restrictions are applied to graphic videos, and a combination of technology and human staff enforces the company’s guidelines. Dangerous videos banned by YouTube include challenges that pose an imminent risk of injury, pranks that cause emotional distress, drug use, the glorification of violent tragedies, and instructions on how to kill or harm. However, videos can depict dangerous acts if they contain sufficient educational, documentary, scientific, or artistic context.
YouTube removed “a number” of fractal wood burning videos and age-restricted others when approached by MIT Technology Review. But the company did not say why it moderates against pranks and challenges but not hacks.
It would certainly be challenging to do so—each 5-Minute Crafts video contains numerous crafts, one after the other, many of which are simply bizarre but not harmful. And the ambiguity in hack videos—an ambiguity that is not present in challenge videos—can be difficult for human moderators to judge, let alone AI. In September 2020, YouTube reinstated human moderators who had been “put offline” during the pandemic after determining that its AI had been overzealous, doubling the number of incorrect takedowns between April and June.
When a YouTube video is age-restricted for portraying dangerous or illegal activities, the video may—according to Google’s Support pages—“have limited or no ads monetisation.” 5-Minute Crafts is currently the 13th most subscribed channel on YouTube; every week, the channel gains around 30 million more views. Ms Yeah has 11.7 million subscribers and nets a similar number of weekly views.
Shocking or questionable videos are a surefire way to attract eyeballs and generate profit on YouTube. When a video is bizarre, it’s harder to click away; when it’s outrageous, you voice your outrage in the comment section. According to Social Blade, a site that tracks social media analytics, the 5-Minute Crafts channel makes anywhere between £360,000 and £5.8 million a year.
TheSoul Publishing, which has more than 1 billion subscribers across all its channels, said that as a private company it would not disclose how much it makes from its craft hack videos. Wilkens denied that the company deliberately creates shocking and questionable videos, saying: “This is not now, nor has it ever been, a part of TheSoul Publishing’s business model. As a leading digital creator, we strive to make content that is the most appealing to the most people—which is the goal of mostly every content producer, advertiser, streaming service, and movie studio.”
Bertie Vidgen, head of the Online Harms Observatory at the Alan Turing Institute, says it is “shocking” that YouTube has not put warnings on fractal wood burning videos. “If people have died from trying to do this, then that’s almost beyond question—there clearly is a risk of harm,” he says.
A fractal wood burning YouTube Short (a video less than 60 seconds long) with 21 million views remains up. In it, a pair of gloved hands brush water and baking soda onto some wood before attaching clamps and wires to two nails. The wood begins to burn. YouTube has removed the full video that the Short links to, but the Short carries no warnings or disclaimers. The comment section, though, is full of warnings, some of which start with the words “I came here after watching Ann Reardon’s video.”
“I think there needs to be something put in place that is a clear warning to people,” Reardon says. Since starting her debunking series, she’s received thank-you emails from parents who’ve shown her videos to their kids, and even from kids who’ve shown her videos to their parents. “I feel like if nothing changes,” she says, “then it’s important to raise awareness.”