Is Facebook Targeting Ads at Sad Teens?
Facebook appears to be using its considerable cache of user data to single out teens—including those who are feeling down—in an attempt to sell ads that target them.
According to Ars Technica, The Australian on Monday obtained a leaked document outlining the social network’s sales pitch to potential advertisers.
According to the report, the selling point of this 2017 document is that Facebook’s algorithms can determine, and allow advertisers to pinpoint, “moments when young people need a confidence boost.” If that phrase isn’t clear enough, Facebook’s document offers a litany of teen emotional states that the company claims it can estimate based on how teens use the service, including “worthless,” “insecure,” “defeated,” “anxious,” “silly,” “useless,” “stupid,” “overwhelmed,” “stressed,” and “a failure.”
The document suggests that ads have been targeted at users as young as 14 living in Australia and New Zealand. Facebook did not comment when asked by The Australian whether it employs similar tactics in other countries.
At first, the company issued something of an apology, describing the incident as a “process failure” and raising the possibility of “disciplinary” action. But it later issued a brief statement calling The Australian’s article “misleading” and asserting that it does not target ads on the basis of emotional state.
With close to two billion users, Facebook is one of the most powerful Internet services on the planet. It is perhaps unsurprising, then, that controversy related to how it handles user data seems to be ever-present. In 2014, for example, a few of the company’s researchers published a paper in a high-profile journal in which they detailed how they manipulated 700,000 users’ News Feeds with varying degrees of happy or sad content to see how the users reacted. More recently, Facebook was found to be buying up data about users’ offline behavior from third parties, and using that to enhance its understanding of how best to target ads.
As the saying goes, with great power comes great responsibility. But Facebook’s history of tinkering with data suggests that it doesn’t understand the full measure of its obligation to its users. At the very least, it has repeatedly been shown to be stunningly tone-deaf about what a regular person might consider creepy and intrusive behavior.
That leads to a question we recently explored in depth: How much is too much power for one company to have?
(Read more: Ars Technica, “How Facebook Learns About Your Offline Life,” “We Need More Alternatives to Facebook”)