
Apple contractors hear confidential details from Siri, a whistleblower claims

Apple's Siri software (Getty)

Those who work on quality control for Apple’s Siri voice assistant “regularly hear confidential details” about users, according to a contractor paid to assess responses to Siri recordings.

The news: The whistleblower told the Guardian these workers routinely hear sensitive information like drug deals, confidential medical details, and people having sex.

Why are they listening in the first place? Just like Amazon and Google, Apple employs people to listen to a sample of recordings from people’s conversations with Siri, transcribe them, and grade the responses according to a set of criteria. These include whether the voice assistant was activated deliberately or not, whether Siri could help with the query, and whether its response was appropriate.

However: Apple, again like Amazon and Google, does not explicitly disclose that it is doing this in its consumer terms and conditions (which are virtually unreadable, anyway). Apple likes to pride itself on being a privacy-conscious company, so this revelation may be more damaging for it than for other firms. Unlike the other two companies, Apple provides no way for users to opt out of their recordings being used this way, other than to just not use Siri at all. Apple told the Guardian that fewer than 1% of Siri recordings are used for training and that they are not associated with a user’s Apple ID.

Do consumers care? There’s been some online outrage about the practice and the fact that it’s done without customer consent (which could make it illegal within the European Union), but adoption of voice assistant technology shows no sign of slowing.

MIT Technology Review