Apple’s Siri contractors will no longer hear you having sex, making drug deals

Graham Cluley

Google may have been forced by regulators (temporarily, at least) to stop its subcontractors from listening to audio captured by its smart speakers, but Apple clearly sees the benefit of not waiting until it receives a stern letter from the authorities.

Last week The Guardian reported that – like Amazon and Google – Apple was farming out a small percentage of audio recordings made by its Siri digital assistant to third parties so that speech recognition could be improved. And sometimes that can result in personal and potentially sensitive information being exposed:

Siri can be accidentally activated when it mistakenly hears its “wake word”, the phrase “hey Siri”. Those mistakes can be understandable – a BBC interview about Syria was interrupted by the assistant last year – or less so. “The sound of a zip, Siri often hears as a trigger,” the contractor said. The service can also be activated in other ways. For instance, if an Apple Watch detects it has been raised and then hears speech, Siri is automatically activated.

The whistleblower said: “There have been countless instances of recordings featuring private discussions between doctors and patients, business deals, seemingly criminal dealings, sexual encounters and so on. These recordings are accompanied by user data showing location, contact details, and app data.”

The Guardian now reports that Apple has decided to suspend what it calls Siri “grading” globally, while it conducts a “thorough review.”

“We are committed to delivering a great Siri experience while protecting user privacy. While we conduct a thorough review, we are suspending Siri grading globally. Additionally, as part of a future software update, users will have the ability to choose to participate in grading.”

It is to Apple’s credit that they realised they should think again over this, rather than wait for a regulator to hit them with a cricket bat.

As The Guardian reports, Amazon is now “the only major provider of voice assistant technology still using humans to check recordings in the EU.” One hopes that Amazon will also swiftly review whether it is handling its customers’ privacy appropriately.

Update: Amazon now lets you opt-out of having humans review your Alexa conversations.

Graham Cluley Graham Cluley is a veteran of the anti-virus industry having worked for a number of security companies since the early 1990s when he wrote the first ever version of Dr Solomon's Anti-Virus Toolkit for Windows. Now an independent security analyst, he regularly makes media appearances and is an international public speaker on the topic of computer security, hackers, and online privacy. Follow him on Twitter at @gcluley, or drop him an email.

2 Replies to “Apple’s Siri contractors will no longer hear you having sex, making drug deals”

  1. It would be a truly intelligent robot, indeed, to know when not to record or listen to private affairs, since privacy means something different to every human.

  2. Why don't the executives and managers at Apple offer to have their conversations recorded? After all they have a vested interest in keeping their company profitable.
