Siri security: contractors are listening

CUPERTINO, Calif. (NBC News) – Apple has long touted its commitment to protecting customer data, but according to a new report from The Guardian, humans are listening in on conversations with Apple’s voice assistant Siri.

According to an anonymous whistleblower, Apple contractors regularly hear recordings that contain confidential medical information, drug deals and even intimate moments.

“A lot of companies often have monitoring just to see how the voice assistants are working,” explains CNET’s Roger Cheng.

Apple hasn’t denied the allegations, but emphasizes that Siri recordings are anonymized and says “only a very small random subset, less than one percent of daily Siri utterances, are used” for analysis.

Experts say you can avoid accidental or unwanted recordings by adjusting the features on the Apple Watch, HomePod or iPhone that make Siri easy to activate.

In iPhone settings, you can toggle off the option that allows Siri to listen for the wake word when the screen is locked, or disable Siri completely.

“By limiting the number of triggers, you can protect yourself,” Cheng says.

Other tech companies also analyze user conversations to improve voice assistants.

Both Google and Amazon have been criticized for similar practices, but those companies say users can access and delete all voice assistant inquiries on their accounts.

More: https://on.today.com/2JZPyXs
