Apple Admits a Small Portion of Siri Recordings Are Heard by Humans

Accidental activations of Apple's Siri, where the voice assistant mistakenly hears its wake word, often capture confidential information.

By Gadgets 360 Staff | Updated: 29 July 2019 10:20 IST
Highlights
  • Siri recordings with confidential info reportedly heard by third parties
  • Apple has responded to the claims, which stem from a contractor account
  • Contractors say no privacy guidelines are issued to employees

Siri listening on the Apple HomePod

Apple allows Siri recordings to be heard by contractors as part of a process called "grading", which improves the efficacy of the voice assistant, a report claims. These recordings frequently include confidential information, such as medical history, sexual encounters, and even drug deals, according to a whistleblower working for one of the contractors. The report notes that Apple doesn't explicitly disclose this in its consumer-facing privacy documentation. Apple has responded to the report, confirming that a small portion of Siri recordings is indeed used for improvements.

The news comes at a time when Amazon and Google, both of which also offer voice assistant services, have admitted that third parties have access to some voice recordings. Unlike them, however, Apple has built and enjoys a reputation for safeguarding the privacy of its users.

The report's claims

The Guardian cites a whistleblower at one of the contractors allegedly working for Apple, who claims the Cupertino-headquartered company shares a small proportion of Siri recordings with such contractors. These contractors are expected to grade the responses on several factors, such as "whether the activation of the voice assistant was deliberate or accidental, whether the query was something Siri could be expected to help with and whether Siri's response was appropriate."


Accidental activations of Siri, where the voice assistant mistakenly hears its wake word, often capture confidential information, the whistleblower adds.

Apple says Siri recordings are used "to help Siri and dictation"

"There have been countless instances of recordings featuring private discussions between doctors and patients, business deals, seemingly criminal dealings, sexual encounters, and so on. These recordings are accompanied by user data showing location, contact details, and app data," the whistleblower is quoted to say.


While Siri is most often associated with iPhone and Mac devices, the contractor claims the Apple Watch and HomePod are in fact the most common sources of accidental activations.

"The regularity of accidental triggers on the watch is incredibly high. The watch can record some snippets that will be 30 seconds - not that long but you can gather a good idea of what's going on," the whistleblower adds.


Staff are encouraged to treat recordings of accidental activations as a "technical problem", but no procedure is said to be in place for handling sensitive information. The contractor alleges that employees are expected to hit targets as fast as possible. The report adds that the whistleblower's motivation for disclosure was the fear of such data being misused: there is purportedly little vetting of who works with the data, a high turnover rate among employees, no proper privacy guidelines, and the possibility of identifying users.

"It wouldn't be difficult to identify the person that you're listening to, especially with accidental triggers - addresses, names and so on," the whistleblower added.


Finally, the report claims Apple doesn't explicitly mention that Siri recordings are made available to humans, including not just its own employees but also contractors. The recordings are said to be shared with pseudonymised identifiers. The whistleblower emphasises that the company should in particular remove the patently false "I only listen when you are talking to me" response Siri gives to the query "Are you always listening?"

Apple's response

In response to The Guardian report, Apple said Siri recordings are used to "help Siri and dictation... understand you better and recognise what you say."

It adds, "A small portion of Siri requests are analysed to improve Siri and dictation. User requests are not associated with the user's Apple ID. Siri responses are analysed in secure facilities and all reviewers are under the obligation to adhere to Apple's strict confidentiality requirements." The Cupertino company is also quoted as saying that less than 1 percent of daily Siri activations, and only a random subset, are used for grading. These recordings are typically only a few seconds long, the company reportedly adds.

 
