
Some Siri recordings are sent to Apple contractors who hear sensitive information

This follows reports of Google contractors being able to hear confidential details when reviewing Google Assistant and Google Home recordings.

A whistleblower has revealed that Apple contractors listen to some of the recordings made when Apple users interact with the Siri voice assistant on their devices, and that these recordings often contain private or confidential information, or even evidence of apparently illegal activity.

According to the whistleblower, these contractors regularly hear private discussions which include:

  • Seemingly criminal dealings
  • Couples having sex
  • Doctor-patient conversations
  • Business conversations

The whistleblower adds that these recordings are accompanied by other sensitive user information that is used to determine whether the Siri query was dealt with successfully. This information includes:

  • The location of the user
  • The user’s contact details
  • The user’s app data

The whistleblower also said that accidental Siri activations are the most common reason for these sensitive recordings being sent to Apple, and that the Apple Watch and the HomePod smart speaker account for most accidental activations, with the rate of accidental triggers on the Apple Watch described as “incredibly high.”

The contractors who listen to the recordings are tasked with grading the recordings based on factors which include:

  • Whether Siri was activated intentionally or accidentally
  • Whether Siri could be expected to help with the query
  • Whether Siri’s response was appropriate

In addition to providing details on how Apple contractors grade these often sensitive recordings, the whistleblower said that the Siri response which claims “I only listen when you’re talking to me” is “patently false” given the frequency of accidental triggers.

The whistleblower said that they came forward because they’re concerned about Apple’s lack of disclosure around how these voice recordings are graded and about how frequently sensitive information is accidentally recorded. They added that Apple doesn’t have a specific procedure in place for reporting or dealing with sensitive recordings.

Apple has not denied any of the claims from the report and responded by saying:

“A small portion of Siri requests are analyzed to improve Siri and dictation. User requests are not associated with the user’s Apple ID. Siri responses are analyzed in secure facilities and all reviewers are under the obligation to adhere to Apple’s strict confidentiality requirements.”

Apple added that less than 1% of daily Siri recordings are graded by these contractors and that the recordings used are often only a few seconds in length.

Apple’s response is surprising given that it often positions itself as the most privacy-focused of the big tech firms, yet it hasn’t denied that user location data, contact data, or app data is sent with these recordings – private information that could be used to identify the people in them. Apple also doesn’t mention any safeguards it has in place to protect against accidental Siri activations.

When similar reports surfaced about the Google Assistant and the Google Home smart speaker earlier this month, Google admitted that around 0.2% of these recordings were listened to by human contractors – a much smaller proportion than the less than 1% of recordings that Apple says its contractors review. Additionally, Google said that it has safeguards in place to prevent accidental activations and emphasized that audio snippets are not associated with user accounts.

Amazon also responded to revelations about its employees and contractors listening to Alexa voice assistant recordings by emphasizing that employees do not have access to information that can identify the person associated with the recording.

Since Apple has used privacy as a marketing tool to differentiate itself from Amazon and Google, these revelations about Siri are likely to be quite damaging.
