Leaked document suggests Apple to introduce “health” conversations with Siri, following listening data scandal

Not only will Siri make house calls: the report suggests that the virtual assistant will be able to answer questions about both physical and mental health.

Health-related data is among the most sensitive private information out there, and when it comes to its virtual assistant Siri and privacy, Apple’s record is about as bad as that of other similar services.

In July, reports revealed that Apple was allowing humans, employed as contractors and known as “graders,” to listen to audio files collected through Siri – including recordings of a confidential and personal nature that users were unaware were being made when the assistant was activated accidentally, something that is all too easy to do.

Apple then apologized – and promised that come fall, the grading project would resume and become opt-in, and that those who decided to participate could rest easy knowing that their personal conversations and activities would be listened to by Apple employees, rather than contractors.

At the time, Apple also fired hundreds of contractors. Now a former “grader” has revealed Apple’s plans to use Siri to tap deeper into the health-related data of its users – and what could possibly go wrong?

The Guardian reports that the new feature is planned to be introduced in two years, with iOS 15.

It will let users have “a back-and-forth conversation about health problems,” with the virtual assistant reportedly able to answer questions about both physical and mental health.

Apple is already pushing hard into the health data market, with a variety of apps and features that monitor users’ conditions and access their health records from hospitals.

And while Apple will be happy to handle your health data in the future, it is working hard to make sure Siri doesn’t provide even remotely controversial answers on sensitive and divisive topics, especially in the US.

The Guardian writes that this includes questions related to feminism and gender equality, which Siri in the past treated more “dismissively.” Leaked guidelines from last year now advise developers to look for ways to deflect and, above all, to remain neutral. The report dubs these changes “sensitivity rewrites.”

Apple said in a statement: “Our approach is to be factual with inclusive responses rather than offer opinions.”

But how about this “fact,” also found in the guidelines: “Siri’s true origin is unknown, even to Siri; but it definitely wasn’t a human invention.”

If you're tired of censorship and dystopian threats against civil liberties, subscribe to Reclaim The Net.
