
Defend free speech and individual liberty online. 

Push back against Big Tech and media gatekeepers.

Invasive tech analyzes your voice for signs of mental illness

Call centers are already testing it.

If you’re tired of censorship and surveillance, join Reclaim The Net.

Sonde Health, Kintsugi, Winterlight Labs, and Ellipsis Health are selling “voice biomarker” technology to identify anxiety, depression, and other conditions. The technology is being used at medical clinics and call centers.

The few available studies of the technology claim that voice biomarkers can help detect a range of health problems, including depression, respiratory illnesses such as asthma and COVID-19, and cardiovascular conditions. However, the technology raises ethical and privacy concerns.

Insurance companies and hospitals are installing voice biomarker technology in their call centers. After obtaining the patient’s consent, they can detect in real time whether the patient has a mental health condition such as depression or anxiety.

Unlike voice assistants such as Alexa and Siri, voice biomarker software analyzes how you talk, not what you are saying. A machine-learning system analyzes your voice by matching it against anonymized voice samples.

According to Maria Espinola, a psychologist and assistant professor at the University of Cincinnati, people suffering from depression “take more pauses” and stop more often when they are talking.

She added: “Their speech is generally more monotone, flatter, and softer. They also have a reduced pitch range and lower volume.”
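The vendors named above do not publish their models, but the features Espinola describes (pause frequency, pitch range, volume) are standard acoustic measurements. As an illustrative sketch only, with made-up thresholds and a synthetic signal rather than any real product's method, here is how such features can be computed from raw audio:

```python
import numpy as np

SR = 16_000   # sample rate in Hz (assumed for this sketch)
FRAME = 512   # frame length in samples, roughly 32 ms at 16 kHz

def frame_signal(x, frame=FRAME):
    """Split a 1-D signal into non-overlapping frames."""
    n = len(x) // frame
    return x[: n * frame].reshape(n, frame)

def rms_per_frame(frames):
    """Root-mean-square energy of each frame (a crude loudness measure)."""
    return np.sqrt(np.mean(frames ** 2, axis=1))

def pause_ratio(x, silence_thresh=0.01):
    """Fraction of frames quieter than the threshold -- a rough 'pause' proxy.

    The 0.01 threshold is an arbitrary illustrative value.
    """
    rms = rms_per_frame(frame_signal(x))
    return float(np.mean(rms < silence_thresh))

def pitch_autocorr(frame, sr=SR, fmin=60, fmax=400):
    """Estimate the fundamental frequency of one voiced frame by
    finding the strongest autocorrelation peak in the 60-400 Hz range."""
    frame = frame - frame.mean()
    ac = np.correlate(frame, frame, mode="full")[len(frame) - 1:]
    lo, hi = sr // fmax, sr // fmin
    lag = lo + int(np.argmax(ac[lo:hi]))
    return sr / lag

# Demo on a synthetic clip: one second of a 150 Hz tone, then one second of silence.
t = np.arange(SR) / SR
voiced = 0.5 * np.sin(2 * np.pi * 150 * t)
signal = np.concatenate([voiced, np.zeros(SR)])

print(pause_ratio(signal))                # roughly half the frames are silent
print(pitch_autocorr(voiced[:FRAME]))     # close to 150 Hz
```

A real system would add many more features (jitter, shimmer, spectral measures) and feed them to a trained classifier; this sketch only shows the kind of low-level measurement the quoted description refers to.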

Speaking to Axios, Kintsugi CEO Grace Chang said: “From as little as 20 seconds of free-form speech, we’re able to detect with 80% accuracy if somebody is struggling with depression or anxiety.”

“When we’re integrated into a call center where there is a nurse on the line, the nurse can ask additional questions,” she added. She further claimed that 80% of patients agree to have their voices analyzed.

Some experts have raised ethical concerns and feel the technology could be misused.

“There is still a long way to go before AI-powered vocal biomarkers can be endorsed by the clinical community,” reads an editorial published by the medical journal The Lancet. “Better ethical and technical standards are required for this research to fully realize the potential for vocal biomarkers in the early detection of disease.”

Others feel the technology could exacerbate systemic biases, as has happened with facial recognition technology, which is less accurate at identifying women and people of color.

