Sonde Health, Kintsugi, Winterlight Labs, and Ellipsis Health are selling “voice biomarker” technology to identify anxiety, depression, and other conditions. The technology is being used at medical clinics and call centers.
The few available studies of the technology claim that voice biomarkers can help detect a range of health problems, including depression, respiratory illnesses like asthma and Covid, and cardiovascular conditions. However, the technology raises ethical and privacy concerns.
Insurance companies and hospitals are installing voice biomarker technology at their call centers; after obtaining a patient's consent, it can detect in real time whether the caller has a mental health condition like depression or anxiety.
Unlike voice assistants such as Alexa and Siri, voice biomarker software analyzes how you talk, not what you are saying. Your voice is analyzed by a machine-learning system that matches it against anonymized voice samples.
According to Maria Espinola, a psychologist and assistant professor at the University of Cincinnati, people with depression “take more pauses” and stop more often while talking.
She added: “Their speech is generally more monotone, flatter, and softer. They also have a reduced pitch range and lower volume.”
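The acoustic cues described above can be measured from a raw audio signal. As a rough illustration only, the sketch below computes two of them, pause ratio and average volume, from frame-level energy; the function name, threshold, and frame size are assumptions for this toy example, and real voice biomarker systems use far richer features (pitch contours, spectral measures, and learned representations).

```python
import numpy as np

def voice_features(signal, sr=16000, frame_ms=25, energy_thresh=0.01):
    """Toy acoustic-feature extractor: pause ratio and mean speech volume.

    Illustrative sketch only -- not how any commercial system works.
    """
    frame_len = int(sr * frame_ms / 1000)
    n_frames = len(signal) // frame_len
    frames = signal[: n_frames * frame_len].reshape(n_frames, frame_len)
    # Root-mean-square energy of each frame
    rms = np.sqrt(np.mean(frames ** 2, axis=1))
    # Frames below the energy threshold are treated as pauses
    pause_ratio = float(np.mean(rms < energy_thresh))
    voiced = rms[rms >= energy_thresh]
    mean_volume = float(np.mean(voiced)) if voiced.size else 0.0
    return {"pause_ratio": pause_ratio, "mean_volume": mean_volume}

# Synthetic 2-second clip: 1 s of tone standing in for speech, 1 s of silence
sr = 16000
t = np.linspace(0, 1, sr, endpoint=False)
speech = 0.1 * np.sin(2 * np.pi * 220 * t)
silence = np.zeros(sr)
feats = voice_features(np.concatenate([speech, silence]), sr=sr)
```

On this synthetic clip, roughly half the frames register as pauses; a flatter, softer voice of the kind Espinola describes would show up as lower pitch variance and lower mean volume in features like these.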
Speaking to Axios, Kintsugi CEO Grace Chang said: “From as little as 20 seconds of free-form speech, we’re able to detect with 80% accuracy if somebody is struggling with depression or anxiety.”
“When we’re integrated into a call center where there is a nurse on the line, the nurse can ask additional questions,” she added. She further claimed that 80% of patients agree to have their voices analyzed.
Some experts have raised ethical concerns and worry the technology could be misused.
“There is still a long way to go before AI-powered vocal biomarkers can be endorsed by the clinical community,” reads an editorial in the medical journal The Lancet. “Better ethical and technical standards are required for this research to fully realize the potential for vocal biomarkers in the early detection of disease.”
Others worry the technology could exacerbate systemic biases, as has happened with facial recognition technology, which is less accurate at identifying women and people of color.