An AI system originally developed at the Massachusetts Institute of Technology (MIT) to research speech patterns in people with Alzheimer's was repurposed when the COVID-19 pandemic hit to serve as an early indicator of COVID-19 in asymptomatic patients.
“The sounds of talking and coughing are both influenced by the vocal cords and surrounding organs,” says research scientist Brian Subirana of MIT.
“This means that when you talk, part of your talking is like coughing, and vice versa. It also means that things we easily derive from fluent speech, AI can pick up simply from coughs, including things like the person’s gender, mother tongue, or even emotional state. There’s in fact sentiment embedded in how you cough.”
The system was based on a neural network that was trained first on a thousand hours of human speech, then on a database of words spoken in different emotional states, and finally on a database of coughs. The result was a system that could detect COVID-19 in an asymptomatic person from their cough with 97.1% accuracy. However, this is not a true test for COVID-19 but an enhanced early indicator. The advantage of this technology is that it can be developed as an early warning system incorporated into something as ubiquitous as a smartphone.
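The staged training described above is a form of transfer learning: a model is pretrained on one domain, and its learned weights are carried forward and refined on each subsequent domain rather than being re-initialised. The following is a minimal sketch of that idea using a toy logistic-regression classifier on synthetic data; the actual MIT system used a deep neural network and large audio corpora, and all data, shifts, and hyperparameters here are illustrative assumptions, not details from the study.

```python
import math
import random

random.seed(0)

def make_data(shift, n=100):
    """Synthetic two-class data; `shift` nudges the class means to
    mimic a domain change between stages (a pure assumption here)."""
    data = []
    for _ in range(n):
        y = random.randint(0, 1)
        centre = (1.0 if y else -1.0) + shift
        # Two noisy features plus a constant bias term.
        x = [centre + random.gauss(0, 0.5),
             centre + random.gauss(0, 0.5),
             1.0]
        data.append((x, y))
    return data

def train(weights, data, lr=0.1, epochs=50):
    """Logistic-regression gradient descent. Weights are passed in and
    returned, so each stage continues from the previous stage's model."""
    for _ in range(epochs):
        for x, y in data:
            z = sum(w * xi for w, xi in zip(weights, x))
            p = 1.0 / (1.0 + math.exp(-z))   # predicted probability
            g = p - y                         # gradient of log-loss
            weights = [w - lr * g * xi for w, xi in zip(weights, x)]
    return weights

def accuracy(weights, data):
    correct = sum(
        1 for x, y in data
        if (sum(w * xi for w, xi in zip(weights, x)) > 0) == bool(y)
    )
    return correct / len(data)

# Three stages, loosely mirroring the article's pipeline:
w = [0.0, 0.0, 0.0]
w = train(w, make_data(shift=0.0))   # stage 1: "fluent speech"
w = train(w, make_data(shift=0.2))   # stage 2: "emotional speech"
w = train(w, make_data(shift=0.4))   # stage 3: "coughs"
print(accuracy(w, make_data(shift=0.4)))
```

Because the weights flow through all three stages, the final classifier benefits from structure learned in the earlier domains, which is the core intuition behind pretraining on speech before fine-tuning on coughs.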
Source: Science Alert