
ML, NLP Identify Digital Voice Biomarkers for Alzheimer's

Researchers from UT Southwestern Medical Center used AI to measure subtle changes in speech patterns that may help diagnose cognitive impairment.

Artificial intelligence can help identify subtle changes in a patient’s voice and speech patterns, which may help clinicians diagnose cognitive impairment and Alzheimer’s disease before symptoms appear, according to a study published recently in Alzheimer’s & Dementia: Diagnosis, Assessment & Disease Monitoring. 

The study indicates that changes in speech patterns are associated with cognitive decline and that some of these changes may be detectable years before other symptoms begin to show, but capturing them remains a challenge. 

“Our focus was on identifying subtle language and audio changes that are present in the very early stages of Alzheimer’s disease but not easily recognizable by family members or an individual’s primary care physician,” said Ihab Hajjar, MD, Professor of Neurology at the University of Texas (UT) Southwestern’s Peter O’Donnell Jr. Brain Institute, who led the study, in a press release discussing the research findings.  

To do this, the research team used machine learning (ML) and natural language processing (NLP) to analyze the speech patterns of 206 people enrolled in a research program at Emory University, including 114 with mild cognitive impairment and 92 deemed cognitively unimpaired.

From these participants, speech, neuropsychological, neuroimaging, and cerebrospinal fluid-based Alzheimer's biomarker data were collected. Speech data were gathered via 1- to 2-minute spontaneous recorded descriptions of artwork.

“The recorded descriptions of the picture provided us with an approximation of conversational abilities that we could study via artificial intelligence to determine speech motor control, idea density, grammatical complexity, and other speech features,” Hajjar explained. 
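The study's feature-extraction code is not published in the article, but lexical measures like the ones Hajjar describes can be approximated from a transcript. Below is a minimal sketch using spaCy; the proposition-counting rule and the dependency-depth complexity proxy are common heuristics assumed here for illustration, not the study's actual definitions.

```python
# Illustrative sketch only -- not the study's pipeline. Approximates idea
# (proposition) density and grammatical complexity from a transcript using
# spaCy part-of-speech tags and dependency parses.
import spacy

nlp = spacy.load("en_core_web_sm")  # assumes the small English model is installed

# POS tags commonly counted as propositions (CPIDR-style heuristic)
PROPOSITION_POS = {"VERB", "ADJ", "ADV", "ADP", "CCONJ", "SCONJ"}

def idea_density(transcript: str) -> float:
    """Propositions per word in a speech transcript."""
    doc = nlp(transcript)
    words = [t for t in doc if t.is_alpha]
    if not words:
        return 0.0
    return sum(t.pos_ in PROPOSITION_POS for t in words) / len(words)

def grammatical_complexity(transcript: str) -> float:
    """Crude complexity proxy: mean dependency-tree depth per sentence."""
    doc = nlp(transcript)

    def depth(token):
        d = 0
        while token.head is not token:  # a root token's head is itself in spaCy
            token, d = token.head, d + 1
        return d

    depths = [max((depth(t) for t in sent), default=0) for sent in doc.sents]
    return sum(depths) / max(len(depths), 1)

print(idea_density("The boy is reaching up to take a cookie from the jar."))
```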

These recordings were then analyzed using ML to derive acoustic and lexical-semantic features, which were compared against participants' cerebrospinal fluid samples and MRI scans. This allowed the researchers to map connections between the speech pattern changes, or digital voice biomarkers, and other commonly used Alzheimer's biomarkers.
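The exact acoustic feature set is not detailed in the press materials. As a rough sketch of what deriving acoustic features from such recordings can look like, the snippet below pulls generic markers (MFCC statistics and a pause ratio) with librosa; the file name and silence threshold are illustrative assumptions.

```python
# Illustrative sketch only -- generic acoustic features of the kind that
# voice-biomarker studies often use, not the study's actual feature set.
import librosa

def acoustic_features(path: str, sr: int = 16000) -> dict:
    y, sr = librosa.load(path, sr=sr)  # mono waveform resampled to 16 kHz

    # Mel-frequency cepstral coefficients, summarized per band
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13)
    feats = {f"mfcc{i}_mean": float(m) for i, m in enumerate(mfcc.mean(axis=1))}
    feats.update({f"mfcc{i}_std": float(s) for i, s in enumerate(mfcc.std(axis=1))})

    # Rough pause ratio: fraction of frames below a heuristic energy threshold
    rms = librosa.feature.rms(y=y)[0]
    feats["pause_ratio"] = float((rms < 0.1 * rms.mean()).mean())
    return feats

features = acoustic_features("participant_recording.wav")  # hypothetical file
```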

These comparisons were used to determine how accurately the digital voice biomarkers could detect mild cognitive impairment in addition to Alzheimer’s status and progression. 

Overall, the approach performed well in detecting mild cognitive impairment and identifying participants with evidence of Alzheimer's, even when such evidence could not be easily detected through standard cognitive assessments.
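The modeling details live in the published article rather than this summary. As a generic illustration of how detection accuracy for such a task might be estimated, the sketch below cross-validates a simple classifier on a placeholder feature matrix; the model choice, the random features, and the AUC metric are assumptions, and only the group sizes come from the study.

```python
# Illustrative sketch only: cross-validated classification of mild cognitive
# impairment vs. cognitively unimpaired from voice-derived features.
# X is random placeholder data; only the group sizes (114 MCI, 92 unimpaired)
# come from the study.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = rng.normal(size=(206, 40))      # 206 participants x 40 voice features
y = np.array([1] * 114 + [0] * 92)  # 1 = MCI, 0 = cognitively unimpaired

model = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
scores = cross_val_score(model, X, y, cv=5, scoring="roc_auc")
print(f"Mean cross-validated AUC: {scores.mean():.2f}")
```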

Using ML to analyze speech patterns could boost early detection of cognitive decline, the researchers explained, as manually studying patients' speech is labor-intensive and often unsuccessful because such subtle changes are frequently undetectable to the human ear.

In this study, the research team spent less than 10 minutes capturing each patient’s voice recording, while traditional neuropsychological tests typically require several hours. 

“If confirmed with larger studies, the use of artificial intelligence and machine learning to study vocal recordings could provide primary care providers with an easy-to-perform screening tool for at-risk individuals,” Hajjar said. “Earlier diagnoses would give patients and families more time to plan for the future and give clinicians greater flexibility in recommending promising lifestyle interventions.” 

Other institutions are also leveraging AI to drive Alzheimer's research and care. 

In an interview with HealthITAnalytics last month, leadership from a new pilot program at Indiana University School of Medicine and Indiana University Health discussed how the project is leveraging AI-based digital screening tools to drive early detection of cognitive impairment in the primary care setting. 
