It’s often said that the eyes are the window to the soul.
Could language—the words we speak and how we speak them—be a window into brain health and risk of cognitive decline? That’s the question that researchers at Mass General Brigham are now exploring with the help of artificial intelligence (AI).
A research team led by neurologists Neguine Rezaii, MD, Brad Dickerson, MD, and colleagues recently conducted a proof-of-concept study demonstrating that two AI models could successfully distinguish patients with early symptoms of Alzheimer’s disease from cognitively unimpaired individuals using voice recordings of a brief storytelling task.
Here are five things to know about the study, which was recently published in npj Dementia:
Up to 90% of cases of early-onset Alzheimer’s disease are not detected during primary care appointments. Symptoms are often attributed to other conditions, such as fatigue, sleep deprivation or a psychiatric disorder. Paradoxically, these early stages are the time when the few treatments available for the disease are most effective.
By the time Alzheimer’s disease is eventually diagnosed, most patients have already progressed beyond the stages where they can benefit from treatment.
Language is a promising area for the early detection of Alzheimer’s disease, as it involves memory, executive function and sustained attention.
For this proof-of-concept study, the researchers used digital voice recordings from 120 cognitively impaired patients and 68 cognitively unimpaired controls from the Longitudinal Early-Onset Alzheimer’s Disease Study (LEADS).
Because LEADS study participants undergo PET imaging to assess amyloid burden in the brain—one of the telltale signatures of Alzheimer’s disease—the researchers knew which of the cognitively impaired patients had early-onset dementia due to Alzheimer’s and which had cognitive impairment due to other brain disorders.
The researchers used two AI-powered approaches to analyze the recordings.
While both models performed comparably well, the most advanced AI model could identify people with mild cognitive impairment with about 99% accuracy.
The models were also able to correctly distinguish Alzheimer’s‑related impairment from non‑Alzheimer’s causes with up to 90% accuracy. This could prove to be a major breakthrough, since even trained clinicians struggle to make this distinction early on.

A deeper analysis of the results showed that individuals with cognitive decline due to early-onset Alzheimer’s were more likely to:
- Omit key story details
- Use fewer specifics such as proper names
- Forget times, dates and precise descriptions
- Use shorter, more common words
- Pause more frequently when talking
While individuals with Alzheimer's tended to use more sentences during the storytelling test, those sentences were shorter on average and tended to lack specific details.
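To give a concrete sense of the kinds of surface features described above, here is a minimal illustrative sketch that computes a few of them from a plain-text transcript. This is not the study’s actual model or feature set; the function name, the crude capitalization-based proper-noun heuristic, and the filler-word list are all assumptions made for illustration only.

```python
import re

def lexical_markers(transcript: str) -> dict:
    """Compute simple surface features of the kind the study describes
    (word length, sentence length, specificity, filled pauses) from a
    storytelling transcript. Purely illustrative, not the study's method."""
    words = re.findall(r"[A-Za-z']+", transcript)
    sentences = [s for s in re.split(r"[.!?]+", transcript) if s.strip()]
    # Naive proxy for specifics like proper names: capitalized words
    # (this also catches sentence-initial words, so it is only a rough cue).
    proper_nouns = [w for w in words if w[0].isupper() and w.lower() != "i"]
    # Filled pauses ("um", "uh", "er") as a rough stand-in for pausing.
    fillers = len(re.findall(r"\b(?:um|uh|er)\b", transcript, re.IGNORECASE))
    return {
        "mean_word_length": sum(len(w) for w in words) / max(len(words), 1),
        "mean_sentence_length": len(words) / max(len(sentences), 1),
        "proper_noun_rate": len(proper_nouns) / max(len(words), 1),
        "filled_pause_count": fillers,
    }

sample = "Um, the boy climbed the ladder. He, uh, took a cookie. The jar fell."
print(lexical_markers(sample))
```

In a real pipeline these transcript-level counts would be fed, alongside acoustic measures from the audio itself, into a trained classifier rather than interpreted directly.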

The researchers say that one vision for the future would be to incorporate this technology into primary care visits to help overcome the barriers to early Alzheimer’s detection.
Since many clinical visits at Mass General Brigham are now being recorded (with patient consent) by ambient AI tools to help with the transcription of clinical notes, it may be possible to detect the subtle signs of cognitive decline from these recordings, thus eliminating the need for a separate storytelling test.

Neguine Rezaii, MD
For lead author Rezaii, this study is an extension of her long fascination with the processes of human thought and language.
“I used to write in a journal trying to track my trains of thought and the sequences of events that led to me getting distracted,” she explains. “I was also logging my dreams.”
Since both dreams and trains of thought are subjective processes and challenging to measure scientifically, Rezaii shifted her focus to language, believing that a closer look at the words we use could tell us more about how our bodies and brains are functioning.
“If I look at a picture when I am in pain—if I have a headache for example—the words I use to describe it would probably be very different from the words I choose when I am feeling happy and comfortable,” she says. “Even the temperature can affect us. So can we use language to find out what’s going on in someone’s head, body and environment?”
As it turns out, Rezaii’s intellectual curiosity could solve an incredibly pressing clinical need, as the number of Americans with Alzheimer’s is projected to nearly double over the next two decades (from 6.7 million in 2023 to approximately 14 million in 2050), and scientists have struggled to find effective treatments for the later stages of the disease.
“I would love for a test like this to become just like a vital sign check,” she says. “Hypertension is very common and there is an easy way to detect it in a primary care setting. Dementia has a lifetime risk of one in seven. That’s a really high risk, and what do we have to detect it? Not much.”
