Research suggests that how we type, sleep, and speak may offer clues to cognitive decline
Huge quantities of data reflecting our ability to think and process information are now widely available, thanks to watches and phones that track movement and heart rate, as well as tablets, computers, and virtual assistants such as Amazon Echo that can record the way we type, search the internet and pay bills.
Building on previous studies linking biological markers or changes in certain behaviors to early cognitive decline, researchers and companies are now testing whether machine learning can be used to sift through this complex data and make better sense of how it fits together, with the goal of helping clinicians detect diseases such as Alzheimer’s sooner. Dementia is one of the most significant global health concerns, with 75 million people projected to be living with it by 2030, up from 46.8 million in 2015, according to Alzheimer’s Disease International, an international federation of Alzheimer’s associations.
Cognitive changes typically begin years before memory lapses become apparent to individuals or their families. Early detection is difficult because initial changes are subtle, and there aren’t enough dementia experts available to screen people. Yet early detection can be crucial to coming up with the right treatment plan and preserving a patient’s quality of life, says Nina Silverberg, director of the Alzheimer’s Disease Research Centers Program at the National Institute on Aging (NIA).
Having more and different types of data, coupled with better ways to make sense of it, means “there is definitely an opportunity now that we just didn’t have before,” says Dr. Silverberg.
Making doctors better
Much of the current machine-learning research is focused on sifting through patients’ electronic health records to determine what combination of risk factors most accurately reflects cognitive decline.
Clues in Speech
Participants in a large longitudinal study were asked to describe what is happening in this drawing. Analyses of their written responses were compared to a variety of clinical variables to predict dementia risk.
Unimpaired observer
- A young boy is reaching for the cookie jar. He is standing on a stool and is almost falling over. His sister is standing beside him and talking to him.
- On the other side of the kitchen, their mom is wiping dishes. The water from the faucet is running over on to the floor.
Impaired observer
- Boy taking cookies
- Mother washing dishes
- water overflowing in sink
- girl getting cookie from boy
- stool falling over
Missing auxiliary verbs, articles, punctuation
More impaired observer
- washing dishes
- getting cookies out of cookie jar
- stool tipping over
- water running out of sink
- Girl reaching for cookie
In addition: misspellings, missing subjects
Source: Elif Eyigoz and colleagues, EClinicalMedicine.
One such study, funded by the NIA and published this year in the Journal of the American Geriatrics Society, examined electronic records of more than 16,000 medical visits of 4,330 participants in the Kaiser Permanente Washington health system. Using a model that identified 31 factors associated with cognitive decline—including changes in the way patients walk, prescription-refill patterns that indicate they aren’t taking medications as prescribed, failure to show up for clinic visits and an uptick in emergency-care use—researchers were able to flag more than 1,000 visits that resulted in a dementia diagnosis, including nearly 500 in which the patient’s cognitive changes previously had gone undetected in the health system. The researchers concluded that if patients with scores in the top 5% of their model were sent for additional evaluation, one in six would be found to have dementia.
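For readers curious what that kind of flagging looks like in practice, the sketch below shows the general shape of a visit-level risk score: combine a handful of routine signals into a probability and refer the top 5% of scores for further evaluation. The features, synthetic data and off-the-shelf logistic-regression model are illustrative stand-ins, not the Kaiser Permanente study's actual method.

```python
# Hypothetical sketch of a visit-level dementia-risk flag (illustrative only;
# not the model from the Kaiser Permanente/NIA study described above).
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n_visits = 5000

# Synthetic stand-ins for a few of the risk factors mentioned in the article.
X = np.column_stack([
    rng.normal(1.0, 0.2, n_visits),   # gait speed (m/s)
    rng.poisson(1.0, n_visits),       # late prescription refills in past year
    rng.poisson(0.5, n_visits),       # missed clinic appointments
    rng.poisson(0.3, n_visits),       # emergency-care visits
])
# Synthetic labels: 1 = visit later associated with a dementia diagnosis.
logit = -4 + 1.5 * X[:, 1] + 1.0 * X[:, 2] + 0.8 * X[:, 3] - 2.0 * X[:, 0]
y = rng.binomial(1, 1 / (1 + np.exp(-logit)))

model = LogisticRegression(max_iter=1000).fit(X, y)
risk = model.predict_proba(X)[:, 1]

# Flag the visits whose risk score falls in the top 5%, mirroring the study design.
threshold = np.quantile(risk, 0.95)
flagged = risk >= threshold
print(f"Flagged {flagged.sum()} of {n_visits} visits; "
      f"{y[flagged].mean():.0%} of flagged visits have the outcome.")
```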
The goal of using artificial intelligence in health care isn’t to replace humans but rather to assist doctors, says P. Murali Doraiswamy, professor and director of the Neurocognitive Disorders Program at Duke University School of Medicine. “This isn’t a battle between AI and doctors, it’s about how to optimize doctors’ ability to deliver better care,” he says.
Machine-learning techniques also have been used to predict a patient’s long-term dementia outcome, based on the presence of certain biological markers of disease. In a study published in 2017 in Scientific Reports, researchers used algorithms to identify which clusters of biological markers—such as molecules circulating in spinal fluid and the volumes of various brain regions—indicate a patient has a rapidly progressing case of dementia versus a slower-moving form, a finding that could have significant implications for research and treatment.
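To give a rough sense of what clustering biological markers means, the sketch below groups synthetic patients by marker profiles and inspects whether the discovered groups look like faster- or slower-progressing forms. The marker names, numbers and off-the-shelf k-means algorithm are assumptions for illustration, not the method of the Scientific Reports paper.

```python
# Illustrative biomarker clustering on synthetic data (not the study's method).
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

rng = np.random.default_rng(2)

# Synthetic biomarker table: rows = patients; columns = hypothetical markers
# (a spinal-fluid molecule level, a second molecule level, a brain-region volume).
slow = rng.normal([600, 60, 3.4], [80, 10, 0.2], size=(100, 3))
fast = rng.normal([420, 110, 2.9], [80, 15, 0.2], size=(100, 3))
markers = np.vstack([slow, fast])

# Standardize the markers, then look for two groups of patients.
X = StandardScaler().fit_transform(markers)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)

# Compare the mean marker profile of each discovered cluster.
for k in range(2):
    members = markers[labels == k]
    print(f"cluster {k}: n={len(members)}, mean profile={members.mean(axis=0).round(1)}")
```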
“Every touchpoint in this journey can be improved with AI,” particularly in the first stage of screening for memory problems, says Dr. Doraiswamy.
In 2019, drugmaker Eli Lilly & Co., technology giant Apple Inc. and health-technology firm Evidation Health Inc. presented data showing that four commonly used devices—a phone, tablet, smartwatch and sleep sensor—could collect useful information over a 12-week period that, when analyzed by a machine-learning algorithm, might allow providers and caregivers to distinguish healthy people from those with mild cognitive impairment or early dementia. The team focused on data that previous research has suggested are indicative of cognitive decline, including gross and fine motor function, circadian-rhythm shifts and language.
The researchers developed “behaviorgrams,” depicting a day in the life of each participant across all of the signals from the data-gathering channels, and found that a machine-learning algorithm could analyze the 40-plus data streams to find differences between cognitively healthy and impaired individuals. The factors most closely associated with early dementia included slower typing speed, a wider variance in wake-up times on consecutive days, and the number of messages sent and received on phones and tablets, a sign of social engagement, according to Luca Foschini, co-founder and chief data scientist at San Mateo, Calif.-based Evidation.
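As a concrete illustration of that idea (not Evidation's actual pipeline), the minimal sketch below reduces per-day device streams to the kinds of features Dr. Foschini describes, typing speed, wake-time variability and message counts, and trains a classifier on entirely synthetic participants. The helper function and all numbers are invented for the example.

```python
# Illustrative sketch: summarize multi-device streams into daily-behavior
# features and classify healthy vs. impaired participants. Synthetic data only.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)

def daily_features(typing_speeds, wake_times_min, messages_per_day):
    """Reduce one participant's 12 weeks of raw streams to three features."""
    return [
        np.mean(typing_speeds),            # mean typing speed (chars/sec)
        np.std(np.diff(wake_times_min)),   # day-to-day wake-time variability (min)
        np.mean(messages_per_day),         # messages sent/received per day
    ]

X, y = [], []
# label 0 = cognitively healthy, label 1 = impaired; group-level parameters differ.
for label, speed_mu, wake_sd, msg_mu in [(0, 4.0, 20, 25), (1, 3.2, 45, 12)]:
    for _ in range(60):  # 60 synthetic participants per group
        X.append(daily_features(
            rng.normal(speed_mu, 0.5, 84),    # 84 days of typing speed
            rng.normal(7 * 60, wake_sd, 84),  # daily wake-up times, in minutes
            rng.poisson(msg_mu, 84),          # daily message counts
        ))
        y.append(label)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
print("Cross-validated accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```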
Research has shown that speech patterns—such as slower speech, more pauses, and shorter phrases—also may have promise in identifying patients with mild cognitive impairment. In a recent study published in Current Alzheimer Research, nearly 8,900 individuals were asked to read aloud short sentences, and machine algorithms were able to distinguish between healthy individuals and those with increasing degrees of cognitive impairment by processing the acoustics of speech.
Other studies, such as one published last month in EClinicalMedicine, have analyzed written speech patterns for signs of mental deterioration—like lack of punctuation, misspellings and simplified grammar—and successfully predicted who would go on to develop dementia.
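As a toy illustration of this kind of written-language analysis (a sketch, not the EClinicalMedicine method), the snippet below counts punctuation, articles and auxiliary verbs per phrase, the very features the sidebar above shows dropping out as impairment increases, and compares an unimpaired-style description with a telegraphic one.

```python
# Minimal, illustrative feature extractor for written picture descriptions.
import re

ARTICLES = {"a", "an", "the"}
AUXILIARIES = {"is", "are", "was", "were", "am", "be", "been", "being",
               "has", "have", "had", "do", "does", "did", "will", "would"}

def description_features(text: str) -> dict:
    phrases = [p for p in re.split(r"[.\n;]+", text) if p.strip()]
    words = re.findall(r"[a-zA-Z']+", text.lower())
    return {
        "words_per_phrase": len(words) / max(len(phrases), 1),
        "punctuation_per_phrase": len(re.findall(r"[.,;]", text)) / max(len(phrases), 1),
        "article_rate": sum(w in ARTICLES for w in words) / max(len(words), 1),
        "auxiliary_rate": sum(w in AUXILIARIES for w in words) / max(len(words), 1),
    }

unimpaired = ("A young boy is reaching for the cookie jar. He is standing on "
              "a stool and is almost falling over.")
impaired = "Boy taking cookies\nMother washing dishes\nwater overflowing in sink"

print(description_features(unimpaired))  # higher article, auxiliary, punctuation rates
print(description_features(impaired))    # telegraphic style scores near zero
```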
Sharing the results
Much of this research is in the early stages, so many questions remain unanswered. For example, is speech analysis a better indicator of early-stage dementia than visual-information processing? Is more data always better or is there an optimal combination of information? Should human experts guide the formulas and tell the machines which factors to focus on, or allow AI to search the data and develop its own patterns?
Then there is the so-called black-box problem: Because the algorithms themselves are rarely shared, it isn’t clear how a decision is being made. This is particularly important in the health field, where understanding the “how” is important for clinicians to trust the information, says Seyed-Mahdi Khaligh-Razavi, co-founder and chief scientific officer of Cognetivity Neurosciences, a Canada-based company that has developed an AI-based tool that analyzes the speed at which individuals process visual information to detect early signs of dementia.
What’s more, machines learn based on the data they’re fed, so if that data is biased—say, it comes from people of only one ethnicity—it could lead to incorrect conclusions, says Dr. Khaligh-Razavi, a neuroscientist.
Another huge debate in the field centers on whether individuals should be able to get the results of such analyses on their own, or whether the information should be given to a clinician first. “There is a lot of turmoil about should we give the information to consumers,” says Evidation’s Dr. Foschini, adding that he is a strong believer that individuals should be the owners of their own data.
The way the results are presented to patients also matters. For instance, to keep people from obsessing over individual numbers, the way some do with the weight on a scale, results could be presented as a trend over time rather than as a single score to be compared with the last one, Dr. Foschini says. “The burden of who offers information is to make sure it’s understood by the user,” he says.
Ms. Wang is a writer in London. Email her at [email protected].
Corrections & Amplifications
A study published last month in EClinicalMedicine analyzed written speech patterns to successfully predict who would go on to develop dementia. An earlier version of this article incorrectly said the study successfully differentiated cognitively impaired individuals from those who were healthy. (Corrected on Nov. 3)
Copyright ©2020 Dow Jones & Company, Inc. All Rights Reserved.
Appeared in the November 4, 2020, print edition as 'Finding Early Clues of Dementia.'