Baby boomers are experiencing a sharper drop in cognitive function as they age, relative to previous generations, according to a recent study. The findings suggest not only that boomers will be more likely than past cohorts to develop conditions like dementia, but also that future aging generations may face a similar heightened risk.
The study, published in The Journals of Gerontology: Series B late last month, looked at the cognitive test scores of over 30,000 Americans over the age of 50 who were enrolled in an existing, long-running research project at the University of Michigan called the Health and Retirement Study. As part of the project, volunteers were interviewed every two years with questions meant to evaluate their cognitive function, such as counting backward from 100 in increments of 7 and recalling objects they had recently been shown. All told, the study analyzed nearly 20 years' worth of test scores, collected from 1996 to 2014.
Though people do generally lose a step or two in their brain power as part of the normal aging process, the study found there was an improving trend in cognitive function across generations born before and during World War II. The study had data for the following cohorts: Greatest Generation (born 1890–1923); Early Children of Depression (born 1924–1930); Late Children of Depression (born 1931–1941); War Babies (born 1942–1947); early baby boomers (born 1948–1953); and mid baby boomers (born 1954–1959).
While each generation before the boomers had better later-life cognition than the one before it, the boomers showed a decline compared to the War Babies cohort, breaking the pattern of improvement.
“It is shocking to see this decline in cognitive functioning among baby boomers after generations of increases in test scores,” study author Hui Zheng, professor of sociology at The Ohio State University, said in a statement released by the university. “But what was most surprising to me is that this decline is seen in all groups: men and women, across all races and ethnicities and across all education, income and wealth levels.”
Zheng also tried to account for age-related declines in cognition by looking only at the scores of people in their early 50s. But again, early baby boomers in their 50s had lower average test scores than people from earlier generations in that same age group. That likely means that, whatever is causing this drop in cognition, the decline was already becoming apparent while baby boomers were still middle-aged.
This type of study can’t show what might be behind the drop in cognitive function. But Zheng did try to account for possible factors that could have influenced these trends in his analysis. Improvements in childhood nutrition and health throughout the 20th century probably help explain why, at least in the data Zheng had available, each pre-World War II generation had better later-life cognition than the one before it. Baby boomers had these advantages and more, since they also experienced overall gains in education and working conditions. At the same time, baby boomers in general were probably exposed to higher rates of other factors linked to declining cognitive function.
“The underlying causes include lower wealth, lower likelihood of being married, higher levels of loneliness and depression, and higher level of cardiovascular risk factors (e.g., obesity, physical inactivity, hypertension, diabetes, strokes, heart disease),” Zheng said in an email to Gizmodo. He also noted that the U.S. has its own added roadblocks, such as the lack of universal, affordable health care.
The findings may also explain a seemingly contradictory pattern in earlier research, Zheng said. Baby boomers are already known to experience more chronic health problems than earlier generations did at the same age. But other studies have suggested that the incidence of dementia and cognitive impairment among older Americans has been declining over the decades, relative to past generations. Zheng’s research suggests that this trend won’t continue and that we simply haven’t reached the point in time where many boomers would become more affected by conditions like Alzheimer’s.
However, the oldest baby boomers are now in their 70s, while the youngest are approaching their 60s. So if Zheng’s findings do represent a genuine, widespread decline in cognition across generations, then it’s likely there will be an increase in the rates of Alzheimer’s disease and other forms of dementia, since early cognitive decline is a major risk factor for later dementia. And if the factors negatively affecting baby boomers persist or worsen in the future, then Gen Xers and millennials could eventually face the same or an even larger challenge.
Because there are now more older Americans than ever, public health experts predict there will be more cases of Alzheimer’s in the near future. The Alzheimer’s Association, for instance, estimates that the number of Americans over age 65 with Alzheimer’s may grow to 13.8 million by 2050, up from the 5.8 million believed to have the disease now. It’s possible, according to Zheng, that the problem could be even worse than we think. At the same time, Zheng doesn’t think that we’re necessarily doomed.
“Cognitive functioning may continue declining among baby boomers if no effective interventions and policy responses are in place, which may cause the prevalence of dementia to substantially increase in the coming decades,” he said. “But this is not an irreversible trend.” Zheng suggested that everyone can strive for more physical activity, a healthy diet, and strong social bonds to lower their risk of cognitive decline later in life.
Science writer at Gizmodo and pug aficionado elsewhere
As a neuroscientist, I find this study alarmist at the very least, and it suffers from serious methodological flaws in the measurement and comparison of the various so-called generational groups. Numerous factors can bias the results and the post-hoc conclusions of the authors of this study.

First, the cognitive measures and data collection methods have changed radically over time, so no true standardization of cognitive testing is possible. The limited scope of the so-called cognitive phenotypes or markers is also subject to many biases and extraneous factors. The self-reporting described in the study is inherently flawed as well, especially since today’s generations must deal with far greater amounts of information and required knowledge to navigate the ever-increasing challenges of modern technology and basic know-how. Earlier generations had much less to contend with than current ones; the inherent stress factors alone are enough to place greater limits on any given individual’s available cognitive resources.

With advances in health care and medicine, people are living longer. And our prior understanding of cognitive decline was far inferior in past generations, when dementia was poorly understood and not measured in any way equivalent to today’s standards. Diagnostic definitions have changed drastically even since the 1980s, so in a sense this study is attempting to compare apples with oranges, grapes, kiwis, etc. It is very irresponsible of the authors to draw causal conclusions about current and future demographics and to claim such an absolute decline across subsequent population groups!
At best this study should be treated as a cautionary tale of how science can be used in reckless ways, spreading misinformation among the laypeople of today’s ever-changing world!
Craig A. Goodman, Ph.D.
There are obviously limitations to this study and more research should be conducted to confirm this pattern, which I can always be clearer about. That said, I do want to point out that this study *did not* compare different sets of cognitive testing data across time; it compared age groups of people who took the same basic tests as part of the same, very large, long-running research study. Reading the paper, the authors also seemed to take steps to exclude data from people who took different, earlier versions of these tests.
Of course, this method comes with its own caveats: maybe the tests did change in some unaccounted-for way during that 20-year period; maybe the measures used really aren’t a good enough proxy for cognitive function; and the authors didn’t have data from the entire span of later life for every generation, since the study has only run since the mid-1990s. But I just wanted to address that confusion!