It is important for scientists to detect risk factors for cognitive decline before a patient reaches old age, when it is often too late to intervene. Screening patients for early risk factors while they are in their 40s and 50s gives medical practitioners a reasonable window of opportunity in which to act.
“So, how do we detect changes without having to give everyone an expensive brain scan? As it turns out, the contents of blood may cause the brain to age,” Nolan and Allard told The Conversation.
“With time, our cells and organs slowly deteriorate, and the immune system can react to this by starting the process of inflammation,” they said. “Inflammatory molecules can then end up in the bloodstream, make their way to the brain, interfere with its normal functioning and possibly impair cognition.”
Scientists at Johns Hopkins and the University of Mississippi put this concept to the test in a 2019 study, in which researchers analyzed inflammatory molecules in the blood of middle-aged adults precisely enough to predict cognitive changes 20 years down the line.
“‘Middle ageing’ may be more consequential for our future brain health than we think,” Nolan and Allard said. “The hurried ticking of the clock could be slowed from outside the brain. For example, physical exercise confers some of its beneficial effects on the brain through blood-borne messengers. These can work to oppose the effects of time. If they could be harnessed, they might steady the pendulum.”