Tag: dementia

Researchers Discover Natural Compound may Slow ALS and Dementia

Photo: Pixabay CC0

A natural compound found in everyday fruits and vegetables may hold the key to protecting nerve cells — and it’s showing promise as a potential treatment for ALS and dementia, according to new research from the University of Missouri.

“It’s exciting to discover a naturally occurring compound that may help people suffering from ALS or dementia,” Smita Saxena, a professor of physical medicine and rehabilitation at the School of Medicine and lead author of the study, said. “We found this compound had a strong impact in terms of maintaining motor and muscle function and reducing muscle atrophy.”

The study, which appears in Acta Neurologica, found that kaempferol, a natural antioxidant found in certain fruits and vegetables such as kale, berries and endives, may support nerve cell health and hold promise as a potential treatment for ALS.

In lab-grown nerve cells from ALS patients, the compound helped the cells produce more energy and eased stress in the protein-processing center of the cell called the endoplasmic reticulum. Additionally, the compound improved overall cell function and slowed nerve cell damage. Researchers found that kaempferol worked by targeting a crucial pathway that helps control energy production and protein management — two functions that are disrupted in individuals with ALS.

“I believe this is one of the first compounds capable of targeting both the endoplasmic reticulum and mitochondria simultaneously,” Saxena said. “By interacting with both of these components within nerve cells, it has the potential to elicit a powerful neuroprotective effect.”

The challenge

The catch? The body doesn’t absorb kaempferol easily, and it could take a large amount to see real benefits in humans. For instance, an individual with ALS would need to consume at least 4.5kg of kale in a day to obtain a beneficial dose.

“Our bodies don’t absorb kaempferol very well from the vegetables we eat,” Saxena said. “Because of this, only a small amount reaches our tissues, limiting how effective it can be. We need to find ways to increase the dose of kaempferol or modify it so it’s absorbed into the bloodstream more easily.”

Another hurdle is getting the compound into the brain. The blood-brain barrier — a tightly locked layer of cells that blocks harmful substances — also makes it harder for larger molecules like kaempferol to pass through.

What’s next?

Despite its challenges, kaempferol remains a promising candidate for treating ALS, especially since it works even after symptoms start. It also shows potential for other neurodegenerative diseases including Alzheimer’s and Parkinson’s.

To make the compound easier for the body to absorb, Saxena’s team at the Roy Blunt NextGen Precision Health building is exploring ways to boost its uptake by neurons. One promising approach involves packaging the compound in lipid-based nanoparticles — tiny spherical particles made of fats that are commonly used in drug delivery.

“The idea is to encapsulate kaempferol within lipid-based nanoparticles that are easily absorbed by the neurons,” Saxena said.  “This would target kaempferol to neurons to greatly increase its beneficial effect.”

The team is currently generating the nanoparticles with hopes of testing them by the end of the year.

Source: University of Missouri-Columbia

Study Strengthens Link between Shingles Vaccine and Lower Dementia Risk

Photo by JD Mason on Unsplash

An unusual public health policy in Wales may have produced the strongest evidence yet that a vaccine can reduce the risk of dementia. In a new study led by Stanford Medicine, researchers analysing the health records of Welsh older adults discovered that those who received the shingles vaccine were 20% less likely to develop dementia over the next seven years than those who did not receive the vaccine.

The remarkable findings, published April 2 in Nature, support an emerging theory that viruses that affect the nervous system can increase the risk of dementia. If further confirmed, the new findings suggest that a preventive intervention for dementia is already close at hand.

Lifelong infection

Shingles, a viral infection that produces a painful rash, is caused by the same virus that causes chicken pox — varicella-zoster. After people contract chicken pox, usually in childhood, the virus stays dormant in the nerve cells for life. In people who are older or have weakened immune systems, the dormant virus can reactivate and cause shingles.

Dementia affects more than 55 million people worldwide, with an estimated 10 million new cases every year. Decades of dementia research have largely focused on the accumulation of plaques and tangles in the brains of people with Alzheimer’s, the most common form of dementia. But with no breakthroughs in prevention or treatment, some researchers are exploring other avenues — including the role of certain viral infections.

Previous studies based on health records have linked the shingles vaccine with lower dementia rates, but they could not account for a major source of bias: People who are vaccinated also tend to be more health conscious in myriad, difficult-to-measure ways. Behaviors such as diet and exercise, for instance, are known to influence dementia rates, but are not included in health records. 

“All these associational studies suffer from the basic problem that people who get vaccinated have different health behaviours than those who don’t,” said Pascal Geldsetzer, MD, PhD, assistant professor of medicine and senior author of the new study. “In general, they’re seen as not being solid enough evidence to make any recommendations on.”

Markus Eyting, PhD, and Min Xie, PhD, postdoctoral scholars in primary care and population health, are the study’s co-lead authors.

A natural experiment

But two years ago, Geldsetzer recognized a fortuitous “natural experiment” in the rollout of the shingles vaccine in Wales that seemed to sidestep the bias. The vaccine used at that time contained a live-attenuated, or weakened, form of the virus.

The vaccination program, which began Sept. 1, 2013, specified that anyone who was 79 on that date was eligible for the vaccine for one year. (People who were 78 would become eligible the next year for one year, and so on.) People who were 80 or older on Sept. 1, 2013, were out of luck — they would never become eligible for the vaccine. 

These rules, designed to ration the limited supply of the vaccine, also meant that the slight difference in age between 79- and 80-year-olds made all the difference in who had access to the vaccine. By comparing people who turned 80 just before Sept. 1, 2013, with people who turned 80 just after, the researchers could isolate the effect of being eligible for the vaccine.
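
In study-design terms, this is a regression-discontinuity comparison around the eligibility cutoff. The sketch below is purely illustrative (it is not the study's code, and the record fields are hypothetical); it simply shows how a birth date on either side of 1 September 1933 separates people into eligible and ineligible groups whose later dementia rates can then be compared:

```python
# Illustrative sketch only: hypothetical records, not the study's data or code.
from datetime import date

CUTOFF_BIRTHDAY = date(1933, 9, 1)  # born on or before this date -> already 80 on 1 Sept 2013

def eligible(born: date) -> bool:
    """Eligible for the vaccine programme: still 79 (not yet 80) on 1 Sept 2013."""
    return born > CUTOFF_BIRTHDAY

def dementia_rate(records, want_eligible: bool) -> float:
    """records: iterable of dicts like {"born": date(...), "dementia": True/False}."""
    group = [r for r in records if eligible(r["born"]) == want_eligible]
    return sum(r["dementia"] for r in group) / len(group)

# The comparison of interest is the difference in dementia rates between people
# born in the weeks just after the cutoff (eligible) and just before it (ineligible).
```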

The circumstances, well-documented in the country’s health records, were about as close to a randomized controlled trial as you could get without conducting one, Geldsetzer said. 

The researchers looked at the health records of more than 280 000 older adults who were 71 to 88 years old and did not have dementia at the start of the vaccination program. They focused their analysis on those closest to either side of the eligibility threshold — comparing people who turned 80 in the week before with those who turned 80 in the week after.

“We know that if you take a thousand people at random born in one week and a thousand people at random born a week later, there shouldn’t be anything different about them on average,” Geldsetzer said. “They are similar to each other apart from this tiny difference in age.”

The same proportion of both groups likely would have wanted to get the vaccine, but only one group, those just under 80, was allowed to under the eligibility rules.

“What makes the study so powerful is that it’s essentially like a randomised trial with a control group — those a little bit too old to be eligible for the vaccine — and an intervention group — those just young enough to be eligible,” Geldsetzer said.

Protection against dementia

Over the next seven years, the researchers compared the health outcomes of people closest in age who were eligible and ineligible to receive the vaccine. By factoring in actual vaccination rates — about half of the population who were eligible received the vaccine, compared with almost none of the people who were ineligible — they could derive the effects of receiving the vaccine.
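
Scaling the eligibility-level difference by the difference in uptake is the standard way to recover the effect of actually being vaccinated. A minimal sketch of that arithmetic, with the uptake figures taken from the article and the dementia rates assumed purely for illustration:

```python
# Uptake figures follow the article; the dementia rates below are assumed,
# purely to illustrate the arithmetic, and are not the study's results.
uptake_eligible   = 0.50   # about half of eligible people were vaccinated
uptake_ineligible = 0.00   # almost none of the ineligible were

dementia_ineligible = 0.125  # assumed: roughly 1 in 8 diagnosed over follow-up
dementia_eligible   = 0.112  # assumed: a slightly lower rate in the eligible group

effect_of_eligibility = dementia_ineligible - dementia_eligible            # 0.013
effect_of_vaccination = effect_of_eligibility / (uptake_eligible - uptake_ineligible)
print(f"{effect_of_vaccination:.3f}")  # 0.026: per-vaccinee effect ~2x the eligibility-level effect
```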

As expected, the vaccine reduced the occurrence of shingles over that seven-year period by about 37% for people who received it, similar to what had been found in clinical trials of the vaccine. (The live-attenuated vaccine’s effectiveness wanes over time.)

By 2020, one in eight older adults, who were by then 86 and 87, had been diagnosed with dementia. But those who received the shingles vaccine were 20% less likely to develop dementia than the unvaccinated.

“It was a really striking finding,” Geldsetzer said. “This huge protective signal was there, any which way you looked at the data.”

The scientists searched high and low for other variables that might have influenced dementia risk but found the two groups to be indistinguishable in all characteristics. There was no difference in the level of education between the people who were eligible and ineligible, for example. Those who were eligible were not more likely to get other vaccinations or preventive treatments, nor were they less likely to be diagnosed with other common health conditions, such as diabetes, heart disease and cancer.

The only difference was the drop in dementia diagnoses.

“Because of the unique way in which the vaccine was rolled out, bias in the analysis is much less likely than would usually be the case,” Geldsetzer said.

Nevertheless, his team analyzed the data in alternate ways — using different age ranges or looking only at deaths attributed to dementia, for example — but the link between vaccination and lower dementia rates remained.

“The signal in our data was so strong, so clear and so persistent,” he said.

Stronger response in women

In a further finding, the study showed that protection against dementia was much more pronounced in women than in men. This could be due to sex differences in immune response or in the way dementia develops, Geldsetzer said. Women on average have higher antibody responses to vaccination, for example, and shingles is more common in women than in men.

Whether the vaccine protects against dementia by revving up the immune system overall, by specifically reducing reactivations of the virus or by some other mechanism is still unknown.

Also unknown is whether a newer version of the vaccine, which contains only certain proteins from the virus and is more effective at preventing shingles, may have a similar or even greater impact on dementia.

Geldsetzer hopes the new findings will inspire more funding for this line of research.

“At least investing a subset of our resources into investigating these pathways could lead to breakthroughs in terms of treatment and prevention,” he said.

In the past two years, his team has replicated the Wales findings in health records from other countries, including England, Australia, New Zealand and Canada, that had similar rollouts of the vaccine. “We just keep seeing this strong protective signal for dementia in dataset after dataset,” he said.

But Geldsetzer has set his sights on a large, randomized controlled trial, which would provide the strongest proof of cause and effect. Participants would be randomly assigned to receive the live-attenuated vaccine or a placebo shot.

“It would be a very simple, pragmatic trial because we have a one-off intervention that we know is safe,” he said.

Geldsetzer is seeking philanthropic funding for the trial as the live-attenuated vaccine is no longer manufactured by pharmaceutical companies.  

And such a trial might not take long to see results. He pointed to a graph of the Wales data tracking the dementia rates of those who were eligible and ineligible for the vaccine. The two curves began to separate in about a year and a half.

Source: Stanford Medicine

Could a Blood Test Rule out Future Dementia Risk?

Researchers at Karolinska Institutet have demonstrated how specific biomarkers in the blood can predict the development of dementia up to 10 years before diagnosis with high accuracy, among older adults living independently in the community.

A new study, published in Nature Medicine, has investigated the potential of specific biomarkers such as p-tau217, neurofilament light (NfL) and glial fibrillary acidic protein (GFAP) to predict the occurrence of dementia, including Alzheimer’s disease, up to ten years before an actual diagnosis in cognitively healthy older adults living in the community.

Blood samples from more than two thousand

Previous research has suggested that these biomarkers could be useful in early dementia diagnostics, but most studies have involved individuals who had already sought medical care for cognitive concerns or symptoms, such as memory difficulties.

A larger, community-based study was therefore needed to determine the predictive value of the biomarkers in the general population.

Led by researchers from the Aging Research Center of Karolinska Institutet in collaboration with SciLifeLab and KTH Royal Institute of Technology in Stockholm, the study analysed blood biomarkers in more than 2100 adults aged 60+, who were followed over time to determine if they developed dementia.

At a follow-up ten years later, 17% of participants had developed dementia. The accuracy of the biomarkers used in the study was found to be up to 83%.

“This is an encouraging result, especially considering the 10-year predictive window between testing and diagnosis. It shows that it is possible to reliably identify individuals who develop dementia and those who will remain healthy,” says Giulia Grande, assistant professor at the Department of Neurobiology, Care Sciences and Society, Karolinska Institutet and first author of the study.

Promising biomarkers

“Our findings imply that if an individual has low levels of these biomarkers, their risk of developing dementia over the next decade is minimal”, explains Davide Vetrano, associate professor at the same department and the study’s senior author. “This information could offer reassurance to individuals worried about their cognitive health, as it potentially rules out the future development of dementia.”

However, the researchers also observed that these biomarkers had low positive predictive values, meaning that elevated biomarker levels alone could not reliably identify the individuals who would go on to develop dementia within the next ten years. The study authors therefore advise against widespread use of these biomarkers as screening tools in the population at this stage.

“These biomarkers are promising, but they are currently not suitable as standalone screening tests to identify dementia risk in the general population,” says Davide Vetrano. 
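
The asymmetry between ruling dementia out and ruling it in follows from the arithmetic of predictive values at a modest base rate. A minimal sketch, using the study's roughly 17% ten-year incidence but with sensitivity and specificity values assumed purely for illustration (they are not figures from the paper):

```python
# Illustrative Bayes calculation; sensitivity/specificity are assumptions, not study results.
base_rate   = 0.17   # ~17% of participants developed dementia within ten years
sensitivity = 0.80   # assumed: share of future cases with elevated biomarkers
specificity = 0.85   # assumed: share of non-cases with normal biomarkers

p_elevated = sensitivity * base_rate + (1 - specificity) * (1 - base_rate)
ppv = sensitivity * base_rate / p_elevated                 # P(dementia | elevated)
npv = specificity * (1 - base_rate) / (1 - p_elevated)     # P(no dementia | normal)

print(f"PPV ~ {ppv:.0%}, NPV ~ {npv:.0%}")  # roughly 52% vs 95% with these assumptions
```

With a negative result, the post-test risk falls well below the 17% baseline, which is why such a test is better suited to ruling dementia out than to ruling it in.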

The researchers also noted that a combination of the three most relevant biomarkers – p-tau217 with NfL or GFAP – could improve predictive accuracy.

“Further research is needed to determine how these biomarkers can be effectively used in real-world settings, especially for older adults living in the community or in primary health care services,” says Grande.

“We need to move a step further and see whether combining these biomarkers with other clinical, biological or functional information could improve their usefulness as screening tools for the general population,” Grande continues.

The study was mainly funded by the Swedish Research Council, The Swedish Brain Foundation and The Strategic Research Area in Epidemiology and Biostatistics at Karolinska Institutet. The researchers declare that there are no conflicts of interest.

Source: Karolinska Institutet

Can Long-term Use of Anti-inflammatory Medications Prevent Dementia?

Photo by cottonbro studio

Past research has suggested that inflammation may contribute to the development and progression of dementia and that non-steroidal anti-inflammatory drugs (NSAIDs) may help protect against dementia due to their anti-inflammatory effects. A new large prospective study published in the Journal of the American Geriatrics Society provides additional evidence, showing that long-term NSAID use is linked to a decreased risk of developing dementia.

In the population-based study of 11 745 adults with an average follow-up of 14.5 years, 9520 participants had used NSAIDs at some point and 2091 participants developed dementia. Long-term NSAID use was associated with a 12% reduced risk of developing dementia, whereas short- and intermediate-term use provided no benefit. The cumulative dose of NSAIDs was also not associated with decreased dementia risk.

The findings suggest that prolonged, rather than intensive, use of anti-inflammatory medications may help protect against dementia.

“Our study provides evidence on possible preventive effects of anti-inflammatory medication against the dementia process. There is a need for more studies to further consolidate this evidence and possibly develop preventive strategies,” said corresponding author M. Arfan Ikram, MSc, MD, PhD, of Erasmus MC University Medical Center Rotterdam, in the Netherlands.

Source: Wiley

‘Healthy’ Vitamin B12 Levels not Enough to Ward off Neuro Decline

Created with Gencraft. CC4.0

Meeting the minimum requirement for vitamin B12, which is needed to make DNA, red blood cells and nerve tissue, may not actually be enough – particularly for older adults. It may even put them at risk for cognitive impairment, according to a study published in Annals of Neurology.

The research found that healthy older volunteers with lower concentrations of B12, though still in the normal range, showed signs of neurological and cognitive deficiency. These levels were associated with more damage to the brain’s white matter – the nerve fibres that enable communication between areas of the brain – and with slower cognitive and visual processing speeds on testing, compared to those with higher B12.

The UC San Francisco researchers, led by senior author Ari J. Green, MD, of the Departments of Neurology and Ophthalmology and the Weill Institute for Neurosciences, said that the results raise questions about current B12 requirements and suggest the recommendations need updating.

“Previous studies that defined healthy amounts of B12 may have missed subtle functional manifestations of high or low levels that can affect people without causing overt symptoms,” said Green, noting that clear deficiencies of the vitamin are commonly associated with a type of anaemia. “Revisiting the definition of B12 deficiency to incorporate functional biomarkers could lead to earlier intervention and prevention of cognitive decline.”

Lower B12 correlates with slower processing speeds, brain lesions

In the study, researchers enrolled 231 healthy participants without dementia or mild cognitive impairment, whose average age was 71. They were recruited through the Brain Aging Network for Cognitive Health (BrANCH) study at UCSF.

Their blood B12 levels averaged 414.8pmol/L, well above the U.S. minimum of 148pmol/L. The researchers focused on the biologically active component of B12, which provides a more accurate measure of the amount of the vitamin the body can actually use, and adjusted for factors such as age, sex, education and cardiovascular risk. In cognitive testing, participants with lower active B12 showed slower processing speed, suggesting subtle cognitive decline, and the effect was amplified by older age. They also showed significant delays in responding to visual stimuli, indicating slower visual processing and generally slower brain conductivity.

MRIs revealed a higher volume of lesions in the participants’ white matter, which may be associated with cognitive decline, dementia or stroke.

While the study volunteers were older adults, who may have a specific vulnerability to lower levels of B12, co-first author Alexandra Beaudry-Richard, MSc, said that these lower levels could “impact cognition to a greater extent than what we previously thought, and may affect a much larger proportion of the population than we realize.” Beaudry-Richard is currently completing her doctorate in research and medicine at the UCSF Department of Neurology and the Department of Microbiology and Immunology at the University of Ottawa.

“In addition to redefining B12 deficiency, clinicians should consider supplementation in older patients with neurological symptoms even if their levels are within normal limits,” she said. “Ultimately, we need to invest in more research about the underlying biology of B12 insufficiency, since it may be a preventable cause of cognitive decline.”

Source: University of California – San Francisco

Long-term Study Finds Red Meat Raises Dementia Risk

Photo by Jose Ignacio Pompe on Unsplash

People who eat more red meat, especially processed red meat like bacon, sausage and bologna, are more likely to have a higher risk of cognitive decline and dementia when compared to those who eat very little red meat, according to a study published in the January 15, 2025, online issue of Neurology®, the medical journal of the American Academy of Neurology.

“Red meat is high in saturated fat and has been shown in previous studies to increase the risk of type 2 diabetes and heart disease, which are both linked to reduced brain health,” said study author Dong Wang, MD, ScD, of Brigham and Women’s Hospital in Boston. “Our study found processed red meat may increase the risk of cognitive decline and dementia, but the good news is that it also found that replacing it with healthier alternatives, like nuts, fish and poultry, may reduce a person’s risk.”

To examine the risk of dementia, researchers included a group of 133 771 people (65.4% female) with an average age of 49 who did not have dementia at the start of the study. They were followed for up to 43 years. Of this group, 11 173 people developed dementia.

Participants completed a food diary every two to four years, listing what they ate and how often.

Researchers defined processed red meat as bacon, hot dogs, sausages, salami, bologna and other processed meat products. They defined unprocessed red meat as beef, pork, lamb and hamburger. A serving of red meat is three ounces (85g), about the size of a deck of cards.

For processed red meat, they divided participants into three groups. The low group ate an average of fewer than 0.10 servings per day; the medium group ate between 0.10 and 0.24 servings per day; and the high group, 0.25 or more servings per day.

After adjusting for factors such as age, sex and other risk factors for cognitive decline, researchers found that participants in the high group had a 13% higher risk of developing dementia compared to those in the low group.

For unprocessed red meat, researchers compared people who ate an average of less than one half serving per day to people who ate one or more servings per day and did not find a difference in dementia risk.

To measure subjective cognitive decline, researchers looked at a different group of 43 966 participants with an average age of 78. Subjective cognitive decline is when a person reports memory and thinking problems before any decline is large enough to show up on standard tests.

The subjective cognitive decline group took surveys rating their own memory and thinking skills twice during the study.

After adjusting for factors such as age, sex and other risk factors for cognitive decline, researchers found that participants who ate an average of 0.25 servings or more per day of processed red meat had a 14% higher risk of subjective cognitive decline compared to those who ate an average of fewer than 0.10 servings per day.

They also found people who ate one or more servings of unprocessed red meat per day had a 16% higher risk of subjective cognitive decline compared to people who ate less than a half serving per day.

To measure objective cognitive function, researchers looked at a different group of 17 458 female participants with an average age of 74. Objective cognitive function is how well your brain works to remember, think and solve problems.

This group took memory and thinking tests four times during the study.

After adjusting for factors such as age, sex and other risk factors for cognitive decline, researchers found that higher processed red meat intake was associated with faster brain aging: 1.61 extra years of aging in global cognition and 1.69 extra years in verbal memory for each additional serving per day.

Finally, researchers found that replacing one serving per day of processed red meat with one serving per day of nuts and legumes was associated with a 19% lower risk of dementia and 1.37 fewer years of cognitive aging. Making the same substitution for fish was associated with a 28% lower risk of dementia and replacing with chicken was associated with a 16% lower risk of dementia.

“Reducing how much red meat a person eats and replacing it with other protein sources and plant-based options could be included in dietary guidelines to promote cognitive health,” said Wang. “More research is needed to assess our findings in more diverse groups.”

A limitation of the study was that it primarily looked at white health care professionals, so the results might not be the same for other race, ethnic and non-binary sex and gender populations.

Source: American Academy of Neurology

Long-term Study Finds Link between Earlier Diabetes Diagnosis and Dementia Risk

Photo by Nataliya Vaitkevich on Pexels

People diagnosed with type 2 diabetes at a younger age are at a higher risk for developing dementia than those diagnosed later in life, according to a study led by researchers at the NYU Rory Meyers College of Nursing. The findings, published in PLOS ONE, show that the increased risk is especially pronounced among adults with obesity.

“Our study suggests that there may be cognitive consequences to earlier onset type 2 diabetes, and it points to the need for strategies to prevent dementia that consider both diabetes and obesity,” said Xiang Qi, assistant professor at NYU Meyers and the study’s first author.

Type 2 diabetes is a known risk factor for dementia. Although the underlying mechanisms are not fully understood, scientists think that some of the hallmarks of diabetes, such as high blood sugar, insulin resistance, and inflammation, may encourage the development of dementia in the brain.

While type 2 diabetes was once a disease of older adults, it is increasingly prevalent among younger individuals: one in five people with type 2 diabetes worldwide is under 40 years old.

To understand how the timing of a type 2 diabetes diagnosis relates to dementia risk, the research team analyzed data from 2002 to 2016 in the Health and Retirement Study, a longitudinal study conducted by the University of Michigan Institute for Social Research. The PLOS ONE study included 1213 US adults aged 50 and over with type 2 diabetes confirmed by blood tests and without dementia at baseline. Participants were followed for up to 14 years, during which 216 (17.8%) developed dementia, as determined through follow-up telephone interviews.

The researchers found that adults diagnosed with type 2 diabetes at younger ages were at increased risk for developing dementia, compared to those diagnosed at 70 years or older. Adults diagnosed with diabetes before age 50 were 1.9 times as likely to develop dementia as those diagnosed at 70 and older, while those diagnosed between 50–59 years were 1.72 times as likely and those diagnosed between 60–69 years were 1.7 times as likely.

Using linear trend tests, the researchers found a graded association between age at diagnosis and dementia risk: for each year younger a person is at the time of their type 2 diabetes diagnosis, their risk for developing dementia increases by 1.9%.

“While we do not know for sure why an earlier diabetes diagnosis would increase the risk for dementia, prior studies show that people diagnosed with type 2 diabetes in mid-life may experience more vascular complications, poor blood sugar control, and insulin resistance – all of which are known risk factors for cognitive impairment,” said Bei Wu, the Dean’s Professor in Global Health and vice dean for research at NYU Meyers and the study’s senior author.

In addition, obesity appeared to influence the relationship between type 2 diabetes and dementia. Individuals with obesity who were diagnosed with type 2 diabetes before age 50 had the highest dementia risk in the study.

The researchers note that this greater understanding of the connection between diabetes onset, obesity, and dementia may help inform targeted interventions to prevent dementia.

“Our study highlights the importance of one’s age at diabetes diagnosis and suggests that specifically targeting obesity – whether through diet and exercise or perhaps medication – may play a role in staving off dementia in younger adults with diabetes,” said Wu.

Source: New York University

Extra Year of Education does Not Protect the Brain

Photo by Andrea Piacquadio on Pexels

Thanks to a ‘natural experiment’ involving 30 000 people, researchers at Radboud university medical centre were able to determine very precisely the long-term effect of an extra year of education on the brain. To their surprise, they found no effect on brain structure and no protective benefit of additional education against brain ageing. Their findings appear in eLife.

It is well-known that education has many positive effects. People who spend more time in school are generally healthier, smarter, and have better jobs and higher incomes than those with less education. However, whether prolonged education actually causes changes in brain structure over the long term and protects against brain ageing, was still unknown.

It is challenging to study this, because alongside education, many other factors influence brain structure, such as the conditions under which someone grows up, genetic makeup, and environmental pollution. Nonetheless, researchers Rogier Kievit (PI of the Lifespan Cognitive Dynamics lab) and Nicholas Judd from Radboudumc and the Donders Institute found a unique opportunity to very precisely examine the effects of an extra year of education.

Ageing

In 1972, a change in the law in the UK raised the minimum school-leaving age from 15 to 16, while all other circumstances remained constant. This created an interesting ‘natural experiment’: an event not under the control of researchers that divides people into an exposed and an unexposed group. Data are available from approximately 30 000 people who attended school around that time, including MRI scans taken much later (46 years afterwards). This dataset is the world’s largest collection of brain imaging data.

The researchers examined the MRI scans for the structure of various brain regions, but they found no differences between those who attended school longer and those who did not. ‘This surprised us’, says Judd. ‘We know that education is beneficial, and we had expected education to provide protection against brain aging. Aging shows up in all of our MRI measures, for instance we see a decline in total volume, surface area, cortical thickness, and worse water diffusion in the brain. However, the extra year of education appears to have no effect here.’

Brain structure

It’s possible that the brain looked different immediately after the extra year of education, but that wasn’t measured. “Maybe education temporarily increases brain size, but it returns to normal later. After all, it has to fit in your head,” explains Kievit. “It could be like sports: if you train hard for a year at sixteen, you’ll see a positive effect on your muscles, but fifty years later, that effect is gone.” It’s also possible that extra education only produces microscopic changes in the brain, which are not visible with MRI.

Both in this study and in other, smaller studies, links have been found between more education and brain benefits. For example, people who receive more education have stronger cognitive abilities, better health, and a higher likelihood of employment. However, this is not visible in brain structure via MRI. Kievit notes: “Our study shows that one should be cautious about assigning causation when only a correlation is observed. Although we also see correlations between education and the brain, we see no evidence of this in brain structure.”

Source: Radboud University Medical Centre

New Research Shows that Recombinant Shingles Vaccine Protects Against Dementia

Photo by JD Mason on Unsplash

New research published in Nature has shown that the recombinant shingles vaccine, as with the live version, might have a protective effect against dementia.

While evidence is emerging that the live herpes zoster (shingles) vaccine might protect against dementia, it has now been replaced by recombinant vaccines in many countries. But a lack of data meant that whether the recombinant vaccines conferred the same benefit was unknown. Fortunately, since there was a rapid switch from live to recombinant vaccines, there was an opportunity for a natural experiment to compare the risk of dementia between vaccine types.

The study demonstrated that the recombinant vaccine is associated with a significantly lower risk of dementia in the 6 years post-vaccination. Specifically, receiving the recombinant vaccine is associated with a 17% increase in diagnosis-free time, translating into 164 additional days lived without a diagnosis of dementia in those subsequently affected.

The recombinant shingles vaccine was also associated with lower risks of dementia than were two other vaccines commonly used in older people: influenza and tetanus–diphtheria–pertussis vaccines. The effect was robust across multiple secondary analyses, and was present in both men and women but was greater in women. These findings should stimulate studies investigating the mechanisms underpinning the protection and could facilitate the design of a large-scale randomised control trial to confirm the possible additional benefit of the recombinant shingles vaccine.

SGLT2 Inhibitors may Lower Risk of Dementia and Parkinson’s Disease

Created with Gencraft. CC4.0

A class of drugs for diabetes may be associated with a lower risk of dementia and Parkinson’s disease, according to a study published in Neurology®, the medical journal of the American Academy of Neurology. The study looked at sodium-glucose cotransporter-2 (SGLT2) inhibitors, which are also known as gliflozins. They lower blood sugar by causing the kidneys to remove sugar from the body through urine.

“We know that these neurodegenerative diseases like dementia and Parkinson’s disease are common and the number of cases is growing as the population ages, and people with diabetes are at increased risk of cognitive impairment, so it’s encouraging to see that this class of drugs may provide some protection against dementia and Parkinson’s disease,” said study author Minyoung Lee, MD, PhD, of Yonsei University College of Medicine in Seoul, South Korea.

The retrospective study looked at people with type 2 diabetes who started diabetes medication from 2014 to 2019 in South Korea. People taking SGLT2 inhibitors were matched with people taking other oral diabetes drugs, so the two groups had people with similar ages, other health conditions and complications from diabetes.

Then researchers followed the participants to see whether they developed dementia or Parkinson’s disease. Those taking the SGLT2 inhibitors were followed for an average of two years and those taking the other drugs were followed for an average of four years.

Among the 358 862 participants with an average age of 58, a total of 6837 people developed dementia or Parkinson’s disease during the study. For Alzheimer’s disease, the incidence rate for people taking SGLT2 inhibitors was 39.7 cases per 10 000 person-years, compared to 63.7 cases for those taking other diabetes drugs. Person-years represent both the number of people in the study and the amount of time each person spends in the study.
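
As a concrete, made-up example of how such a rate is computed (the numbers below are not from the study):

```python
# Hypothetical numbers, only to show how an incidence rate per 10 000 person-years works.
people = 5000
average_years_followed = 2.0
new_cases = 20

person_years = people * average_years_followed     # 10 000 person-years of follow-up
rate_per_10k = new_cases / person_years * 10_000   # 20 cases per 10 000 person-years
print(rate_per_10k)                                # 20.0
```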

For vascular dementia, which is dementia caused by vascular disease, the incidence rate for people taking the SGLT2 drugs was 10.6 cases per 10 000 person-years, compared to 18.7 for those taking the other drugs. For Parkinson’s disease, the incidence rate for those taking the SGLT2 drugs was 9.3 cases per 10 000 person-years, compared to 13.7 for those taking the other drugs.

After researchers adjusted for other factors that could affect the risk of dementia or Parkinson’s disease, such as complications from diabetes and medications, they found that SGLT2 inhibitor use was associated with a 20% reduced risk of Alzheimer’s disease and a 20% reduced risk of Parkinson’s disease. Those taking the drugs had a 30% reduced risk of developing vascular dementia.

“The results are generally consistent even after adjusting for factors like blood pressure, glucose, cholesterol and kidney function,” Lee said. “More research is needed to confirm the long-term validity of these findings.” Lee said that since participants were followed for less than five years at the most, it’s possible that some participants would later develop dementia or Parkinson’s disease.

Source: American Academy of Neurology