Tag: 7/3/23

Difficulty Falling Asleep Linked to Developing Dementia

Photo by JD Mason on Unsplash

Adding to the growing body of evidence on sleep disturbances and cognitive impairment, new research published in the American Journal of Preventive Medicine finds significant links between three measures of sleep disturbance and the risk of developing dementia over a 10-year period. Difficulty falling asleep was linked to higher risk, whereas trouble falling back asleep after waking was, unexpectedly, linked to lower risk.

The results associate sleep-initiation insomnia (trouble falling asleep within 30 min) and sleep medication use with higher dementia risk. An additional, surprising finding was that people who reported having sleep-maintenance insomnia (trouble falling back to sleep after waking) were less likely to develop dementia over the course of the study.

“We expected sleep-initiation insomnia and sleep medication usage to increase dementia risk, but we were surprised to find sleep-maintenance insomnia decreased dementia risk,” explained lead investigator Roger Wong, PhD, MPH, MSW, an Assistant Professor in the Department of Public Health and Preventive Medicine, SUNY Upstate Medical University. “The motivation behind this research was prompted on a personal level. My father has been experiencing chronic sleep disturbances since the COVID pandemic began, and I was concerned how this would affect his cognition in the future. After reading the existing literature, I was surprised to see mixed findings on the sleep-dementia relationship, so I decided to investigate this topic.”

This research is novel because it is the first to examine how long-term sleep disturbance measures are associated with dementia risk using a nationally representative sample of US older adults. Previous research has associated REM sleep behaviour disorder, sleep deprivation (less than five hours of sleep), and the use of short-acting benzodiazepines with cognitive decline. The team’s results for sleep-maintenance insomnia support other recent studies that used smaller, separate data samples.

This study used 10 annual waves (2011–2020) of prospective data from the National Health and Aging Trends Study (NHATS), a longitudinal panel study that surveys a nationally representative sample of Medicare beneficiaries aged 65 years and older within the USA. This study included only people who were dementia-free at baseline in 2011.

While the mechanism behind the decreased dementia risk among those with sleep-maintenance insomnia is still unknown, the investigators theorise that these individuals may engage more in activities that preserve or increase cognitive reserve, which could in turn lower their dementia risk.

Recent evidence indicates there is a higher prevalence of sleep disturbances among older adults than among other age groups. This could be attributed to a variety of factors including anxiety about the COVID pandemic or warmer nights as a consequence of climate change.

“Older adults are losing sleep over a wide variety of concerns. More research is needed to better understand its causes and manifestations and limit the long-term consequences,” added Dr Wong. “Our findings highlight the importance of considering sleep disturbance history when assessing the dementia risk profile for older adults. Future research is needed to examine other sleep disturbance measures using a national longitudinal sample, whether these sleep-dementia findings hold true for specific dementia subtypes, and how certain sociodemographic characteristics may interact with sleep disturbances to influence dementia risk.”

Source: Elsevier

Breast Cancer Stage and Receptor Type Predict Recurrence

Photo by National Cancer Institute on Unsplash

New research indicates that for patients with breast cancer, the cancer’s stage and receptor status can help clinicians predict whether and when the cancer might recur after initial treatment. The findings are published in the journal CANCER.

For the study, Heather Neuman, MD, MS, of the University of Wisconsin, and her colleagues analysed data on 8007 patients with stage I–III breast cancer who participated in nine clinical trials between 1997 and 2013 and received standard-of-care therapy.

Time to first cancer recurrence varied significantly by receptor status – oestrogen receptor (ER), progesterone receptor (PR), and human epidermal growth factor receptor 2 (HER2). Within each receptor subtype, cancer stage influenced time to recurrence.

Risk of recurrence was highest and occurred earliest for ER−/PR−/HER2− (triple negative) tumours. Patients with these tumours diagnosed at stage III had a 5-year probability of recurrence of 45.5%. Risk of recurrence was lowest for ER+/PR+/HER2+ (triple positive) tumours. Patients with these tumours diagnosed at stage III had a 5-year probability of recurrence of 15.3%.

Based on their findings, the investigators developed follow-up recommendations by cancer stage and receptor type. For example, patients with the lowest risk should be seen by their oncology team once annually over five years, whereas those with the highest risk should be seen once every three months over five years.

“Our developed follow-up guidelines present an opportunity to personalize how we deliver breast cancer follow-up care,” said Dr Neuman. “By tailoring follow-up based on risk, we have the potential to have a strong, positive impact on both survivors and their oncology providers by improving the quality and efficiency of care.”

Source: Wiley

Novel Antihypertensive Flounders in Early-Phase Trial

BP cuff for home monitoring. Source: Pixabay

A phase II trial of the novel antihypertensive baxdrostat failed to replicate the impressive results of an earlier trial in treatment-resistant hypertension, with the drug unable to outperform placebo.

Deepak Bhatt, MD, MPH, of Mount Sinai Heart in New York City, presented the disappointing findings at the American College of Cardiology (ACC) annual meeting, but noted that the findings were not a complete write-off for the drug, hampered as the trial was by poor patient adherence and the confounding effect of other antihypertensives.

Baxdrostat lowered seated systolic blood pressure by 16.0–19.8 mmHg across the doses tested, compared with 16.6 mmHg for placebo, a nonsignificant difference. Diastolic blood pressure reductions showed a similar pattern, even slightly favouring placebo.

The HALO trial included 249 participants with a seated systolic blood pressure of 140–180 mmHg at baseline despite treatment with a stable regimen of an ACE inhibitor or angiotensin receptor blocker, either alone or combined with a thiazide diuretic or a calcium channel blocker. They were randomised to placebo or a 0.5-, 1.0-, or 2.0-mg dose of baxdrostat for 8 weeks.

In the prior phase II BrigHTN trial, baxdrostat reduced systolic blood pressure by 11.0 and 8.1 mmHg more than placebo in the two higher-dose groups.

The drug, which is in a new class of highly selective aldosterone synthase inhibitors, did decrease serum aldosterone and increase plasma renin activity as expected compared with placebo in HALO.

A post hoc analysis to understand why the trial failed despite high adherence by pill count showed that 36% of the baxdrostat patients in the highest (2-mg) dose group (20 of 54) were in fact not adherent, based on plasma drug levels below 1% of expected.

ACC session moderator Kim Eagle, MD, of the University of Michigan in Ann Arbor, wondered whether the patients were flushing their pills; Bhatt replied that the non-adherent patients were clustered at a few sites, highlighting the importance of site selection and of providing patient support.

The adherence problem does not explain away the placebo effect, Eagle told MedPage Today. “The placebo effect may well be that by enrolling in a trial, the patient is also taking their other meds for hypertension. Recall that the patients were already supposed to be taking several antihypertensives.”

Nevertheless, he called it compelling that, in “patients who were taking the larger dose and who had evidence of adherence by blood levels, the drug clearly seems to work.”

Source: MedPage Today

Hair Analysis Reveals Double the Number of Adolescent Substance Users

Photo by Brandi Redd on Unsplash

Far more children and adolescents could be using drugs than admit to it in surveys, according to a new US study that used hair analysis to test for actual drug intake. Published in the peer-reviewed American Journal of Drug and Alcohol Abuse, the study of nearly 1400 children aged 9–13 found that, in addition to the 10% who self-reported drug use, a further 9% had used drugs as determined by hair analysis.

The paper suggests that hair analysis is far more accurate for assessing drug use than surveys alone, and the authors recommend that future research combine both methods.

“It’s vital that we understand the factors that lead to drug use in teenagers, so that we can design targeted health initiatives to prevent children from being exposed to drugs at a young age,” says study leader Natasha Wade, an assistant professor of psychology at the University of California, San Diego.

Adolescent substance use is a serious public health issue, with 5% of US 8th graders (ages 13–14) reporting cannabis use in the last year. The figures are even higher for alcohol and nicotine, with 26% of 8th graders admitting to drinking and 23% to using nicotine in the past year. These numbers are worrying, as substance use in adolescence is linked to negative life outcomes – and the true figures may be higher still.

To find out, a multidisciplinary team of experts led by Dr Wade asked 1390 children whether they had taken drugs in the last year. Hair samples were also collected so that laboratory tests could independently confirm whether recent drug-taking had occurred.

Of the children who were asked if they had taken drugs, 10% agreed that they had. Hair analyses also showed that 10% of adolescents overall tested positive for at least one drug, with 6.1% testing positive for cannabinoids, 1.9% alcohol, 1.9% amphetamines, and 1.7% cocaine.

However, the children who self-reported drug-taking were largely not the same as those who tested positive on hair analysis. In fact, of the 136 cases of self-reported substance use and the 145 positive hair samples, only 23 overlapped.

Most importantly, hair drug analysis revealed an additional 9% of substance use cases over and above self-report alone, nearly doubling the number of identified substance users to 19%.
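
For readers wanting to see how these figures fit together, the short Python sketch below simply reproduces the arithmetic from the counts reported above (1390 children, 136 self-reported cases, 145 positive hair samples, 23 overlapping cases); the variable names are illustrative only, and percentages are rounded as in the paper.

    # Overlap arithmetic behind the "additional 9%" and "19%" figures,
    # using the counts reported in the article (illustrative sketch).
    total_children = 1390
    self_report = 136      # cases identified by self-report
    hair_positive = 145    # cases identified by hair analysis
    overlap = 23           # cases flagged by both methods

    # Cases that hair analysis adds over self-report alone
    additional = hair_positive - overlap                 # 122
    additional_pct = 100 * additional / total_children   # ~8.8%, reported as 9%

    # Union of the two methods, counting each child once
    combined = self_report + hair_positive - overlap     # 258
    combined_pct = 100 * combined / total_children       # ~18.6%, reported as 19%

    print(f"Additional cases: {additional} ({additional_pct:.1f}%)")
    print(f"Combined cases: {combined} ({combined_pct:.1f}%)")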

“A long-standing issue in substance use research, particularly that relating to children and adolescents, is a reliance on self-reporting despite the known limitations to the methodology. When asked, children may mis-report (unintentionally or intentionally) and say they take drugs when they don’t, or conversely deny taking drugs when they actually do,” Dr Wade adds.

“But rather than scrapping self-reporting of drug use altogether, a more accurate picture of teenage substance use can be gained by measuring both.

“Self-reporting has its own strengths, for instance young people may be more willing to disclose substance use at a low level, but are less likely to when frequent drug-taking patterns emerge.

“Conversely, hair assays are not sensitive enough to detect only one standard drink of alcohol or smoking one cannabis joint. Instead, the method is better at detecting frequent and moderate to heavy drug use.

“Combining both methodologies is therefore vital to accurately determine the levels of substance use in the teenage population.”

The authors note, however, that some, perhaps even many, of these youths may not have been aware that they had used a substance: it could have been given to them by a parent or peer, or they may simply have forgotten using it.

Source: Taylor & Francis Group

Cytotoxic T Cells Become ‘Marathon Runners’ to Wage Long Immune Battles

Shown here is a pseudo-colored scanning electron micrograph of an oral squamous cancer cell (white) being attacked by two cytotoxic T cells (red), part of a natural immune response. Photo by National Cancer Institute on Unsplash

When it comes to chronic infections and cancer, cytotoxic T cells play a central role in our defences. Research published in the journal Immunity has revealed that these cells can specialise into “sprinters” to fight a strong, short-term infection or into “marathon runners” for the long battle against chronic infections and cancer.

Professor Daniel Pinschewer of the Department of Biomedicine at the University of Basel led a study into how cytotoxic T cells adapt to infection and cancer.

“These T cells can become specialised in two different ways: either as a kind of sprinter or as marathon runners,” explains Pinschewer. “However, the latter can also convert into sprinters at any time, in order to stamp out an infection.”

Chronic infections are a special case: the T cells are activated and a strong inflammatory response occurs at the same time. “This tends to ‘shock’ the T cells into developing into sprinters, which can only intervene effectively in the short term to remove infected cells,” says the virologist. “If all T cells behaved like that, our immune defences would break down pretty soon.”

Biological messenger counteracts the “shock”

The researchers examined how, in spite of this, the immune system is still able to provide enough T cells for the endurance race against chronic infections. According to their results, a biological messenger called interleukin-33 (IL-33) plays a key role. It allows the T cells to remain in their “marathon runner” state. “IL-33 takes away the shock of the inflammation, so to speak,” explains Dr Anna-Friederike Marx, lead author of the study.

In addition, the biological messenger causes the marathon T cells to proliferate, so that more endurance runners are available to combat the infection. “Thanks to IL-33, there are enough cytotoxic T cells around for the long haul that can still pull off a final sprint after their marathon,” says Marx.

The findings could help improve the treatment of chronic infections such as hepatitis C; it is conceivable that IL-33 could be administered to support an effective immune response. Along the same lines, IL-33 could be one key to improving cancer immunotherapy, enabling T cells to wage an efficient and long-lasting offensive against tumour cells.

Source: University of Basel