Author: ModernMedia

Study Shows Effectiveness of Method to Stem Myopia

Photo by Ksenia Chernaya

Capping ten years of work to stem the tide of myopia, David Berntsen, Professor of Optometry at the University of Houston, is reporting that his team’s method to slow myopia not only works – but lasts.

The original Bifocal Lenses In Nearsighted Kids (BLINK) Study showed that having children with myopia wear high-add power multifocal contact lenses slows its progression. Now, new results from the BLINK2 Study, which continued following these children, show that the benefits persist even after the lenses are no longer used.

“We found that one year after discontinuing treatment with high-add power soft multifocal contact lenses in older teenagers, myopia progression returns to normal with no loss of treatment benefit,” reports Berntsen in JAMA Ophthalmology.

The study was funded by the National Institutes of Health’s National Eye Institute with collaborators from the Ohio State University College of Optometry.

In Focus: A Major Issue

Leading the team at the University of Houston, Berntsen takes on a significant challenge: by 2050, almost 50% of the world’s population (5 billion people) will be myopic. Myopia is associated with an increased risk of long-term eye health problems that affect vision and can even lead to blindness.

From the initial study, high-add multifocal contact lenses were found to be effective at slowing the rate of eye growth, decreasing how myopic children became. Because higher amounts of myopia are associated with vision-threatening eye diseases later in life, like retinal detachment and glaucoma, controlling its progression during childhood potentially offers an additional future benefit.

“There has been concern that the eye might grow faster than normal when myopia control contact lenses are discontinued. Our findings show that when older teenagers stop wearing these myopia control lenses, the eye returns to the age-expected rate of growth,” said Berntsen.

“These follow-on results from the BLINK2 Study show that myopia control contact lenses provide a durable treatment benefit even after they are discontinued at an older age,” said BLINK2 study chair, Jeffrey J. Walline, associate dean for research at the Ohio State University College of Optometry.

Eye Science

Myopia occurs when a child’s developing eyes grow too long from front to back. Instead of focusing images directly on the retina, they are focused at a point in front of the retina.

Single vision prescription glasses and contact lenses can correct myopic vision, but they fail to treat the underlying problem, which is the eye continuing to grow longer than normal. By contrast, soft multifocal contact lenses correct myopic vision in children while simultaneously slowing myopia progression by slowing eye growth.

Designed like a bullseye, multifocal contact lenses focus light in two basic ways. The centre portion of the lens corrects nearsightedness so that distance vision is clear, and it focuses light directly on the retina. The outer portion of the lens adds focusing power to bring the peripheral light into focus in front of the retina. Animal studies show that bringing light to focus in front of the retina may slow growth. The higher the reading power, the further in front of the retina it focuses peripheral light.
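The relationship between add power and where peripheral light comes to focus can be sketched with simple thin-lens arithmetic. This is a simplified reduced-eye model of roughly 60 diopters, which is our own assumption for illustration; the study itself reports no optical formula:

```python
def focal_distance_m(power_diopters: float) -> float:
    """Distance behind a thin lens at which parallel light focuses (metres)."""
    return 1.0 / power_diopters

def focus_shift_mm(eye_power_d: float, add_power_d: float) -> float:
    """How far in front of the retina the added power brings the focus (mm),
    assuming the unmodified eye focuses exactly on the retina."""
    return 1000.0 * (focal_distance_m(eye_power_d)
                     - focal_distance_m(eye_power_d + add_power_d))

# Compare the two BLINK add powers for a ~60 D reduced eye:
shift_medium = focus_shift_mm(60.0, 1.50)  # ~0.41 mm in front of the retina
shift_high = focus_shift_mm(60.0, 2.50)    # ~0.67 mm in front of the retina
```

Consistent with the text, the higher add power shifts the peripheral focal point further in front of the retina.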

BLINK Once…Then Twice

In the original BLINK study, 294 myopic children, ages 7 to 11 years, were randomly assigned to wear single vision contact lenses or multifocal lenses with either high-add power (+2.50 diopters) or medium-add power (+1.50 diopters). They wore the lenses during the day as often as they could comfortably do so for three years. All participants were seen at clinics at the Ohio State University, Columbus, or at the University of Houston.

After three years in the original BLINK study, children in the high-add multifocal contact lens group had shorter eyes compared to the medium-add power and single-vision groups, and they also had the slowest rate of myopia progression and eye growth.

Of the original BLINK participants, 248 continued in BLINK2, during which all participants wore high-add (+2.50 diopters) lenses for two years, followed by single-vision contact lenses for the third year of the study to see if the benefit remained after discontinuing treatment.

At the end of BLINK2, axial eye growth returned to age-expected rates. While there was a small increase in eye growth of 0.03 mm/year across all age groups after discontinuing multifocal lenses, it is important to note that the overall rate of eye growth was no different than the age-expected rate. There was no evidence of faster than normal eye growth.

Participants who had been in the original BLINK high-add multifocal treatment group continued to have shorter eyes and less myopia at the end of BLINK2. Children who were switched to high-add multifocal contact lenses for the first time during BLINK2 did not catch up to those who had worn high-add lenses since the start of the BLINK Study when they were 7 to 11 years of age.

By contrast, studies of other myopia treatments, such as atropine drops and orthokeratology lenses that temporarily reshape the cornea, the eye’s outermost layer, showed a rebound effect (faster than age-normal eye growth) after treatment was discontinued.

“Our findings suggest that it’s a reasonable strategy to fit children with multifocal contact lenses for myopia control at a younger age and continue treatment until the late teenage years when myopia progression has slowed,” said Berntsen.

Source: University of Houston

Brains of People with Sickle Cell Disease Appear Older

Sickle cell disease. Credit: National Institutes of Health

Individuals with sickle cell disease are at a higher risk for stroke and resulting cognitive disability. But even in the absence of stroke, many such patients struggle with remembering, focusing, learning and problem solving, among other cognitive problems, and many face challenges in school and in the workplace.

Now a multidisciplinary team of researchers and physicians at Washington University School of Medicine in St. Louis has published a study that helps explain how the illness might affect cognitive performance in sickle cell patients without a history of stroke. The study, appearing in JAMA Network Open, found that such participants had brains that appeared older than expected for their age. The team also found that individuals experiencing economic deprivation, who struggle to meet basic needs, had older-appearing brains even in the absence of sickle cell disease.

“Our study explains how a chronic illness and low socioeconomic status can cause cognitive problems,” said Andria Ford, MD, a professor of neurology and chief of the section of stroke and cerebrovascular diseases at WashU Medicine and corresponding author on the study. “We found that such factors could impact brain development and/or aging, which ultimately affects the mental processes involved in thinking, remembering and problem solving, among others. Understanding the influence that sickle cell disease and economic deprivation have on brain structure may lead to treatments and preventive measures that potentially could preserve cognitive function.”

More than 200 young, Black adults with and without sickle cell disease, living in St. Louis and the surrounding region in eastern Missouri and southwestern Illinois, participated in brain MRI scans and cognitive tests. The researchers – including Yasheng Chen, DSc, an associate professor of neurology at WashU Medicine and senior author on the study – calculated each person’s brain age using a brain-age prediction tool that was developed using MRI brain scans from a diverse group of more than 14 000 healthy people of known ages. The estimated brain age was compared with the individual’s actual age.

The researchers found that participants with sickle cell disease had brains that appeared an average of 14 years older than their actual age. Sickle cell participants with older-looking brains also scored lower on cognitive tests.

The study also found that socioeconomic status correlates with brain age. On average, a seven-year gap was found between the brain age and the participants’ actual age in healthy individuals experiencing poverty. The more severe the economic deprivation, the older the brains of such study subjects appeared.
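The brain-age comparison described above reduces to simple arithmetic: predicted brain age minus actual age. A minimal sketch follows; the cohort values are entirely hypothetical, chosen only to echo the reported 14-year and seven-year average gaps, and the real prediction comes from an MRI-based model, not from numbers supplied by hand:

```python
def brain_age_gap(predicted_age_years: float, actual_age_years: float) -> float:
    """Positive values mean the brain appears older than the person actually is."""
    return predicted_age_years - actual_age_years

def mean_gap(pairs) -> float:
    """Average brain-age gap over (predicted, actual) age pairs."""
    return sum(brain_age_gap(p, a) for p, a in pairs) / len(pairs)

# Hypothetical (predicted, actual) ages echoing the reported averages:
sickle_cell_cohort = [(44.0, 30.0), (39.0, 25.0)]   # mean gap: 14 years
deprivation_cohort = [(35.0, 28.0), (41.0, 34.0)]   # mean gap: 7 years
```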

Healthy brains shrink as people age, while premature shrinking is characteristic of neurological illnesses such as Alzheimer’s disease. But a smaller brain that appears older can also result from stunted growth early in life. Sickle cell disease is congenital, chronically depriving the developing brain of oxygen and possibly affecting its growth from birth. Also, children exposed to long-term economic deprivation and poverty experience cognitive challenges that affect their academic performance, Ford explained.

As a part of the same study, the researchers are again performing cognitive tests and scanning the brains of the same healthy and sickle cell participants three years after their first scan to investigate whether the older-looking brains aged prematurely or whether their development was stunted.

“A single brain scan helps measure the participants’ brain age only in that moment,” said Ford, who treats patients at Barnes-Jewish Hospital. “But multiple time points can help us understand if the brain is stable, initially capturing differences that were present since childhood, or prematurely aging and able to predict the trajectory of someone’s cognitive decline. Identifying who is at greatest risk for future cognitive disability with a single MRI scan can be a powerful tool for helping patients with neurological conditions.”

Source: WashU Medicine

Inflammation may Explain the Prevalence of IBD in Psoriasis Sufferers

Inflammatory bowel disease. Credit: Scientific Animations CC4.0

People with psoriasis often have invisible inflammation in the small intestine with an increased propensity for ‘leaky gut’, according to new research at Uppsala University. These changes in the gut could explain why psoriasis sufferers often have gastrointestinal problems and are more prone to developing Crohn’s disease. The study is published in Biochimica et Biophysica Acta (BBA) – Molecular Basis of Disease.

Psoriasis is a hereditary, chronic skin condition that can also result in inflammation of the joints. Chronic inflammatory bowel diseases (IBD), especially Crohn’s disease, are more common in patients with psoriasis than in the rest of the population.

“Previous research has also shown that people with psoriasis have more gastrointestinal problems than the general population. However, we didn’t know much about why this is the case. With our study, we can now show that people with psoriasis often have invisible inflammation in their small intestines, with an increased risk of what’s called leaky gut,” says Maria Lampinen, researcher at Uppsala University.

Pro-inflammatory activity in the gut

The study involved 18 patients with psoriasis and 15 healthy controls as subjects. None of the participants had been diagnosed with gastrointestinal diseases. Samples were taken from both their small and large bowel. The researchers then studied different types of immune cells in the mucous membrane.

“It turned out that psoriasis sufferers had higher numbers of certain types of immune cells in their small intestine, and the cells showed signs of pro-inflammatory activity. Interestingly, we found the same type of immune cells in skin flare-ups from psoriasis patients, suggesting that the inflammation of the skin may have an impact on the gut, or vice versa.”

Increased propensity for leaky gut

Normally, the intestinal mucosa acts as a protective barrier that also allows nutrients and water to pass through it. In some autoimmune diseases, the intestinal barrier may function poorly. This is called having a leaky gut: bacteria and harmful substances leak through the intestinal barrier and cause inflammation, which can become more widespread when these substances spread via the bloodstream.

Half of the psoriasis patients in the study had increased intestinal barrier permeability or leaky gut. These same patients also reported more gastrointestinal symptoms such as abdominal pain and bloating than patients with a normal intestinal barrier. They also had elevated levels of inflammatory substances in their intestines.

“Given that the psoriasis patients in our study had relatively mild skin disease and showed no visible intestinal inflammation in a gastroscopy, they had surprisingly clear changes in their small intestine compared to healthy controls. These changes could explain why psoriasis sufferers often have gastrointestinal problems, and an increased risk of developing Crohn’s disease.”

Source: Uppsala University

Intermuscular Fat Raises the Risk of Heart Attack or Failure

Photo by I Yunmai on Unsplash

People with intermuscular fat are at a higher risk of dying or being hospitalised from a heart attack or heart failure, regardless of their body mass index, according to research published in the European Heart Journal.

Intermuscular fat is highly prized in beef steaks, but little is known about it in humans or about its impact on health. This is the first study to comprehensively investigate the effects of fatty muscles on heart disease.

The new finding adds evidence that existing measures, such as body mass index or waist circumference, are not adequate to evaluate the risk of heart disease accurately for all people.

The new study was led by Professor Viviany Taqueti, Director of the Cardiac Stress Laboratory at Brigham and Women’s Hospital and Faculty at Harvard Medical School, Boston, USA. She said: “Obesity is now one of the biggest global threats to cardiovascular health, yet body mass index – our main metric for defining obesity and thresholds for intervention – remains a controversial and flawed marker of cardiovascular prognosis. This is especially true in women, where high body mass index may reflect more ‘benign’ types of fat.

“Intermuscular fat can be found in most muscles in the body, but the amount of fat can vary widely between different people. In our research, we analyse muscle and different types of fat to understand how body composition can influence the small blood vessels or ‘microcirculation’ of the heart, as well as future risk of heart failure, heart attack and death.”

The new research included 669 people who were being evaluated at the Brigham and Women’s Hospital for chest pain and/or shortness of breath and found to have no evidence of obstructive coronary artery disease (where the arteries that supply the heart are becoming dangerously clogged). These patients had an average age of 63. The majority (70%) were female and almost half (46%) were non-white.

All the patients were tested with cardiac positron emission tomography/computed tomography (PET/CT) scanning to assess how well their hearts were functioning. Researchers also used CT scans to analyse each patient’s body composition, measuring the amounts and location of fat and muscle in a section of their torso.

To quantify the amount of fat stored within muscles, researchers calculated the ratio of intermuscular fat to total muscle plus fat, a measurement they called the fatty muscle fraction.

Patients were followed up for around six years and researchers recorded whether any patients died or were hospitalised for a heart attack or heart failure.

Researchers found that people with higher amounts of fat stored in their muscles were more likely to have damage to the tiny blood vessels that serve the heart (coronary microvascular dysfunction or CMD), and they were more likely to go on to die or be hospitalised for heart disease. For every 1% increase in fatty muscle fraction, there was a 2% increase in the risk of CMD and a 7% increased risk of future serious heart disease, regardless of other known risk factors and body mass index.
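The fatty muscle fraction and the reported per-point risk increments can be sketched in code. The cross-sectional areas below are hypothetical examples, and treating the per-point increases as compounding multiplicatively is our reading of a hazard-ratio-style result, not something stated in the article:

```python
def fatty_muscle_fraction(intermuscular_fat_cm2: float, muscle_cm2: float) -> float:
    """Ratio of intermuscular fat to total muscle plus fat, as a percentage."""
    return 100.0 * intermuscular_fat_cm2 / (muscle_cm2 + intermuscular_fat_cm2)

def relative_risk_multiplier(fmf_increase_pct_points: float, pct_per_point: float) -> float:
    """Relative risk for a given rise in fatty muscle fraction, assuming the
    reported per-point increase (2% for CMD, 7% for serious heart disease)
    compounds per percentage point."""
    return (1.0 + pct_per_point / 100.0) ** fmf_increase_pct_points

# Hypothetical patient: 12 cm^2 of intermuscular fat within 108 cm^2 of muscle
fmf = fatty_muscle_fraction(intermuscular_fat_cm2=12.0, muscle_cm2=108.0)  # 10.0%

# Risk multipliers for a 5-percentage-point higher fatty muscle fraction:
cmd_risk = relative_risk_multiplier(5.0, 2.0)      # ~1.10, i.e. ~10% higher CMD risk
heart_risk = relative_risk_multiplier(5.0, 7.0)    # ~1.40, i.e. ~40% higher risk
```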

People who had high levels of intermuscular fat and evidence of CMD were at an especially high risk of death, heart attack and heart failure. In contrast, people with higher amounts of lean muscle had a lower risk. Fat stored under the skin (subcutaneous fat) did not increase the risk.

Professor Taqueti said: “Compared to subcutaneous fat, fat stored in muscles may be contributing to inflammation and altered glucose metabolism leading to insulin resistance and metabolic syndrome. In turn, these chronic insults can cause damage to blood vessels, including those that supply the heart, and the heart muscle itself.

“Knowing that intermuscular fat raises the risk of heart disease gives us another way to identify people who are at high risk, regardless of their body mass index. These findings could be particularly important for understanding the heart health effects of fat and muscle-modifying incretin-based therapies, including the new class of glucagon-like peptide-1 receptor agonists.

“What we don’t know yet is how we can lower the risk for people with fatty muscles. For example, we don’t know how treatments such as new weight-loss therapies affect fat in the muscles relative to fat elsewhere in the body, lean tissue, and ultimately the heart.”

Professor Taqueti and her team are assessing the impact of treatment strategies, including exercise, nutrition, weight-loss drugs and surgery, on body composition and metabolic heart disease.

In an accompanying editorial, Dr Ranil de Silva from Imperial College London and colleagues said: “Obesity is a public health priority. Epidemiologic studies clearly show that obesity is associated with increased cardiovascular risk, though this relationship is complex.

“In this issue of the Journal, Souza and colleagues hypothesise that skeletal muscle quantity and quality associate with CMD and modify its effect on development of future adverse cardiovascular events independent of body mass index (BMI).

“In this patient population who were predominantly female and had a high rate of obesity, the main findings were that increasing levels of intermuscular adipose tissue (IMAT) were associated with a greater occurrence of CMD, and that the presence of both elevated IMAT and CMD was associated with the highest rate of future adverse cardiovascular events, with this effect being independent of BMI.

“The interesting results provided by Souza et al are hypothesis generating and should be interpreted in the context of several limitations. This is a retrospective observational study. Whilst a number of potential mechanisms are suggested to explain the relationship between elevated IMAT and impaired coronary flow reserve, these were not directly evaluated. In particular, no details of circulating inflammatory biomarkers, insulin resistance, endothelial function, diet, skeletal muscle physiology, or exercise performance were given.

“The data presented by Souza et al. are intriguing and importantly further highlight patients with CMD as a population of patients at increased clinical risk. Their work should stimulate further investigation into establishing the added value of markers of adiposity to conventional and emerging cardiac risk stratification in order to identify those patients who may benefit prognostically from targeted cardiometabolic interventions.”

Source: European Society of Cardiology

New Potential Treatment for Inherited Blinding Disease Retinitis Pigmentosa

Researchers used a computer screening approach to identify two compounds that could help prevent vision loss in people with a genetic eye disease

Photoreceptor cells in the retina. Credit: Scientific Animations

Two new compounds may be able to treat retinitis pigmentosa, a group of inherited eye diseases that cause blindness. The compounds, described in a study published January 14th in the open-access journal PLOS Biology by Beata Jastrzebska from Case Western Reserve University, US, and colleagues, were identified using a virtual screening approach.

In retinitis pigmentosa, the retinal protein rhodopsin is often misfolded due to genetic mutations, causing retinal cells to die off and leading to progressive blindness. Small molecules that correct rhodopsin folding are urgently needed to treat the estimated 100 000 people in the United States with the disease. Current experimental treatments include retinoid compounds, such as synthetic vitamin A derivatives, but these have several drawbacks: they are sensitive to light and can be toxic.

In the new study, researchers utilised virtual screening to search for new drug-like molecules that bind to and stabilise the structure of rhodopsin to improve its folding and movement through the cell. Two non-retinoid compounds were identified which met these criteria and had the ability to cross the blood-brain and blood-retina barriers. The team tested the compounds in the lab and showed that they improved cell surface expression of rhodopsin in 36 of 123 genetic subtypes of retinitis pigmentosa, including the most common one. Additionally, they protected against retinal degeneration in mice with retinitis pigmentosa.

“Importantly, treatment with either compound improved the overall retina health and function in these mice by prolonging the survival of their photoreceptors,” the authors say. However, they note that additional studies of the compounds or related compounds are needed before testing the treatments in humans.

The authors add, “Inherited mutations in the rhodopsin gene cause retinitis pigmentosa (RP), a progressive and currently untreatable blinding disease. This study identifies small molecule pharmacochaperones that suppress the pathogenic effects of various rhodopsin mutants in vitro and slow photoreceptor cell death in a mouse model of RP, offering a potential new therapeutic approach to prevent vision loss.”

Provided by PLOS

Developing a Grapefruit that Won’t Interfere with Medication Levels

Photo by Olga Petnyunene on Unsplash

Grapefruit and pummelo contain compounds called furanocoumarins that can affect the blood levels of more than 100 prescription drugs, which is why people taking these medications are advised to remove these fruits from their diets. Research published in New Phytologist reveals genetic information about the synthesis of furanocoumarins in different citrus plant tissues and species and provides new insights that could be used to develop grapefruit and pummelo that lack furanocoumarins.

The research indicates that the production of furanocoumarins in citrus fruit is dependent on the integrity of a single gene within a multi-gene cluster that encodes enzymes of the 2-oxoglutarate-dependent dioxygenase family.

“This research helps us to understand why fruit of certain citrus species produce furanocoumarins and demonstrates how breeders and researchers could develop furanocoumarin-free citrus varieties,” said co–corresponding author Yoram Eyal, PhD, of the Volcani Center, in Israel.

Source: Wiley

Decoding How HIV Hijacks our Cellular Machinery

Colourised transmission electron micrograph of an HIV-1 virus particle (yellow/gold) budding from the plasma membrane of an infected H9 T cell (purple/green).

A team of scientists at the Helmholtz Institute for RNA-based Infection Research (HIRI) in Würzburg and the University of Regensburg has unveiled insights into how HIV-1 skilfully hijacks cellular machinery for its own survival. By dissecting the molecular interplay between the virus and its host, the researchers identified novel strategies that HIV-1 employs to ensure its replication while suppressing the host’s cellular defences. The study was published in the journal Nature Structural and Molecular Biology.

HIV-1, like other viruses, lacks the machinery to produce its own proteins and must rely on the host cell to translate its genetic instructions. After entering host cells, it seizes control of the translation process, which converts messenger ribonucleic acid (mRNA) into proteins. “In this study, we combined ribosome profiling, RNA sequencing and RNA structural probing to map the viral and host translational landscape and pausing during replication of the virus in unprecedented detail,” says corresponding author Neva Caliskan.

Cheat Codes of Viral Translation

One of the key findings was the discovery of previously unrecognised elements in HIV-1 RNA called upstream open reading frames (uORFs) and internal open reading frames (iORFs). These “hidden gene fragments” may play a crucial role in fine-tuning the production of viral proteins as well as the interaction with the host immune system. “For instance, uORFs and iORFs can act as regulators, ensuring precise timing and levels of protein synthesis”, explains Anuja Kibe, a postdoctoral researcher at the HIRI and first author of the study.

Another important discovery was an intricate RNA structure near the critical “frameshift site” in the viral genome. This frameshift site is essential for the virus to produce the correct proportions of two key proteins, Gag and Gag-Pol, which are necessary for assembling infectious particles and replication of HIV-1. The researchers demonstrated that this extended RNA fold not only promotes ribosome collisions upstream of the site (a mechanism that appears to regulate translation) but also maintains the frameshifting efficiency. “Our team also showed that targeting this RNA structure with antisense molecules could significantly reduce frameshift efficiency by nearly 40 percent, offering a promising new avenue for antiviral drug development”, reports Caliskan. 

A Game of Priorities

Redmond Smyth, a former Helmholtz Young Investigator Group Leader at the HIRI and currently a group leader at the Centre National de Recherche Scientifique (CNRS) in Strasbourg, France, mentions, “Interestingly, our analysis revealed that, while HIV-1 mRNAs are translated efficiently throughout infection, the virus suppresses the protein production of the host, particularly at the translation initiation stage.” This allows HIV-1 to prioritise its own needs while effectively stalling the host defence mechanisms. Thus, the virus can manipulate the host cell machinery in ways that remain robust even under stress conditions.

More Than Traffic Jams

The researchers also observed that ribosomes collide at specific regions of the viral RNA, particularly upstream of the frameshift site. “These collisions are not accidental but are instead tightly regulated pauses that may influence how ribosomes interact with downstream RNA structures,” says Florian Erhard, study co-author and Chair of Computational Immunology at the University of Regensburg.

Overall, the study provides not only a detailed map of the translational landscape of HIV-1 infected cells but also a wealth of potential targets for therapeutic intervention. The identification of RNA structures and genetic elements critical for viral replication highlights new opportunities for the development of drugs aimed at disrupting these processes. “By understanding how the virus cleverly manipulates our cells, these discoveries will bring us closer to innovative treatments that could one day turn the tables and outsmart the virus itself,” Caliskan adds.

Source: Helmholtz Centre for Infection Research

Brain Changes in Huntington’s Disease Seen Decades ahead of Symptoms

Photo by Robina Weermeijer on Unsplash

Subtle changes in the brain, detectable through advanced imaging, blood and spinal fluid analysis, happen approximately twenty years before a clinical motor diagnosis in people with Huntington’s disease, finds a new study led by UCL researchers which appears in Nature Medicine.

The team found that although functions such as movement, thinking or behaviour remained normal for a long time before the onset of symptoms in Huntington’s disease, subtle changes to the brain were taking place up to two decades earlier. These findings pave the way for future preventative clinical trials and offer hope for earlier interventions that could preserve brain function and improve outcomes for individuals at risk of Huntington’s disease.

Huntington’s disease is a devastating neurodegenerative condition affecting movement, thinking and behaviour. It is a genetic disease and people with an affected parent have a 50% chance of inheriting the Huntington’s disease mutation, meaning they will develop disease symptoms – typically in mid-adulthood.

The disease is caused by repetitive expansions of three DNA blocks (C, A and G) in the huntingtin gene. This sequence tends to continually expand in certain cells over a person’s life, in a process known as somatic CAG expansion. This ongoing expansion accelerates neurodegeneration, making brain cells more vulnerable over time.

For the new study, the researchers studied 57 people with the Huntington’s disease gene expansion, who were calculated as being on average 23.2 years from a predicted clinical motor diagnosis.  

They were examined at two time points over approximately five years to see how their bodies and brains changed over time. Their results were compared to 46 control participants, matched closely for age, sex and educational level.

As part of the study, all participants volunteered to undergo comprehensive assessments of their thinking, movement and behaviour, alongside brain scans and blood and spinal fluid sampling.

Importantly, the group with Huntington’s disease gene expansion showed no decline in any clinical function (thinking, movement or behaviour) during the study period, compared to the closely matched control group.

However, compared to the control group, subtle changes were detected in brain scans and spinal fluid biomarkers of those with Huntington’s disease gene expansion. This indicates that the neurodegenerative process begins long before symptoms are evident and before a clinical motor diagnosis.

Specifically, the researchers identified elevated levels of neurofilament light chain (NfL), a protein released into the spinal fluid when neurons are injured, and reduced levels of proenkephalin (PENK), a neuropeptide marker of healthy neuron state that could reflect changes in the brain’s response to neurodegeneration.

Lead author, Professor Sarah Tabrizi (UCL Huntington’s Disease Research Centre, UCL Queen Square Institute of Neurology, and UK Dementia Research Institute at UCL), said: “Our study underpins the importance of somatic CAG repeat expansion driving the earliest neuropathological changes of the disease in living humans with the Huntington’s disease gene expansion. I want to thank the participants in our young adult study as their dedication and commitment over the last five years mean we hope that clinical trials aimed at preventing Huntington’s disease will become a reality in the next few years.”

The findings suggest that there is a treatment window, potentially decades before symptoms are present, where those at risk of developing Huntington’s disease are functioning normally despite having detectable measures of subtle, early neurodegeneration. Identifying these early markers of disease is essential for future clinical trials in order to determine whether a treatment is having any effect.

Co-first author of the study, Dr Rachael Scahill (UCL Huntington’s Disease Research Centre and UCL Queen Square Institute of Neurology) said: “This unique cohort of individuals with the Huntington’s disease gene expansion and control participants provides us with unprecedented insights into the very earliest disease processes prior to the appearance of clinical symptoms, which has implications not only for Huntington’s disease but for other neurodegenerative conditions such as Alzheimer’s disease.”

This study is the first to establish a direct link between somatic CAG repeat expansion, measured in blood, and early brain changes in humans, decades before clinical motor diagnosis in Huntington’s disease.

While somatic CAG expansion was already known to accelerate neurodegeneration, this research demonstrates how it actively drives the earliest detectable changes in the brain: specifically in the caudate and putamen, regions critical to movement and thinking.

By showing that somatic CAG repeat expansion measured in blood predicts brain volume changes and other markers of neurodegeneration, the findings provide crucial evidence to support the hypothesis that somatic CAG expansion is a key driver of neurodegeneration.

With treatments aimed at suppressing somatic CAG repeat expansion currently in development, this work validates this mechanistic process as a promising therapeutic target and represents a critical advance towards future prevention trials in Huntington’s disease.

Co-first author of the study, Dr Mena Farag (UCL Huntington’s Disease Research Centre and UCL Queen Square Institute of Neurology) added: “These findings are particularly timely as the Huntington’s disease therapeutic landscape expands and progresses toward preventive clinical trials.”

The research was done in collaboration with experts at the Universities of Glasgow, Gothenburg, Iowa, and Cambridge.

Source: University College London

Windows in ICU Rooms Increase the Risk of Post-surgical Delirium

Photo by Rodnae Productions on Pexels

Delirium is a common condition in the post-surgical intensive care unit (ICU) setting, affecting 50–70% of those admitted, depending on individual risk profiles. ICU delirium can be associated with a multitude of factors, including underlying and acute medical conditions, pharmacologic agents, and treatments such as surgery. There is currently no definitive consensus on drug interventions to prevent or treat delirium.

While there has been some evidence that the ICU environment plays a role in delirium, more research is needed to understand this association. In a new study appearing in Critical Care Medicine, researchers found that windowed patient rooms were associated with increased odds of developing delirium compared to patient rooms without windows.

Using electronic medical records, researchers from Mass General Brigham and collaborators at Boston University Chobanian & Avedisian School of Medicine reviewed the association between patients being admitted to an ICU room with or without windows and the presence of delirium. Delirium was observed in 21% (460/2235) of patients in windowed rooms and 16% (206/1292) of patients in non-windowed rooms.

“While the findings of the study were ultimately unexpected due to prior research suggesting the importance of circadian rhythm while in the hospital, our results contribute to a growing body of evidence-based design literature around the importance of healthcare design to patient experience and outcomes,” explained corresponding author Diana Anderson, MD, FACHA, assistant professor of neurology at the school. She notes that because of the study design, these unexpected findings are not causative and may represent different patterns in which some patients – who are potentially at an increased risk of delirium – may be assigned to different room layouts by the clinical teams.

According to the authors, further research into the specific qualities of windows that may impact health is needed to better understand these results. “Although this study adds to our understanding of the relationship between delirium and characteristics of the built environment, it is clear that additional studies may provide further insight to understand these results. For example, it is possible that the window view toward adjacent landscapes or buildings may be important context to interpret these findings, or perhaps another feature of the room such as light or sound that we could consider in our next investigation,” Anderson says.

Source: Boston University School of Medicine

Survey Sheds Light on the Phenomenon of Topical Steroid Withdrawal

Source: Pixabay

Painful skin and trouble sleeping are among the problems reported when tapering cortisone cream for atopic eczema, as shown by a study headed by the University of Gothenburg. Many users consider the problems to be caused by cortisone dependence.

Topical steroid withdrawal (TSW) is a phenomenon commonly described as extremely red and painful skin arising when cortisone cream treatment is tapered or stopped.

While TSW is not an established diagnosis, the name indicates that the skin has become dependent on cortisone. Little research has been conducted to identify a dependency mechanism, so scientific support is lacking. At the same time, the term has become commonplace on social media, raising concerns among patients about cortisone cream safety.

Now, a national research group in Sweden, headed by Sahlgrenska Academy at the University of Gothenburg, has conducted the first study in which a larger group has been asked to provide a detailed account of what they consider to be TSW. The results are published in the journal Acta Dermato-Venereologica.

Questionnaire via social media

The study targeted adults with atopic eczema, a group that often uses cortisone cream, who also identified as suffering from TSW. The study was conducted by means of an anonymous questionnaire presented in Swedish in social media forums, with the option to share a link to invite other potential participants. The questionnaire was answered by almost one hundred people aged 18–39, the majority of whom were women.

“We wanted to gain more knowledge about how those who identify as suffering from TSW define the phenomenon and which symptoms they describe,” says Mikael Alsterholm, a researcher at the University of Gothenburg and a senior consultant in dermatology and venereology at Sahlgrenska University Hospital.

The results show variations in how the participants defined TSW. Most commonly, they defined it as a dependence on cortisone, with symptoms arising when its use was tapered or stopped, although many others instead defined TSW as a reaction to cortisone occurring while it was still in use.

It was also common to define TSW on the basis of the symptoms seen in the skin, such as redness and pain. While the symptoms described varied, they were largely similar to the symptoms seen in an exacerbation of atopic eczema.

In addition to the skin becoming red, dry, and blistered – mainly on the face, neck, torso, and arms – the participants also described sleep problems due to itching as well as signs of anxiety and depression.

Healthcare and research involvement

A majority of the participants described concurrent symptoms of both atopic eczema and TSW. Cortisone cream was most often cited as the triggering factor, while some cited cortisone tablets and a few cited cortisone-free treatments.

“It’s important that healthcare professionals and researchers are involved in the discussion on TSW and contribute science-based knowledge where possible. Cortisone cream is an effective and safe treatment for most people, and at present there’s no support for avoiding its use for fear of the types of symptoms described in the context of TSW,” says Mikael Alsterholm.

“At the same time, there’s a patient group with different experiences, expressed as TSW, and their symptoms and the potential causes need to be investigated by means of both research and practical healthcare. To do this, we first need to define TSW. While we understand that this is complicated, we hope that this study can help establish such a definition,” he concludes.

Source: University of Gothenburg