University of Liverpool researchers have worked with global partners to identify and successfully implement an intervention package that has significantly improved the diagnosis and management of brain infections in hospitals across Brazil, India, and Malawi.
The study, published in The Lancet, was coordinated by researchers at the University of Liverpool in collaboration with international partners and implemented across 13 hospitals.
The intervention included:
• A clinical algorithm, which offered a flowchart of guidance for clinicians on how to manage the first crucial hours and days of suspected brain infections, including which tests (blood tests, lumbar puncture, brain scans) and treatments to administer.
• A lumbar puncture pack, providing clinicians with sample containers, equipment, and guidance to ensure proper cerebrospinal fluid collection and testing, addressing challenges like knowing how much fluid to take and which tests to request.
• A panel of laboratory tests to enable correct and timely testing for a wide range of pathogens, addressing gaps in availability and sequencing of tests, with the main goal of identifying the cause of infection.
• Training for clinicians and lab staff to enhance their knowledge and skills in diagnosing and managing brain infections, including proper use of the new intervention tools.
These measures led to significant improvements in diagnosing patients with suspected acute brain infections, such as encephalitis and meningitis. Both conditions cause significant mortality and morbidity, especially in low- and middle-income countries (LMICs), where diagnosis and management are hindered by delayed lumbar punctures, limited testing, and resource constraints. Improved diagnosis and optimal management are a focus for the World Health Organization (WHO) in tackling meningitis and reducing the burden of encephalitis.
As a result of the intervention package, the proportion of patients receiving a syndromic diagnosis (confirming they had a brain infection) increased from 77% to 86%, while the microbiological diagnosis rate (identifying the exact pathogen) rose from 22% to 30%. In addition to improving diagnosis, the intervention enhanced the performance of lumbar punctures, optimised initial treatment, and improved patients’ functional recovery after illness.
Lead author Dr Bhagteshwar Singh, Clinical Research Fellow, Clinical Infection, Microbiology & Immunology said: “Following patients and their cerebrospinal fluid (CSF) samples through the hospital system, we tailored our intervention to address key gaps in care. The results speak for themselves: better diagnosis, better management, and ultimately, better outcomes for patients. Unlike most studies, we embedded improvements into routine care, so the impact continues well beyond the study.”
Corresponding author Professor Tom Solomon, Chair of Neurological Science at the University of Liverpool and Director of The Pandemic Institute, added: “We increased microbiological diagnoses by one-third across very diverse countries, which has profound implications for treatment and public health globally. As we scale this up in more hospitals and feed it into national and international policy, including WHO’s work on defeating meningitis and controlling encephalitis, the potential impact is enormous.”
The intervention was co-designed by clinicians, lab specialists, hospital administrators, researchers, and policymakers in each country, ensuring it was feasible and sustainable. Professor Priscilla Rupali, lead researcher from Christian Medical College, Vellore, India, also commented: “The co-design process ensured that the intervention would work within local healthcare settings and could be sustained beyond the study. We are already incorporating the findings into India’s national Brain Infection Guidelines, ensuring long-term benefits for patient care.”
After the US slashed global aid, the South African government stated that only 17% of its HIV spending relied on US funding. But some experts argue that US health initiatives had more bang for buck than the government’s programmes. Jesse Copelyn looks past the 17% figure, and considers how the health system is being affected by the loss of US money.
In the wake of US funding cuts for global aid, numerous donor-funded health facilities in South Africa have shut down and government clinics have lost thousands of staff members paid for by US-funded organisations. This includes nurses, social workers, clinical associates and HIV counsellors.
Spotlight and GroundUp have obtained documents from a presentation by the National Health Department during a private meeting with PEPFAR in September. The documents show that in 2024, the US funded nearly half of all HIV counsellors working in South Africa’s public primary healthcare system. The data excludes the Northern Cape.
Counsellors test people for HIV and provide information and support to those who test positive. They also follow up with patients who have stopped taking their antiretrovirals (ARVs), so that they can get them back on treatment.
Overall, the US funded 1,931 counsellors across the country, the documents show. Now that many of them have been laid off, researchers say the country will test fewer people, meaning that we’ll miss new HIV infections. It also means we’ll see more treatment interruptions, and thus more deaths.
PEPFAR also funded nearly half of all data capturers, according to the documents. This amounted to 2,669 people. Data capturers play an essential role managing and recording patient files. With many of these staff retrenched, researchers say our ability to monitor the national HIV response has been compromised.
These staff members had all been funded by the US President’s Emergency Plan for AIDS Relief (PEPFAR). The funds were distributed to large South African non-government organisations (NGOs), who then hired and deployed the staff in government clinics where there is a high HIV burden. Some NGOs received PEPFAR funds to operate independent health facilities that served high-risk populations, like sex workers and LGBTQ people.
But in late January, the US paused almost all international aid funding pending a review. PEPFAR funds administered by the US Centers for Disease Control and Prevention (CDC) have since resumed, but those managed by the US Agency for International Development (USAID) have largely been terminated. As a result, many of these staff have lost their jobs.
The national health department has tried to reassure the public that the country’s HIV response is mostly funded by the government, with 17% funded by PEPFAR – currently about R7.5 billion a year. But this statistic glosses over several details and obscures the full impact of the USAID cuts.
Issue 1: Some districts were heavily dependent on US funds
The first issue is that US support isn’t evenly distributed across the country. Instead, PEPFAR funding is targeted at 27 ‘high-burden districts’ – in these areas, the programme almost certainly accounted for much more than 17% of HIV spending. Some of these districts get their PEPFAR funds from the CDC, and have been less affected, but others got them exclusively from USAID. In these areas, the HIV response was heavily dependent on USAID-funded staff, all of whom disappeared overnight.
Johannesburg is one such district. A doctor at a large public hospital in this city told Spotlight and GroundUp that USAID covered a substantial proportion of the doctors, counsellors, clerks, and other administrative personnel in the hospital’s HIV clinic. “All have either had their contracts terminated or are in the process of doing so.”
The hospital’s HIV clinic lost eight counsellors, eight data capturers, a clinical manager, and a medical officer (a non-specialised doctor). He said that this represented half of the clinic’s doctors and counsellors, and about 80% of the data capturers.
This had been particularly devastating because it was so abrupt, he said. An instruction by the US government in late-January required all grantees to stop their work immediately.
“There was no warning about this, had we had time, we could have made contingency plans and things wouldn’t be so bad,” he explained. “But if it happens literally overnight, it’s extremely unfair on the patients and remaining staff. The loss of capacity is significant.”
He said that nurses had started to take on some of the tasks previously performed by counsellors, such as HIV testing. But these services hadn’t recovered fully and things were still “chaotic”.
He added, “It’s not as if the department has any excess capacity, so [when] nurses are diverted to do the testing and counselling, then other parts of care suffer.”
Issue 2: PEPFAR programmes got bang for buck
Secondly, while PEPFAR may only have contributed 17% of the country’s total HIV spend, some researchers believe that it achieved more per dollar than many of the health department’s programmes.
Professor Francois Venter, who runs the Ezintsha research centre at Wits University, argued that PEPFAR programmes were comparatively efficient because they were run by NGOs that needed to compete for US funding.
“PEPFAR is a monster to work for,” said Venter, who has previously worked for PEPFAR-funded groups. “They put targets in front of these organisations and say: ‘if you don’t meet them in the next month, we’ll just give the money to your competitors’ and you’ll be out on the streets … So there’s no messing around.”
US funding agencies, he said, would closely monitor progress to see if organisations were meeting these targets.
“You don’t see that with the rest of the health system, which just bumbles along with no real metrics,” said Venter.
“The health system in South Africa, like most health systems, is not terribly well monitored or well directed. When you look at what you get with every single health dollar spent on the PEPFAR program, it’s incredibly good value for money,” he said.
Not only were the programmes arguably well managed, but PEPFAR funds were also strategically targeted. Public health specialist Lynne Wilkinson provided the example of the differentiated service delivery programme. This is run by the health department, but supported by PEPFAR in one key way.
Wilkinson explained that once patients are clinically stable and virally suppressed, they don’t need to pick up their ARVs from a health facility each month, as this is too time-consuming both for them and the facility. As a result, the health department created a system of “differentiated service delivery”, in which patients instead pick up their medication from external sites (like pharmacies) without going through a clinical evaluation each time. But Wilkinson noted that before someone can be enrolled in that service delivery model, clinicians need to check that patients are eligible.
“Because [the enrollment process] was going very slowly … this was supplemented by PEPFAR-funded clinicians who would go into a clinic and review a lot of clients, and get them into that system”. By doing this, PEPFAR-funded staff successfully resolved a major bottleneck in the system, she said, reducing the number of people in clinics, and thus cutting down on waiting times.
Not everyone is as confident about the overall PEPFAR model. The former deputy director of the national health department, Dr Yogan Pillay, told Spotlight and GroundUp that we don’t have data on how efficient PEPFAR programmes are at the national level. This needs to be investigated before the health department spends its limited resources on trying to revive or replicate the programmes, argued Pillay who is now the director for HIV and TB delivery at the Gates Foundation.
While he said that many PEPFAR-funded initiatives were providing crucial services, Pillay also argued that “the management structure of the [recipient] NGOs is too top-heavy and too expensive” for the government to fund. Ultimately, we need to consider and evaluate a variety of HIV delivery models instead of rushing to replicate the PEPFAR ones, he said.
Issue 3: PEPFAR supported groups that the government doesn’t reach
An additional issue obscured by the 17% figure is that PEPFAR specifically targeted groups of people that are most likely to contract and transmit HIV, like people who inject drugs, sex workers, and the LGBTQ community. These groups, called key populations, require specialised services that the government struggles to provide.
Historically, PEPFAR has given NGOs money so that they could help key populations from drop-in centres and mobile clinics, or via outreach services. All of this operated outside of government clinics, because key populations often face stigma in these settings and are thus unwilling to go there.
For instance, while about 90% of surveyed sex workers say that staff at key populations centres are always friendly and professional, only a quarter feel the same way about staff at government clinics. This is according to a 2024 report, which also found that many key populations are mistreated and discriminated against at public health facilities. (Ironically, health system monitoring organisation Ritshidze, which conducted the survey, has been gutted by US funding cuts.)
While the key populations centres funded by the CDC are still operational, those funded by USAID have closed. The health department has urged patients that were relying on these services to go to government health facilities, but researchers argue that many simply won’t do this.
Venter explained: “For years, I ran the sex worker program [at Wits RHI, which was funded by PEPFAR] … Because sex workers don’t come to [health facilities], you had to provide outreach services at the brothels. This meant … we had to deal with violence issues, we had to deal with the brothel owners, and work out which days of the week, and hours of the day we could provide the care. Logistically, it’s much more complex than sitting on your bum and waiting for them to come and visit you at the clinic.
“So you can put up your hand and say: ‘Oh they can just come to the clinics’ – like the minister said. Well, then you won’t be treating any sex workers.” Venter said this would result in a public health disaster.
He argued that one of the most crucial services that key populations may lose access to is pre-exposure prophylaxis (PrEP), a daily pill that prevents HIV.
While the vast majority of government clinics have PrEP on hand, they often fail to inform people about it. For instance, a survey of people who are at high-risk of contracting HIV in KwaZulu-Natal found that only 15% were even aware that their clinic stocked PrEP.
Another large survey found that at government facilities, only 19% of sex workers had been offered PrEP. By contrast, at the drop-in centres for key populations, the figure was more than double this, at 40%. Without these centres, the health system may lose its ability to create demand for the drug among the most high-risk groups.
One health department official told Spotlight and GroundUp that the bulk of the PrEP rollout would continue despite the US funding cuts. “The majority of the PrEP is offered through the [government] clinics,” she said, 96% of which have the drug.
However, she conceded that specific high-risk groups like sex workers have primarily gotten PrEP from the key populations centres, rather than the clinics. “This is the biggest area where we are going to see a major decline in uptake for [PrEP] services,” she said.
600 000 dead without PEPFAR?
Overall, the USAID funding cuts have severely hindered the HIV testing programmes, data capturing services, PrEP roll-out, and follow-up services for people who interrupt ARV treatment. And the patients who are most affected by this are those that are most likely to further transmit the virus.
So what will the impacts be? According to one modelling study, recently published in the Annals of Internal Medicine, the complete loss of all PEPFAR funds could lead to over 600 000 deaths in South Africa over the next decade.
While South Africa still retains some PEPFAR funding that comes from the CDC, beneficiaries are bracing for this to end. According to Wilkinson, the PEPFAR grants of most CDC-funded organisations end in September and future grants are uncertain. For some organisations, the money stops at the end of this month.
Meanwhile, if the government has any clear plan for how to manage the crisis, it’s certainly not making this public.
In response to our questions about whether the health department would be supporting key populations centres, the department’s spokesperson, Foster Mohale, said: “For now we urge all people living with HIV/AIDS and TB to continue with treatment at public health facilities.”
When pressed for details about the department’s plans for dealing with the US cuts, Mohale simply said that they could not reveal specifics at this stage and that “this is a work in progress”.
In his budget speech in Parliament on Wednesday, Finance Minister Enoch Godongwana did not announce any funding to cover the gap left by the abrupt end of US support for the country’s HIV response. Prior to the speech, Godongwana told reporters in a briefing that the Department of Health would assist with some of the shortfall, but no further information could be provided.
Henry Adams, Country Manager South Africa, InterSystems
Healthcare data is one of the most complex and valuable assets in the modern world. Yet, despite the wealth of digital health information being generated daily, many organisations still struggle to access, integrate, and use it effectively. The promise of data-driven healthcare – where patient records, research insights, and operational efficiencies seamlessly come together – remains just that: a promise. The challenge lies in interoperability.
For years, healthcare institutions have grappled with fragmented systems, disparate data formats, and evolving regulatory requirements. The question is no longer whether to integrate but how best to do it. Should healthcare providers build, rent, or buy their data integration solutions? Each approach has advantages and trade-offs, but long-term success depends on choosing a solution that balances control, flexibility, and cost-effectiveness.
Why Interoperability Remains a Challenge
Despite significant advancements in standardisation, interoperability remains a persistent challenge in healthcare. A common saying in the industry – “If you’ve seen one HL7 interface, you’ve seen one HL7 interface” – illustrates the lack of uniformity across systems. Even FHIR, the latest interoperability standard, comes with many extensions and custom implementations, leading to inconsistency.
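The "one HL7 interface" problem can be sketched in a few lines of Python. The messages, site names, and field conventions below are hypothetical illustrations, not drawn from any real feed: both sites send a standard PID segment, but they encode the patient name differently, so an integration layer needs per-site mapping logic.

```python
# Hypothetical sketch: two sites send the "same" HL7 v2 PID segment,
# but populate PID-5 (patient name) in incompatible ways.

def parse_segment(segment: str) -> list[str]:
    """Split an HL7 v2 segment into its pipe-delimited fields."""
    return segment.split("|")

# Site A: name as "family^given" components (the common convention).
pid_a = "PID|1||12345^^^SITEA||Smith^Jan||19800101|F"
# Site B: same field, but free text "GIVEN FAMILY" in one component.
pid_b = "PID|1||98765^^^SITEB||JAN SMITH||19800101|F"

def family_name(fields: list[str], site: str) -> str:
    """Extract the family name, using a per-site mapping rule."""
    name = fields[5]                   # PID-5: patient name
    if site == "A":
        return name.split("^")[0]      # first component is the family name
    return name.split()[-1].title()    # site B: last token of the free text

print(family_name(parse_segment(pid_a), "A"))  # Smith
print(family_name(parse_segment(pid_b), "B"))  # Smith
```

Multiply this by dozens of fields and dozens of sending systems, and the maintenance burden of hand-built interfaces becomes clear.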
Adding to this complexity, healthcare data must meet strict security, privacy, and compliance requirements. The need for real-time data exchange, analytics, and artificial intelligence (AI) further increases the pressure on organisations to implement robust, scalable, and future-proof integration solutions.
The Build, Rent, or Buy Dilemma
When organisations decide how to approach interoperability, they typically weigh three options:
Building a solution from scratch offers full control but comes with high development costs, lengthy implementation timelines, and ongoing maintenance challenges. Ensuring compliance with HL7, FHIR, and other regulatory standards requires significant resources and expertise.
Renting an integration solution provides quick deployment at a lower initial cost but can lead to vendor lock-in, limited flexibility, and escalating costs as data volumes grow. Additionally, outsourced solutions may not prioritise healthcare-specific requirements, creating potential risks for compliance, security, and scalability.
Buying a purpose-built integration platform strikes a balance between control and flexibility. Solutions like InterSystems Health Connect and InterSystems IRIS for Health offer pre-built interoperability features while allowing organisations to customise and scale their integration as needed.
The Smart Choice: Owning Your Integration Future
To remain agile in an evolving healthcare landscape, organisations must consider the long-term impact of their integration choices. A well-designed interoperability strategy should allow for:
Customisation without complexity – Organisations should be able to tailor their integration capabilities without having to build from the ground up. This ensures they can adapt to new regulatory requirements and technological advancements.
Scalability without skyrocketing costs – A robust data platform should enable growth without the exponential cost increases often associated with rented solutions.
Security and compliance by design – Healthcare providers cannot afford to compromise on data privacy and security. A trusted interoperability partner should offer built-in compliance with international standards.
Some healthcare providers opt for platforms that combine pre-built interoperability with the flexibility to scale and customise as needed. For example, solutions designed to support seamless integration with electronic health records (EHRs), medical devices, and other healthcare systems can offer both operational efficiency and advanced analytics capabilities. The key is selecting an approach that aligns with both current and future needs, ensuring data remains accessible, secure, and actionable.
Preparing for the Future of Healthcare IT
As healthcare systems become more digital, the need for efficient, secure, and adaptable interoperability solutions will only intensify. The right integration strategy can determine whether an organisation thrives or struggles with inefficiencies, rising costs, and regulatory risks.
By choosing an approach that prioritises flexibility, control, and future-readiness, healthcare providers can unlock the full potential of their data – improving patient outcomes, driving operational efficiencies, and enabling innovation at scale.
The question isn’t just whether to build, rent, or buy – but how to create a foundation that ensures long-term success in healthcare interoperability.
Facial pain and discomfort related to the temporomandibular joint (TMJ) is the second-leading musculoskeletal disorder, after chronic back pain, affecting 8% to 12% of Americans. Current treatments for TMJ disorders are not always sufficient, leading researchers to further explore the vast nerve and vessel network connected to this joint – the second largest in the human body.
In a study published in December 2024 in the journal Pain, a research team led by Yu Shin Kim, PhD, associate professor at The University of Texas Health Science Center at San Antonio (UT Health San Antonio), observed for the first time the simultaneous activity of more than 3000 trigeminal ganglion (TG) neurons, which are cells clustered at the base of the brain that transmit information about sensations to the face, mouth and head.
“With our novel imaging technique and tools, we can see each individual neuron’s activity, pattern and dynamics as well as 3000 neuronal populational ensemble, network pattern and activities in real time while we are giving different stimuli,” said Kim.
When the TMJ is injured or misaligned, it sends out signals to increase inflammation to protect the joint. However, this signaling can lead to long-term inflammation of the joint and other parts of the highly connected nerve network, leading to chronic pain and discomfort. About 80% to 90% of TMJ disorders occur in women, and most cases develop between the ages of 15–50.
Activation at the cellular level
Previous animal studies observed behavioural changes related to pain, but this study was the first to record neuronal responses and their activity at the cellular level. To see which portions of the nerve pathway respond to various types of pain, Kim’s team created different models of pain and observed the neuronal activity with confocal imaging, which uses a high-resolution camera and scanning system to observe neurons in action.
The team discovered that during TMJ activation, more than 100 neurons spontaneously fire at the same time. Activation was observed in localised areas of the TMJ innervated by TG neurons. The localisation of this activation highlights the specific neural pathways involved in TMJ pain, offering deeper insight into how pain develops and spreads to nearby areas. The study is also the first to quantify the degree of TG neuronal sensitivity and network activities.
Potential link to migraine, headaches
Chronic TMJ pain in humans is often linked to other pain comorbidities such as migraines and other headaches. Kim’s team observed this crossover in the in vivo model as inflammation of TG neurons spread to the nearby orofacial areas. Kim’s previous research demonstrated how stress-related migraine pain originates from a certain molecule, begins in the dura and innervates throughout the dura and TG neurons. This current study and novel imaging technique further reveal potential connections between the TMJ, migraines and other headaches.
Potential of CGRP treatment
Calcitonin gene-related peptides (CGRP), molecules involved in transmitting pain signals and regulating inflammation, are often found in higher amounts in the synovial fluid of TMJ disorder patients. Synovial fluid surrounds joints in the body, helping to reduce friction between bones and cartilage. Higher amounts of CGRP are often associated with increased pain and inflammation. Kim hypothesised in this study that a reduction in CGRP may reduce TMJ disorder symptoms. He found that a CGRP antagonist added to the synovial fluid relieved both TMJ pain and hypersensitivity of TG neurons.
Currently, there are no Food and Drug Administration (FDA)-approved medications for TMJ disorders other than non-steroidal anti-inflammatory drugs (NSAIDs). While some CGRP antagonist medications are FDA-approved for treating migraines, this study suggests these drugs may also provide relief for TMJ disorders. Confirmation of the positive effect of the drug on TMJ pain is a major leap forward in understanding how CGRP affects TMJ pain, said Kim.
“This imaging technique and tool allows us to see pain at its source – down to the activity of individual neurons – offering unprecedented insights into how pain develops and spreads. Our hope is that this approach will not only advance treatments for TMJ disorders but also pave the way for understanding and managing various chronic pain conditions more effectively,” said Kim.
New research out of Michigan State University expands on current understanding of the brain chemical dopamine, finding that it plays a role in reducing the value of memories associated with rewards. The study, published in Communications Biology, opens new avenues for understanding dopamine’s role in the brain.
The research team discovered that dopamine is involved in reshaping memories of past rewarding events – an unexpected function that challenges established theories of dopamine function.
“We discovered that dopamine plays a role in modifying how a reward-related memory is perceived over time,” said Alexander Johnson, associate professor in MSU’s Department of Psychology and lead researcher of the study.
In the study, mice were presented with an auditory cue that had previously been associated with a sweet-tasting food. This led to a retrieval of the memory associated with consuming the food. At this time, mice were made to feel temporarily unwell, similar to how you feel if you’ve eaten something that has upset your stomach.
When the mice had fully recovered, they displayed behaviour as if the sweet-tasting food had made them unwell. This occurred despite the fact that when mice were made to feel unwell, they had only retrieved the memory of the food, not the food itself. This initial finding suggests that devaluing the memory of food is sufficient to disrupt future eating of that food.
The research team next turned their attention to the brain mechanisms that could be controlling this phenomenon. Using an approach by which they could label and reactivate brain cells that were engaged when the food memory was retrieved, the researchers identified that cells producing the chemical dopamine appeared to play a particularly important role. This was confirmed through actions that manipulated and recorded dopamine neuron activity during the exercise.
“Our findings were surprising based on our prior understanding of dopamine’s function. We typically don’t tend to think of dopamine being involved in the level of detailed informational and memory processing that our study showed,” Johnson explained. “It’s a violation of what we expected, revealing that dopamine’s role is more complex than previously thought.”
The team also used computational modelling to capture how dopamine signals play this role in reshaping reward memories.
“Understanding dopamine’s broader functions in the brain could provide new insights into how we approach conditions like addiction, depression and other neuropsychiatric disorders,” said Johnson. “Since dopamine is implicated in so many aspects of brain function, these insights have wide-ranging implications. In the future, we may be able to use these approaches to reduce the value of problematic memories and, as such, diminish their capacity to control unwanted behaviours.”
It is well known that consuming sugary drinks increases the risk of diabetes, but the mechanism behind this relationship is unclear. Now, in a paper published in the Cell Press journal Cell Metabolism, researchers show that metabolites produced by gut microbes might play a role.
In a long-term cohort of US Hispanic/Latino adults, the researchers identified differences in the gut microbiota and blood metabolites of individuals with a high intake of sugar-sweetened beverages. The altered metabolite profile seen in sugary beverage drinkers was associated with a higher risk of developing diabetes in the subsequent 10 years. Since some of these metabolites are produced by gut microbes, this suggests that the microbiome might mediate the association between sugary beverages and diabetes.
“Our study suggests a potential mechanism to explain why sugar-sweetened beverages are bad for your metabolism,” says senior author Qibin Qi, an epidemiologist at Albert Einstein College of Medicine. “Although our findings are observational, they provide insights for potential diabetes prevention or management strategies using the gut microbiome.”
Sugar-sweetened beverages are the main source of added sugar in the diets of US adults – in 2017 and 2018, US adults consumed an average of 34.8g of added sugar each day from sugary beverages such as soda and sweetened fruit juice. Compared to added sugars in solid foods, added sugar in beverages “might be more easily absorbed, and they have a really high energy density because they’re just sugar and water,” says Qi.
Previous studies in Europe and China have shown that sugar-sweetened beverages alter gut microbiome composition, but this is the first study to investigate whether this microbial change impacts host metabolism and diabetes risk. It’s also the first study to investigate the issue in a US-based Hispanic/Latino population – a group that experiences high rates of diabetes and is known to consume high volumes of sugar-sweetened beverages.
The team used data from the ongoing Hispanic Community Health Study/Study of Latinos (HCHS/SOL), a large-scale cohort study with data from over 16 000 participants living in San Diego, Chicago, Miami, and the Bronx. At an initial visit, participants were asked to recall their diet from the past 24 hours and had blood drawn to characterise their serum metabolites. The researchers collected faecal samples and characterised the gut microbiomes of a subset of the participants (n = 3035) at a follow-up visit and used these data to identify associations among sugar-sweetened beverage intake, gut microbiome composition, and serum metabolites.
They found that high sugary beverage intake, defined as two or more sugary beverages per day, was associated with changes in the abundance of nine species of bacteria. Four of these species are known to produce short-chain fatty acids: molecules that are produced when bacteria digest fibre and that are known to positively impact glucose metabolism. In general, bacterial species that were positively associated with sugary beverage intake correlated with worse metabolic traits. Interestingly, these bacteria were not associated with sugar ingested from non-beverage sources.
The researchers also found associations between sugary beverage consumption and 56 serum metabolites, including several metabolites that are produced by gut microbiota or are derivatives of gut-microbiota-produced metabolites. These sugar-associated metabolites were associated with worse metabolic traits, including higher levels of fasting blood glucose and insulin, higher BMIs and waist-to-hip ratios, and lower levels of high-density lipoprotein cholesterol (“good” cholesterol). Notably, individuals with higher levels of these metabolites had a higher likelihood of developing diabetes in the 10 years following their initial visit.
“We found that several microbiota-related metabolites are associated with the risk of diabetes,” says Qi. “In other words, these metabolites may predict future diabetes.”
Because gut microbiome samples were only collected from a subset of the participants, the researchers had an insufficient sample size to determine whether any species of gut microbes were directly associated with diabetes risk, but this is something they plan to study further.
“In the future, we want to test whether the bacteria and metabolites can mediate or at least partially mediate the association between sugar-sweetened beverages and risk of diabetes,” says Qi.
The team plans to validate their findings in other populations and to extend their analysis to investigate whether microbial metabolites are involved in other chronic health issues linked to sugar consumption, such as cardiovascular disease.
Researchers at Oregon Health & Science University have made new discoveries about amniotic fluid, a substance that has historically been poorly understood in medical research because of the difficulty of obtaining samples during pregnancy, particularly at multiple points across gestation.
In addition to providing much-needed cushion and protection for the foetus, amniotic fluid also aids in development of vital organs – especially the lungs, digestive tract and skin – and stabilises the temperature inside the womb.
The new study, published in the journal Research and Practice in Thrombosis and Haemostasis, found that adding amniotic fluid to plasma improves the blood’s ability to thicken and clot, a critical and likely protective function throughout pregnancy and delivery for both the birthing parent and the baby. The fluid also appears to serve other unexpected functions, such as acting as a ‘pre-milk’ for foetuses.
The mechanism of amniotic fluid’s role in foetal development is poorly understood and understudied: the OHSU study is one of the first to identify how the fluid’s features and properties change over time, especially those involved in thickening the blood, and how those changes affect maternal blood coagulation. If a pregnant person’s blood does not clot properly, it can create life-threatening complications for the foetus and birthing parent, including excessive bleeding during pregnancy and delivery.
“We have always known that amniotic fluid is very important for foetal development and growth, but we don’t know much about it beyond that,” said the study’s corresponding author Jamie Lo, MD, MCR, associate professor of obstetrics and gynaecology (maternal-foetal medicine) in the OHSU School of Medicine, and Division of Reproductive & Developmental Sciences at the Oregon National Primate Research Center, or ONPRC. “We examined amniotic fluid across the pregnancy and found that indeed the composition and proteins in the amniotic fluid do change to match the growing needs of the developing baby.”
This discovery prompted Lo and her team to work with scientists in the Department of Biomedical Engineering at OHSU to take a deeper dive into the potential protective factors of amniotic fluid, and consider potential regenerative and therapeutic uses that could be developed down the road.
The research involved a multidisciplinary team including Lo, Chih Jen Yang, MD, Lyndsey Shorey-Kendrick, PhD, Joseph Shatzel, MD, MCR, Brian Scottoline, MD, PhD, and Owen McCarty, PhD.
Researchers analysed amniotic fluid from both human and non-human primates at gestational-age-matched timepoints; the fluid was obtained by amniocentesis, a prenatal test that samples a small amount of amniotic fluid to assess the health of the pregnancy. The findings showed that amniotic fluid increases blood clotting through key fatty acids and proteins that change each trimester and help regulate coagulation.
With the untapped potential for amniotic fluid to aid in diagnosing and treating various prenatal conditions, researchers are now collaborating with Sanjay Malhotra, PhD, professor of cell, developmental and cancer biology in the OHSU School of Medicine, to target disorders of pregnancy – including disorders that affect the blood and blood-forming organs – that could benefit from the protective properties of proteins and other compounds within amniotic fluid.
Researchers are eager to learn more about the potential uses of amniotic fluid components and how they might be harnessed to improve prenatal and maternal health.
“Babies born prematurely miss out on critical weeks developing within amniotic fluid,” said the study’s co-senior author Brian Scottoline, MD, PhD, professor of paediatrics (neonatology), OHSU School of Medicine. “But if we have a better understanding of amniotic fluid, how it develops and what properties are valuable for what functions, that opens up many new possibilities for creating new therapies.”
“Through our research, our team is learning that amniotic fluid may be a critical precursor to breast milk – almost like ‘pre-term’ milk for a foetus in utero. With that analogy, could we eventually develop a formula that’s fit for preterm babies that mimics amniotic fluid, aiding in growth and development and protecting babies from complications of being born prematurely?” Lo added. “This is really the tip of the iceberg for what’s possible.”
As many as half of nursing home residents are cognitively impaired and may be unable to communicate symptoms such as pain or anxiety to the staff and clinicians caring for them. Therefore, information needed for the evaluation of symptoms and subsequent treatment decisions typically does not reliably exist in nursing home electronic health records (EHRs).
A new paper published in the International Journal of Geriatric Psychiatry reports on the novel adaptation of a commonly used symptom assessment instrument to more comprehensively acquire this difficult-to-obtain data, with the ultimate goal of enabling knowledge-based expansion of palliative care services in nursing homes to address residents’ symptoms.
In the paper, part of the large, multi-state, multi-facility Utilizing Palliative Leaders in Facilities to Transform care for people with Alzheimer’s Disease (UPLIFT-AD) study, researchers from the Regenstrief Institute, the Indiana University School of Medicine and the University of Maryland School of Social Work describe how they revamped and subsequently validated a symptom assessment tool used worldwide. The UPLIFT-AD researchers modified the instrument, originally designed for reporting by family members of individuals with dementia following their death, so that nursing home staff as well as family can report on the symptoms of current residents living with moderate to severe dementia.
Led by Kathleen T. Unroe, MD, MHA, and John G. Cagle, PhD, the UPLIFT-AD team reports in the peer-reviewed paper that the tool they enhanced reliably addressed physical and emotional distress as well as well-being and symptoms that are precursors to end of life. This validation was critical as the researchers develop guidance for expansion of symptom recognition and management in any nursing home. Employing instruments used in other studies helps researchers to directly compare findings.
Dr. Unroe, Dr. Cagle and colleagues, including Wanzhu Tu, PhD, of the Regenstrief Institute and the IU School of Medicine, are in the late stages of the UPLIFT-AD clinical trial to enhance the quality of care for individuals with dementia by building capacity for palliative care within nursing homes.
“People receive care in nursing homes because they have significant needs – support for activities of daily living – as well as for complex, serious and multiple chronic conditions. But measuring symptoms of residents, especially those who are cognitively impaired, to address these needs is challenging,” said paper senior author Dr. Unroe, a Regenstrief Institute research scientist and an IU School of Medicine professor of medicine. “In my two decades of working as a clinician in nursing homes as well as a researcher, I have seen that often the information on symptoms that we want isn’t available consistently in the data that’s already collected or it isn’t collected at the frequency that we need to measure the impact of programs and approaches. And the gold standard for knowing if someone has a symptom, for example, if someone has pain or anxiety, to ask that person directly to assess the symptom, isn’t always possible for cognitively impaired residents. That’s why we took steps to validate a commonly used instrument in a wider population – individuals currently living with cognitive impairment – and added additional needed data points.
“While hospice care is typically available, there is widespread recognition that broader palliative care is needed in nursing homes. But there is no roadmap for how to provide it well. We hope that when we have our final results in 2026, UPLIFT-AD will prove to be a replicable model for implementing this much needed type of care.”
SARS-CoV-2 has been very good at mutating to keep infecting people – so good that most antibody treatments developed during the pandemic are no longer effective. Now a team led by Stanford University researchers may have found a way to pin down the constantly evolving virus and develop longer-lasting treatments.
The researchers discovered a method to use two antibodies: one serves as a type of anchor by attaching to an area of the virus that does not change very much, and the other inhibits the virus’s ability to infect cells. In laboratory testing, this pairing of antibodies was shown to be effective against the initial SARS-CoV-2 virus that caused the pandemic and all its variants through omicron. The findings are detailed in the journal Science Translational Medicine.
“In the face of an ever-changing virus, we engineered a new generation of therapeutics that have the ability to be resistant to viral evolution, which could be useful many years down the road for the treatment of people infected with SARS-CoV-2,” said Christopher O. Barnes, the study’s senior author, an assistant professor of biology.
An overlooked option
The team led by Barnes and first author Adonis Rubio, a doctoral candidate in the Stanford School of Medicine, conducted this investigation using donated antibodies from patients who had recovered from COVID-19. Analysing how these antibodies interacted with the virus, they found one that attaches to a region of the virus that does not mutate often.
This area, within the Spike N-terminal domain, or NTD, had been overlooked because it was not directly useful for treatment. However, when a specific antibody attaches to this area, it remains stuck to the virus. This is useful when designing new therapies that enable another type of antibody to get a foothold and attach to the receptor-binding domain, or RBD, of the virus, essentially blocking the virus from binding to receptors in human cells.
[Image: An illustration of the bispecific antibodies, named “CoV2-biRN,” that the Stanford-led research team developed to neutralise the virus that causes COVID-19. One antibody attaches to the Spike N-terminal domain (NTD), an area of the virus that does not change very much; this allows the second antibody to attach to the receptor-binding domain (RBD), essentially preventing the virus from infecting human cells. Credit: Christopher O. Barnes and Adonis Rubio, using BioRender stock images]
The researchers designed a series of these dual or “bispecific” antibodies, called CoV2-biRN, and in laboratory tests they showed high neutralisation of all the variants of SARS-CoV-2 known to cause illness in humans. The antibodies also significantly reduced the viral load in the lungs of mice exposed to one version of the omicron variant.
More research, including clinical trials, would have to be done before this discovery could be used as a treatment in human patients, but the approach is promising – and not just for the virus that causes COVID-19.
Next, the researchers will work to design bispecific antibodies that would be effective against all coronaviruses, the virus family that includes those responsible for the common cold, MERS, and COVID-19. This approach could potentially also be effective against influenza and HIV, the authors said.
“Viruses constantly evolve to maintain the ability to infect the population,” Barnes said. “To counter this, the antibodies we develop must continuously evolve as well to remain effective.”
In a decades-long study following twins, researchers from the University of Jyväskylä, Finland, investigated the links between long-term leisure-time physical activity and mortality. They also sought to determine whether physical activity can mitigate the increased risk of mortality due to genetic predisposition to diseases. Moreover, they examined the relationship between physical activity and later biological aging.
The study included 22 750 Finnish twins born before 1958 whose leisure-time physical activity was assessed in 1975, 1981 and 1990. Mortality follow-up continued until the end of 2020.
Moderate activity yields maximum longevity benefits
Four distinct sub-groups were identified from the data on leisure-time physical activity over the 15-year follow-up: sedentary, moderately active, active and highly active. When the differences in mortality between the groups were examined at the 30-year follow-up, the greatest benefit, a 7% lower risk of mortality, was found between the sedentary and moderately active groups. A higher level of physical activity brought no additional benefit.
When mortality was examined separately in the short and long term, a clear association was found in the short term: the higher the level of physical activity, the lower the mortality risk. In the long term, however, those who were highly active did not differ from those who were sedentary in terms of mortality.
“An underlying pre-disease state, rather than a lack of exercise itself, can limit physical activity and ultimately lead to death. This can bias the association between physical activity and mortality in the short term,” says Associate Professor Elina Sillanpää from the Faculty of Sports and Health Sciences.
Meeting physical activity guidelines does not guarantee a lower mortality risk
The researchers also investigated whether following the World Health Organization’s physical activity guidelines affects mortality and genetic disease risk. The guidelines suggest 150 to 300 minutes of moderate or 75 to 150 minutes of vigorous activity weekly. The study found that meeting these guidelines did not lower mortality risk or alter genetic disease risk. Even for twins who met the recommended levels of physical activity over a 15-year period, no statistically significant difference in mortality rates was found compared to their less active co-twin.
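The WHO threshold the study used can be encoded as a simple check. Counting each vigorous minute as two moderate minutes reflects the WHO's usual moderate-to-vigorous equivalence; treat this as a sketch rather than a clinical tool:

```python
def meets_who_minimum(moderate_min: float, vigorous_min: float) -> bool:
    """Return True if a week of activity meets the WHO minimum.

    The guidelines allow 150 minutes of moderate activity, 75 minutes
    of vigorous activity, or an equivalent combination, with one
    vigorous minute conventionally counted as two moderate minutes.
    """
    moderate_equivalent = moderate_min + 2 * vigorous_min
    return moderate_equivalent >= 150

print(meets_who_minimum(150, 0))   # meets the moderate threshold
print(meets_who_minimum(0, 75))    # meets the vigorous threshold
print(meets_who_minimum(100, 0))   # falls short of an equivalent combination
```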
“The widely observed favourable association between physical activity and mortality is based on observational studies that are prone to bias from different sources. In our studies, we aimed to account for various sources of bias, and combined with the long follow-up period, we could not confirm that adhering to physical activity guidelines mitigates genetic cardiovascular disease risk or causally reduces mortality,” says postdoctoral researcher Laura Joensuu from the Faculty of Sports and Health Sciences.
Link between physical activity and biological aging is U-shaped
For a subsample of twins, biological aging was determined from blood samples using epigenetic clocks. Epigenetic clocks allow a person’s rate of biological aging to be estimated from methyl groups that regulate gene expression and are linked to the aging process.
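Epigenetic clocks of this kind are typically linear models: a weighted sum of methylation levels (beta values between 0 and 1) at selected CpG sites, plus an intercept. The weights below are illustrative placeholders, not coefficients from any published clock:

```python
import numpy as np

# Hypothetical per-CpG coefficients and intercept; a real clock uses
# hundreds of CpG sites with weights fitted on large training cohorts.
weights = np.array([12.0, -8.5, 20.3])
intercept = 35.0

def predict_biological_age(beta_values: np.ndarray) -> float:
    """Estimate biological age as intercept + weighted sum of beta values."""
    return float(intercept + weights @ beta_values)

# One hypothetical methylation profile.
age = predict_biological_age(np.array([0.6, 0.4, 0.7]))
print(f"estimated biological age: {age:.1f} years")
```

Comparing such an estimate against chronological age yields the "age acceleration" measure that underlies results like the U-shaped association reported here.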
“We found that the association between leisure-time physical activity and biological aging was U-shaped: Biological aging was accelerated in those who exercised the least and the most,” says Sillanpää.
Other lifestyle factors, such as smoking and alcohol consumption, largely explained the favourable associations of physical activity with biological aging.
Genetic data were available for 4897 twins. The twins’ genetic susceptibility to coronary artery disease, as well as to systolic and diastolic blood pressure, was assessed using new polygenic risk scores, which sum genome-wide susceptibility to disease. In addition, all-cause and cardiovascular mortality was followed in 180 identical twin pairs. The biological aging rate of 1153 twins was assessed from a blood sample.
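A polygenic risk score is essentially a weighted sum: for each genetic variant, the per-allele effect size estimated in a genome-wide association study is multiplied by how many copies of the risk allele the person carries, and the products are summed. A minimal sketch with made-up variant IDs and effect sizes:

```python
# Hypothetical GWAS effect sizes (per risk allele) for three variants.
effect_sizes = {"rs0001": 0.12, "rs0002": -0.05, "rs0003": 0.30}

def polygenic_risk_score(dosages: dict[str, int]) -> float:
    """Sum effect size x allele dosage (0, 1 or 2 copies) over all variants."""
    return sum(beta * dosages[variant] for variant, beta in effect_sizes.items())

score = polygenic_risk_score({"rs0001": 2, "rs0002": 1, "rs0003": 0})
print(f"PRS = {score:.2f}")   # 0.12*2 - 0.05*1 + 0.30*0 = 0.19
```

Real scores sum over thousands to millions of variants and are typically standardised within a population before being related to outcomes such as coronary artery disease.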