Month: August 2024

More Protein and Fibre While Dropping Calories is Key for Weight Loss

Photo by Andres Ayrton on Pexels

Participants on a self-directed dietary education program who had the greatest success at losing weight across a 25-month period consumed greater amounts of protein and fibre, found a study published in Obesity Science and Practice. Personalisation and flexibility also were key in creating plans that dieters could adhere to over time. 

At the one-year mark, successful dieters (41% of participants) had lost 12.9% of their body weight, compared with the remainder of the study sample, who lost slightly more than 2% of their starting weight. 

The dieters were participants in the Individualised Diet Improvement Program, which uses data visualisation tools and intensive dietary education sessions to increase dieters’ knowledge of key nutrients, enabling them to create a personalised, safe and effective weight-loss plan, said Manabu T. Nakamura, a professor of nutrition at the University of Illinois Urbana-Champaign and the leader of the research.

“Flexibility and personalisation are key in creating programs that optimise dieters’ success at losing weight and keeping it off,” Nakamura said. “Sustainable dietary change, which varies from person to person, must be achieved to maintain a healthy weight. The iDip approach allows participants to experiment with various dietary iterations, and the knowledge and skills they develop while losing weight serve as the foundation for sustainable maintenance.”

The pillars of iDip are increased protein and fibre consumption along with a daily intake of 1500 calories or less. 

Based on the dietary guidelines issued by the Institute of Medicine, the iDip team created a one-of-a-kind, two-dimensional quantitative data visualisation tool that plots foods’ protein and fibre densities per calorie and provides a target range for each meal. Starting with foods they habitually ate, the dieters created an individualised plan, increasing their protein intake to about 80g and their fibre intake to about 20g daily.
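
To illustrate the arithmetic behind such a density-per-calorie chart, the short sketch below converts a handful of foods to grams of protein and fibre per 100 calories and compares them against targets derived from the daily goals quoted above (roughly 80g protein and 20g fibre within 1500 calories). The foods, values and cut-offs are illustrative assumptions, not the iDip program’s actual data or thresholds.

```python
# Illustrative sketch of a protein/fibre "density per calorie" comparison like
# the tool described above. Food values and targets are assumptions for
# demonstration, not the iDip program's actual data or thresholds.

# (food, kcal per serving, protein g per serving, fibre g per serving)
foods = [
    ("Grilled chicken breast", 165, 31.0, 0.0),
    ("Cooked lentils",         116,  9.0, 7.9),
    ("White rice",             130,  2.7, 0.4),
    ("Greek yoghurt, plain",    59, 10.0, 0.0),
    ("Raspberries",             52,  1.2, 6.5),
]

# Express nutrient density per 100 kcal so foods with different serving sizes
# can be compared on the same axes.
def density_per_100kcal(kcal, grams):
    return 100.0 * grams / kcal

# Hypothetical daily goals (about 80 g protein and 20 g fibre within 1500 kcal,
# as described in the article) converted to per-100-kcal densities.
PROTEIN_TARGET = 100.0 * 80 / 1500   # ~5.3 g protein per 100 kcal
FIBRE_TARGET   = 100.0 * 20 / 1500   # ~1.3 g fibre per 100 kcal

for name, kcal, protein, fibre in foods:
    p = density_per_100kcal(kcal, protein)
    f = density_per_100kcal(kcal, fibre)
    flag = "on track" if (p >= PROTEIN_TARGET or f >= FIBRE_TARGET) else "below target"
    print(f"{name:24s} protein {p:5.1f} g/100kcal  fibre {f:4.1f} g/100kcal  -> {flag}")
```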

In tracking the participants’ eating habits and their weights with Wi-Fi-enabled scales, the team found strong inverse correlations between the percentages of protein and fibre eaten and dieters’ weight loss.
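
For readers unfamiliar with the statistic, such an inverse correlation can be expressed as a Pearson coefficient between intake and weight change. The sketch below shows the calculation on fabricated numbers; it is purely illustrative and uses no data from the study.

```python
# Illustrative Pearson correlation between nutrient intake and weight change,
# of the kind described above. The six "dieters" below are fabricated numbers
# for demonstration only, not study data.
from math import sqrt

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

protein_g_per_day = [55, 62, 70, 78, 85, 92]               # fabricated intakes
weight_change_pct = [-1.5, -3.0, -4.8, -7.1, -9.0, -11.2]  # fabricated outcomes

r = pearson(protein_g_per_day, weight_change_pct)
print(f"Pearson r = {r:.2f}")   # negative r: higher intake, larger weight loss
```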

“The research strongly suggests that increasing protein and fibre intake while simultaneously reducing calories is required to optimise the safety and efficacy of weight loss diets,” said first author and U. of I. alumna Mindy H. Lee, then a graduate student and registered dietitian-nutritionist for the iDip program. 

Nakamura said the preservation of lean mass is very important while losing weight, especially when using weight-loss drugs.

 “Recently, the popularity of injectable weight loss medications has been increasing,” Nakamura said. “However, using these medications when food intake is strongly limited will cause serious side effects of muscle and bone loss unless protein intake is increased during weight loss.”

A total of 22 people who enrolled in the program completed it, including nine men and 13 women. Most of the dieters were between the ages of 30 and 64. Participants reported they had made two or more prior attempts to lose weight. They also had a variety of comorbidities – 54% had high cholesterol, 50% had skeletal problems and 36% had hypertension and/or sleep apnoea. Additionally, the dieters reported diagnoses of diabetes, nonalcoholic fatty liver disease, cancer and depression, according to the study.

The seven dieters who reported they had been diagnosed with depression lost significantly less weight: about 2.4% of their starting weight, compared with those without depression, who lost 8.39% of their initial weight. The team found that weight loss did not differ significantly among participants with other comorbidities, between younger and older participants, or between men and women.

Body composition analysis indicated that dieters maintained their lean body mass, losing an average of 7.1kg of fat mass and minimal muscle mass at the six-month interval. Among those who lost greater than 5% of their starting weight, 78% of the weight they lost was fat, according to the study.

Overall, the participants reduced their fat mass from an average of 42.6kg at the beginning of the program to 35.7kg at the 15-month mark. Likewise, the dieters reduced their waists by about 7cm at six months and by a total of 9cm at 15 months, the team found. 

In tracking dieters’ protein and fibre intake, the team found a strong correlation between protein and fibre consumption and weight loss at three months and 12 months.

“The strong correlation suggests that participants who were able to develop sustainable dietary changes within the first three months kept losing weight in the subsequent months, whereas those who had difficulty implementing sustainable dietary patterns early on rarely succeeded in changing their diet in the later months,” Nakamura said.

The team hypothesised that this correlation could also have been associated with some dieters’ early weight loss success, which may have bolstered their motivation and adherence to their program.

Source: University of Illinois at Urbana-Champaign

Study Shows that Probiotics in Pregnancy Benefit Mothers and Offspring

Photo by SHVETS production on Pexels

Giving probiotics to pregnant mice can enhance both the immune system and behaviour of the mothers and their offspring, according to a new study led by The Ohio State University Wexner Medical Center and College of Medicine.

“These results suggest that certain probiotics given to mothers during pregnancy can improve their offspring’s behaviour and may affect the metabolism of common amino acids in our diets. Probiotics may also help counteract the negative effects of prenatal stress,” said study senior author Tamar Gur, MD, PhD, at OSU. 

Study findings are published online in the journal Brain, Behavior, and Immunity.

Many studies have attested to the benefits of probiotics, which are considered safe to take during pregnancy. Researchers led by first author Jeffrey Galley, PhD, found that a specific probiotic, Bifidobacterium dentium, may change how the body processes certain amino acids, such as tryptophan. During pregnancy, tryptophan helps control inflammation and brain development. 

“We have strong evidence this specific probiotic helped reduce stress-related problems in both mothers and their offspring, including helping the babies gain weight and improving their social behaviour,” said Gur, who also is an associate professor of psychiatry, neuroscience and obstetrics and gynaecology at Ohio State. 

Gur’s research team has studied how prenatal stress can lead to abnormal brain development and behavioural changes in offspring. So far, they’ve found that stress is linked to changes in brain inflammation and amino acid metabolism, as well as long-term reductions in social behaviour and abnormal microbiomes in offspring.

This study enhances their understanding of how gut microbes and probiotics can influence amino acid metabolism and help with behaviour and immune issues related to prenatal stress. The study also highlights the many benefits of this specific probiotic, even without the presence of stress.

“Now, we aim to understand the mechanisms behind these changes and explore ways to prevent or treat these effects,” Gur said. “Since prenatal stress is common in many pregnancies, we want to develop methods to reduce its negative effects.”

Source: Ohio State University Wexner Medical Center

Over Half of Iron Deficiency Cases Unresolved After Three Years

Photo by National Cancer Institute on Unsplash

Over half of people with iron deficiency were found to still have low iron levels three years after diagnosis, and even patients whose condition was effectively treated within that timeframe faced longer-than-expected delays, pointing to substantial gaps in appropriate recognition and efficient treatment of the condition, according to a study published in Blood Advances.

Iron deficiency is common, affecting up to 40% of adolescents and young women. Previous work reported that up to 70% of cases go undiagnosed in high-risk populations, such as those with bleeding disorders, issues with malabsorption, or women who menstruate.

“Iron deficiency is probably a bigger problem than we realise. I’ve seen a lot of cases where people don’t have anaemia, but they are walking around with very little to no iron in their body and it can have a big impact on how people feel in their day-to-day life,” said Jacob Cogan, MD, assistant professor of medicine at the University of Minnesota and the study’s lead author. “Iron deficiency can be challenging to diagnose, but it’s easy to treat. Our findings underscore the need for a more coordinated effort to recognise and treat iron deficiency to help improve quality of life.”

If untreated, low iron stores can lead to mood changes, fatigue, hair loss, exercise intolerance, and eventually anaemia. The condition is generally first treated with oral iron supplementation, and if low iron levels persist after a few months or the patient reports side effects, intravenous (IV) iron is started.

For this study, the researchers retrospectively analysed medical records from the database of one of Minnesota’s largest health systems and identified 13 084 adults with a laboratory diagnosis of iron deficiency, with or without anaemia, between 2010 and 2020 who had follow-up data available for three years.

In the study, iron deficiency was defined as a ferritin value of 25ng/mL or less. Patients had to have at least two ferritin values – one initial value and at least one more within the three-year study period. Adequate treatment and resolution were defined as a subsequent ferritin value of at least 50ng/mL. Most patients received some form of treatment, and this was consistent across sexes.
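
A minimal sketch of how that case definition could be applied to longitudinal ferritin results is shown below. The 25ng/mL and 50ng/mL thresholds come from the description above; the patient records and helper function are hypothetical.

```python
# Sketch of the study's case definition applied to per-patient ferritin values.
# Thresholds (<=25 ng/mL for iron deficiency, >=50 ng/mL for resolution) are as
# described above; the patient records here are made-up examples.

DEFICIENCY_CUTOFF = 25.0   # ng/mL, diagnosis
RESOLUTION_CUTOFF = 50.0   # ng/mL, adequate treatment / resolution

def classify(ferritin_values):
    """Return 'resolved', 'persistent', or 'not eligible' for one patient.

    ferritin_values: ferritin results in chronological order, the first being
    the index (diagnostic) value within the three-year window.
    """
    if len(ferritin_values) < 2 or ferritin_values[0] > DEFICIENCY_CUTOFF:
        return "not eligible"          # needs an index value <=25 plus follow-up
    if any(v >= RESOLUTION_CUTOFF for v in ferritin_values[1:]):
        return "resolved"
    return "persistent"

patients = {
    "A": [12, 18, 35, 61],   # resolves on the final value
    "B": [20, 22, 24],       # stays deficient throughout follow-up
    "C": [15],               # no follow-up ferritin -> not eligible
}

for pid, values in patients.items():
    print(pid, classify(values))
```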

Of the 13 084 patients included in the study, 5485 (42%) had normal iron levels within three years of diagnosis, while 7599 (58%) had persisting iron deficiency based on low ferritin levels. Only 7% of patients had their iron levels return to normal within the first year of diagnosis.

Factors associated with a higher likelihood of getting iron levels back to normal included older age (age 60 and up), male sex, Medicare insurance, and treatment with IV iron alone. Additionally, compared with patients who were still iron deficient, those whose condition was resolved had more follow-up blood work to check ferritin values (six vs four ferritin tests). Of note, younger patients, females, and Black individuals were most likely to remain iron deficient or experience longer lags in getting their iron stores back to a healthy level.

Even among patients whose iron levels were restored to normal during the study period, it took nearly two years (the median time to resolution was 1.9 years), which researchers say is longer than expected and signals missed opportunities to manage the condition more effectively. While there were no data to assess whether iron deficiency with anaemia was more apt to be treated, Dr Cogan says it is reasonable to think this might be the case, as iron deficiency without anaemia is harder to recognise.

“Two years is too long and well beyond the timeframe within which iron deficiency should be able to be sufficiently treated and resolved [with oral or IV treatments],” said Dr Cogan. “The numbers are pretty striking and suggest a need to put systems in place to better identify patients and treat them more efficiently.”

As with trends showing persisting iron deficiency, Dr Cogan attributes the delays in resolution to the diagnosis either being missed or not treated to resolution. He added that there is a clear need for education about non-anaemic iron deficiency and who is at high risk, more universal agreement on the best ferritin cut off for diagnosis, and efforts to create an iron deficiency clinic or pathway to “assess and treat patients more efficiently and get people feeling better faster.”

The study was limited by its reliance on EMR data and retrospective nature, which prevented researchers from determining why ferritin tests were ordered for patients or the cause of their iron deficiency.

Source: American Society of Hematology

Study Shows Fewer Kidney Stones with Higher Doses of Thiazides

Human kidney. Credit: Scientific Animations CC0

Higher thiazide doses are associated with greater reductions in urine calcium, which in turn correlate with fewer symptomatic kidney stone events, according to a Vanderbilt University Medical Center study out now in JAMA Network Open.  

Thiazide diuretics, commonly prescribed to prevent kidney stone recurrence, are drugs that act directly on the kidneys to promote diuresis by inhibiting the sodium/chloride cotransporter located in the distal convoluted tubule of a nephron. Thiazides are also used as a common treatment for high blood pressure and to clear fluid from the body in conditions such as heart failure. 

First author Ryan Hsi, MD, FACS, associate professor in the Department of Urology at VUMC, said the study data help explain the findings of the multicentre Hydrochlorothiazide for Kidney Stone Recurrence Prevention (NOSTONE) trial, which reported that hydrochlorothiazide did not reduce recurrence of kidney stone events.  

“In light of our research, the calcium reductions in that study were modest and likely insufficient to affect recurrence risk,” Hsi said.   

“What this means for patients is that thiazides remain an important option in the toolkit for preventing kidney stone recurrence. It may be beneficial to monitor calcium excretion while on thiazide therapy to adjust dose and diet to attain an adequate reduction in urine calcium.” 

A total of 634 participants were studied, revealing significant associations between higher thiazide doses and urine calcium reductions greater than those achieved in the NOSTONE trial, where participants took different doses of hydrochlorothiazide.  

For next steps, the researchers are interested in understanding which subtypes of thiazides and their dosing work best, and how best to optimise medication adherence, since these therapies are often administered long term.

Source: Vanderbilt University Medical Center

Eye Health Services in the Public Sector are Critically Impaired – it is High Time the Health Department Responds

Photo by Hush Naidoo Jade Photography on Unsplash

By Haseena Majid and Rene Sparks

Despite South Africa producing a substantial number of trained optometrists, the majority of them work in the private sector and in urban areas. This imbalance leaves rural communities underserved and exacerbates health inequities. Does it make sense for us to use public funds and institutions to train people predominantly for the private sector, ask Dr Haseena Majid and Rene Sparks.

Avoidable blindness and vision impairment are major global health concerns. The World Health Organization (WHO) estimates that at least 1 billion people worldwide have a vision impairment that could have been prevented or treated. In 2020, there were an estimated 11 million people living with some degree of vision loss in South Africa, of whom 370 000 were classified as blind.

Avoidable blindness caused by uncorrected refractive error (vision problems that require spectacles or contact lenses) and cataracts can be well managed in the presence of a capable workforce that is both accessible and affordable to the public. As such, optometrists are crucial in combating avoidable vision loss. Their expertise in conducting comprehensive eye examinations, diagnosing and managing some eye diseases, prescribing corrective lenses, and providing preventive care is vital for reducing the burden of avoidable blindness.

But the current landscape of optometry services in South Africa reveals significant gaps in both governance and resource allocation.

The distribution of optometrists in South Africa is far from optimal. As of April 2023, there were approximately 4200 registered optometrists and 580 ophthalmologists in the country. While this is a considerable number of people trained to provide primary eye care services, the 6.7% serving the public sector – compared to 93.3% serving the private sector – is simply inadequate and has created stark disparities.

The available evidence points to an urban-rural divide in optometry services, with only around 262 optometrists employed in the public sector nationally, distributed disproportionately between and within provinces. This means that rural and poor communities, where a significant portion of the population resides, have very limited access to essential eye care services.

Further deepening the disparities in access to essential eye care is the government’s fragmented and inconsistent approach to eye health across provinces, resulting in some areas lacking any public eye care services, while others depend on external providers.

Training misalignment

All of these challenges come against the backdrop of substantial state investment in the training of optometrists. The government funds their training at several universities across the country. However, the majority of these graduates are absorbed into the private sector. In some instances, students trained on state bursaries struggle to get placed in the public sector.

This misalignment highlights a fundamental flaw in how public funds are utilised, with minimal benefit to the broader population that relies on public healthcare. It also contradicts the government’s mandate to provide progressive solutions to improve access to healthcare for all, as enshrined in the Constitution.

These ongoing governance gaps and the inefficient use of state resources also represent significant barriers to achieving health equity in South Africa as expressed in government’s plans for National Health Insurance (NHI). And while the implementation of NHI aims to bring our country closer towards universal health coverage, it is not yet clear whether, and to what extent, vision and eye care services will be included in the envisioned basket of services.

A lack of a clear plan could result in a missed opportunity to integrate optometrists into the primary healthcare system nationally.

What to do

Firstly, there needs to be an urgent reassessment of the costs to train optometrists against the benefits to the broader public. Are we training too many optometrists currently? Could the government initiate engagements with thought leaders and support partners to develop a community service and costing exercise to address the inequity and lack of access to eye health services, and simultaneously address the employment of optometrists within the public health space?

Secondly, the National Department of Health should establish a dedicated directorate for eye health services which should be integrated within provincial health structures. This unit should spearhead a comprehensive data collection system for vision and eye health which can be used to accurately assess needs, allocate resources, and plan effectively.

Calls for such a dedicated directorate have been made through scientific recommendation for more than a decade. But there has been no meaningful response and action from the health department and related decision-making entities.

Thirdly, the principles behind NHI offer a medium-term solution to address the disproportionate distribution of optometrists. Through the establishment of NHI-style public-private partnerships, private sector capacity can be leveraged to serve people who depend on the public sector. Such a public-private partnership will have to have transparency, accountability, and data integrity built into its structures. This will allow provinces and districts to monitor accurate data, and provide feedback that will help shape and improve services.

In summary, the health department stands at a critical juncture, where the systemic imbalances in optometrist distribution and vision care services have now become acute – with people in South Africa paying a very concrete and personal price in the form of avoidable vision loss. Delays in governance processes have historically hampered progress, but the need for swift and informed action is now paramount. The principles of public-private partnership that underlie NHI point to a solution, but the urgency of the crisis means we do not have the time to wait for the full NHI plans to be rolled out – by government’s own admission that will take many years. People losing their eyesight today simply can’t wait that long.

*Majid and Sparks are Global Atlantic fellows for Health Equity in South Africa and advocates on the National Eye Health Advocacy Project led by USAWA for learning and healing, a civil society organisation committed to reforms for health equity and social justice.

Note: Spotlight aims to deepen public understanding of important health issues by publishing a variety of views on its opinion pages. The views expressed in this article are not necessarily shared by the Spotlight editors.

Republished from Spotlight under a Creative Commons licence.

Read the original article

New Insights and Potential Treatments for Pulmonary Hypertension

Human heart. Credit: Scientific Animations CC4.0

A new study from researchers with UCLA Health and collaborating organisations has found that asporin, a protein encoded by the ASPN gene, plays a protective role in pulmonary arterial hypertension (PAH).

Their findings, out now in the peer-reviewed journal Circulation, offer new insights into this incurable, often-fatal disease and suggest potential new ways to treat it. The ASPN gene is part of a group of genes associated with the cartilage matrix.

“We were surprised to find that asporin, which previously had not been linked to PAH, gets upregulated to increased levels as a response to counteract this disease process,” said Dr Jason Hong, a pulmonary and critical care physician at UCLA Health and the study’s corresponding author. “This novel finding opens up new avenues for understanding PAH pathobiology and developing potential therapies.” 

Pulmonary hypertension is a serious medical condition characterised by high blood pressure in the arteries that supply the lungs. It causes these arteries to narrow or become blocked, which, in turn, slows blood flow to the heart, requiring it to work harder to pump blood through the lungs. Eventually, the heart muscle becomes weak and begins to fail. 

Need for New Therapies

According to recent estimates, pulmonary hypertension affects about 1% of the global population, and that number climbs to 10% in people aged 65 or older. 

There’s no cure for the disease, but medications and lifestyle changes can help slow progression, manage symptoms and prolong life.

The urgent need for new therapies, combined with the potential of multiomics – an integrated approach to drive discovery across multiple levels of biology – inspired Hong and research colleagues, including co-first author Lejla Medzikovic and senior author Mansoureh Eghbali to take a deep dive into the disease. Both work at UCLA’s Eghbali Laboratory.

Methodology

For the study, the researchers applied novel computational methods, including transcriptomic profiling and deep phenotyping, to lung samples from 96 PAH patients and 52 control subjects without the condition, drawn from the largest multicentre PAH lung biobank available to date. They integrated these data with clinical information, genome-wide association studies, probabilistic graphical models and multiomics analysis.

“Our detailed analysis found higher levels of asporin in the lungs and plasma of PAH patients, which were linked to less severe disease,” Hong said.

Additionally, Medzikovic noted that their cell and living-organism experiments found that asporin inhibited pulmonary artery smooth muscle cell proliferation and a key signaling pathway that occurs with PAH.

“We also demonstrated that recombinant asporin treatment reduced PAH severity in preclinical models,” said Medzikovic.

Next Steps

Hong and colleagues plan to further investigate the mechanisms by which asporin exerts its protective effects in PAH and explore potential therapeutic applications, focusing on how to translate their findings into clinical trials.

“Asporin represents a promising new target for therapeutic intervention in pulmonary arterial hypertension,” he explained. “Enhancing asporin levels in PAH patients could potentially lead to improved clinical outcomes and reduced disease progression.”

Source: University of California – Los Angeles Health Sciences

Opinion Piece: Mitigate Risks and Enhance Efficiency – the ISO Accreditation Advantage

Photo by Scott Graham on Unsplash

By Robert Erasmus, Managing Director at Sanitech

ISO accreditation is a strategic investment that empowers businesses to enhance their competitiveness, mitigate risks, and seize new market opportunities. By adhering to globally recognised standards, organisations can build trust, streamline operations, and achieve sustainable growth. While the initial outlay may seem substantial, the long-term returns in terms of efficiency, customer satisfaction, and regulatory compliance far exceed the costs.

Building credibility and adherence to global standards

In today’s increasingly globalised business landscape, standing out from the competition is essential. ISO accreditation acts as a powerful endorsement, signifying a company’s commitment to quality, efficiency, and adherence to international best practices. By obtaining ISO certification, businesses can better mitigate risk while demonstrating their credibility and reliability to customers, suppliers, and stakeholders alike. 

How ISO standards provide a framework for best practices

ISO standards provide a structured approach to managing various aspects of an organisation’s operations, ensuring consistent performance and compliance with customer and regulatory expectations. These standards offer a comprehensive framework that guides companies in identifying, managing, and continually improving their processes, which ultimately provides an effective means of identifying and managing risk throughout the business. Here’s a brief breakdown of how specific ISO standards contribute to this:

ISO 9001: Quality Management System

  • Customer focus: Defines processes to understand customer needs and expectations, ensuring products or services meet or exceed these requirements. 
  • Process-based approach: Establishes a systematic approach to identifying, managing, and controlling processes to achieve desired outcomes. 
  • Continuous improvement: Promotes a culture of continual improvement by monitoring processes, identifying opportunities for enhancement, and implementing changes.

ISO 14001: Environmental Management System

  • Environmental Impact Assessment: Requires organisations to identify, assess, and control environmental impacts of their activities. 
  • Legal compliance: Ensures adherence to environmental laws and regulations. 
  • Resource efficiency: Promotes the efficient use of resources and waste reduction. 
  • Stakeholder engagement: Encourages dialogue with stakeholders to address environmental concerns.

ISO 22000: Food Safety Management System

  • Hazard Analysis and Critical Control Points (HACCP): Implements a systematic approach to identifying, assessing, and controlling food safety hazards. 
  • Supply chain management: Addresses food safety throughout the entire supply chain. 
  • Regulatory compliance: Ensures compliance with food safety regulations and standards. 

ISO 45001: Occupational Health and Safety Management System

  • Risk assessment: Identifies and assesses occupational health and safety risks. 
  • Legal compliance: Ensures compliance with occupational health and safety legislation. 
  • Emergency preparedness: Develops and implements emergency procedures. 
  • Employee involvement: Encourages employee participation in health and safety initiatives. 

ISO standards incorporate several fundamental elements to ensure consistent performance and improvement. These include the Plan-Do-Check-Act (PDCA) cycle for continuous enhancement, robust risk management practices, comprehensive documentation, regular internal and external audits to assess effectiveness, as well as periodic management reviews to evaluate overall performance and identify areas for improvement. By adopting these standards, organisations can leverage this robust framework for managing their operations, ensuring consistent performance, which positions the organisation to be able to meet the evolving needs of customers and regulatory authorities.

The benefits of working with ISO-certified suppliers

Choosing ISO-certified suppliers can significantly enhance a business’s supply chain resilience. It gives companies the peace of mind that their suppliers adhere to rigorous standards, ensuring consistent product and service quality and reducing the risk of defects or errors.

There is also a reduction in the risk of non-compliance, as ISO-certified suppliers have robust systems in place to manage compliance with regulatory requirements, mitigating legal and financial risks. Additionally, ISO accreditation promotes efficient operations, and well-documented processes lead to smoother collaboration and predictable outcomes. In short, partnering with ISO-certified suppliers strengthens a company’s supply chain reputation, inspiring trust among customers.

The bottom line of ISO accreditation

While the initial costs of ISO accreditation may be substantial, the long-term benefits are undeniable. By investing in ISO certification, businesses can enhance their credibility, improve operational efficiency, mitigate risks, and gain a competitive edge. Just as important, working with ISO-certified suppliers strengthens a company’s supply chain, ensuring the delivery of high-quality products and services. This in turn leads to increased customer satisfaction, loyalty, and business growth. 

As such, ISO accreditation is not merely a compliance exercise; it is a strategic investment that empowers businesses to thrive in today’s challenging market. By understanding the value of different ISO standards and the advantages of working with ISO-certified suppliers, companies can make informed decisions to drive sustainable success.

Reconsidering Dialysis for Chronic Kidney Failure in the Elderly

Chronic kidney disease (CKD). Credit: Scientific Animations CC4.0

Whether dialysis is the best option for kidney failure and, if so, when to start, may deserve more careful consideration, according to a new study published in Annals of Internal Medicine.

For older adults who were not healthy enough for a kidney transplant, starting dialysis when their kidney function fell below a certain threshold, rather than waiting, afforded them roughly one more week of life, Stanford Medicine researchers and their colleagues found.

More critically, perhaps, they spent an average of two more weeks in hospitals or care facilities, in addition to the time spent undergoing dialysis.

“Is that really what a 75- or 80-year-old patient wants to be doing?” asked lead author Maria Montez Rath, PhD, a senior research engineer. Manjula Tamura, MD, a professor of nephrology, is the senior author.

“For all patients, but particularly for older adults, understanding the trade-offs is really essential,” Tamura said. “They and their physicians should carefully consider whether and when to proceed with dialysis.”

Patients with kidney failure who are healthy enough for transplantation may receive a donated kidney, which will rid their blood of toxins and excess fluid. But that option is unavailable to many older adults who have additional health conditions such as heart or lung disease or cancer.

For those patients, physicians often recommend dialysis when patients progress to kidney failure – when estimated glomerular filtration rate (eGFR), a measure of renal function, falls below 15.

Patients and their family members sometimes assume that dialysis is their only option, or that it will prolong life significantly, Montez Rath said. “They often say yes to dialysis, without really understanding what that means.”

But patients can take medications in lieu of dialysis to manage symptoms of kidney failure such as fluid retention, itchiness and nausea, Tamura said. She added that dialysis has side effects, such as cramping and fatigue, and typically requires a three- to four-hour visit to a clinic three times a week.

“It’s a pretty intensive therapy that entails a major lifestyle change,” she said.

Lifespan and time at home

The researchers conducted the study to quantify what dialysis entails for older adults who are ineligible for a transplant: whether and how much it prolongs life, along with the relative number of days spent in an inpatient facility such as a hospital, nursing home or rehabilitation center.

The team evaluated the health records, from 2010 to 2018, of 20 440 patients (98% of them men) from the U.S. Department of Veterans Affairs. The patients were 65 and older, had chronic kidney failure, were not undergoing evaluation for transplant and had an eGFR below 12.

Simulating a randomised clinical trial with electronic health records, they divided patients into two groups: those who started dialysis immediately, and those who waited at least a month. Over three years, about half of the patients in the group who waited never started dialysis.

Patients who started dialysis immediately lived on average nine days longer than those who waited, but they spent 13 more days in an inpatient facility. Age made a difference: patients 65 to 79 who started dialysis immediately on average lived 17 fewer days while spending 14 more days in an inpatient facility; patients 80 and older who started dialysis immediately on average lived 60 more days but spent 13 more days in an inpatient facility.

Patients who never underwent dialysis on average died 77 days earlier than those who started dialysis immediately, but they spent 14 more days at home.

“The study shows us that if you start dialysis right away, you might survive longer, but you’re going to be spending a lot of time on dialysis, and you’re more likely to need hospitalization,” Montez Rath said.

Tamura noted that physicians sometimes recommend dialysis because they want to offer patients hope or because the downsides of the treatment haven’t always been clear. But the study indicates physicians and patients may want to wait until the eGFR drops further, Tamura said, and should consider symptoms along with personal preferences before starting dialysis.

“Different patients will have different goals,” she said. “For some it’s a blessing to have this option of dialysis, and for others it might be a burden.”

It may be helpful, she added, if clinicians portray dialysis for frail, older adults as a palliative treatment – primarily intended to alleviate symptoms.

“Currently, dialysis is often framed to patients as a choice between life and death,” she said. “When it’s presented in this way, patients don’t have room to consider whether the treatment aligns with their goals, and they tend to overestimate the benefits and well-being they might experience. But when treatment is framed as symptom-alleviating, patients can more readily understand that there are trade-offs.”

Source: Stanford Medicine

Exposure to Chronic Occupational Noise Drives up Blood Pressure

Photo by Emmanuel Ikwuegbe on Unsplash

Noise exposure is a known occupational hazard in some jobs, particularly for hearing loss, physical and psychological stress, and reduced concentration. A new study presented at the ACC Asia 2024 conference found that in adult power loom weavers, chronic noise exposure not only raised blood pressure overall, but each year of exposure also increased the odds of having high blood pressure by 10%.

“While the mechanism is still not well-explored, it is thought that the stress response by the body to chronic sound exposure causes hormonal imbalances that gradually lead to a permanent elevation of blood pressure,” said Golam Dastageer Prince, MBBS, MPH, medical officer at DGHS Bangladesh and the study’s lead author. “High blood pressure impacts more than a billion people worldwide and just 1 in 5 have it under control, yet it is a major cause of premature death. In addition to treating the high blood pressure through appropriate means, we must find ways to mitigate the exposure to the noise if we want to reduce the cardiovascular risk of these patients.”

Researchers at the Directorate of General Health Services in Bangladesh looked at 289 adult workers in selected weaving factories in the Araihazar sub-district of Narayanganj, Bangladesh, from January to December 2023. Participants completed a questionnaire in a face-to-face interview covering sociodemographic variables, behaviour, dietary habits and family medical history. Blood pressure, height, weight and noise intensity were measured by the researchers following standard procedures.

The study cohort was predominantly male and married, and participants were about 34 years of age on average. According to the researchers, a notable proportion of the cohort was illiterate. Workplace exposure duration averaged nearly 16 years, with noise intensity ranging from 96 to 111 decibels. In the United States, the National Institute for Occupational Safety and Health has set the recommended exposure limit for occupational noise at an average of 85 decibels over an eight-hour workday. Sounds at or below 70 decibels are generally considered safe.
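
For context, the NIOSH recommended exposure limit uses a 3-decibel exchange rate, meaning each 3dB above the 85dB eight-hour limit halves the allowable daily exposure time. The sketch below applies that standard formula to the noise range reported in the factories; it is a back-of-envelope illustration, not an analysis from the study itself.

```python
# Allowable daily exposure time under the NIOSH recommended exposure limit
# (85 dBA over 8 hours, 3 dB exchange rate). Illustrative calculation only,
# not a result from the study.

def niosh_allowable_hours(level_dba, rel_dba=85.0, exchange_rate=3.0):
    """Hours of exposure at a constant level before the NIOSH daily dose is reached."""
    return 8.0 / (2.0 ** ((level_dba - rel_dba) / exchange_rate))

for level in (85, 96, 105, 111):   # 96-111 dB is the range measured in the factories
    hours = niosh_allowable_hours(level)
    print(f"{level} dBA -> about {hours * 60:6.1f} minutes per day")
```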

According to Prince, none of the study population was found to be wearing ear protection personal protective equipment.

“Hopefully we can raise awareness of not only noise-induced hearing loss, but the impact of noise on blood pressure and workers’ behaviors and attitudes towards using personal protective equipment,” Prince said. “Pushing for structural improvements to industries may also help us improve the health safety of these workers.”

The study population had a 31.5% rate of high blood pressure with an additional 53.3% being prehypertensive. The study also found a positive correlation between blood pressure and noise exposure duration. Each year of exposure was found to increase high blood pressure odds by 10%, even after adjusting for age, body mass index and smoking status.
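
To see what a 10% increase in odds per year of exposure implies when compounded over a working life, the short sketch below multiplies the per-year odds ratio across several exposure durations. The baseline odds are an arbitrary assumption for illustration; only the 10%-per-year figure comes from the study.

```python
# Compounding a per-year odds ratio of 1.10 for high blood pressure, as reported
# above. The baseline odds below are an arbitrary illustration; only the 10%
# per-year figure comes from the study.

OR_PER_YEAR = 1.10

def cumulative_odds_ratio(years):
    return OR_PER_YEAR ** years

baseline_odds = 0.10   # assumed odds for an unexposed comparator (illustrative)

for years in (5, 10, 16, 25):   # 16 years was the average exposure in the cohort
    odds = baseline_odds * cumulative_odds_ratio(years)
    prob = odds / (1 + odds)
    print(f"{years:2d} years: cumulative OR {cumulative_odds_ratio(years):5.2f}, "
          f"illustrative probability {prob:.0%}")
```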

“As the study focused on workers exposed to more than 85 decibels noise for long periods of time, any profession causing workers to experience similar exposure might experience similar blood pressure impacts,” Prince said. “We definitely need more exploratory studies to reveal more information about the potential mechanisms and long-term health outcomes.”

Recent studies have shown that living near noise pollution, including highways, trains and air traffic, can have an impact on cardiovascular health. However, the current study may not apply to noise experienced during daily life. Noise pollution experienced near home typically ebbs and flows, while the industrial exposures in the study are typically continuous in pattern due to the machinery and remain at a constant sound level, according to Prince.

Source: American College of Cardiology

Taming Parkinson’s Disease with Adaptive Deep Brain Stimulation

Deep brain stimulation illustration. Credit: NIH

Two new studies from UC San Francisco are pointing the way toward round-the-clock personalised care for people with Parkinson’s disease through an implanted device that can treat movement problems during the day and insomnia at night. 

The approach, called adaptive deep brain stimulation, or aDBS, uses methods derived from AI to monitor a patient’s brain activity for changes in symptoms. 

When it spots them, it intervenes with precisely calibrated pulses of electricity. The therapy complements the medications that Parkinson’s patients take to manage their symptoms, giving less stimulation when the drug is active, to ward off excess movements, and more stimulation as the drug wears off, to prevent stiffness.

It is the first time a so-called “closed loop” brain implant technology has been shown to work in Parkinson’s patients as they go about their daily lives. The device picks up brain signals to create a continuous feedback mechanism that can curtail symptoms as they arise. Users can switch out of the adaptive mode or turn the treatment off entirely with a hand-held device.
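
Conceptually, a closed-loop system of this kind repeatedly reads a symptom-related brain signal and nudges the stimulation toward a target band. The toy sketch below illustrates that feedback idea only; the signal model, thresholds and update rule are invented for demonstration and do not represent the UCSF algorithms or any device’s interface.

```python
# Toy illustration of the closed-loop idea behind adaptive DBS: read a
# symptom-related biomarker, compare it to a target band, and adjust the
# stimulation amplitude accordingly. All numbers and the update rule are
# invented for demonstration; this is not the UCSF algorithm or a device API.
import random

TARGET_LOW, TARGET_HIGH = 0.4, 0.6   # assumed acceptable biomarker band
STEP = 0.1                            # mA change per control cycle (illustrative)
MIN_MA, MAX_MA = 0.0, 4.0             # assumed safe amplitude limits

def read_biomarker(stim_ma):
    """Stand-in for a sensed brain signal: higher stimulation lowers it."""
    return max(0.0, 1.0 - 0.2 * stim_ma + random.uniform(-0.05, 0.05))

stim = 1.0   # starting amplitude in mA (illustrative)
for cycle in range(10):
    biomarker = read_biomarker(stim)
    if biomarker > TARGET_HIGH:        # symptom signal too strong -> stimulate more
        stim = min(MAX_MA, stim + STEP)
    elif biomarker < TARGET_LOW:       # signal suppressed -> back off stimulation
        stim = max(MIN_MA, stim - STEP)
    print(f"cycle {cycle}: biomarker {biomarker:.2f}, amplitude {stim:.1f} mA")
```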

For the first study, researchers conducted a clinical trial with four people to test how well the approach worked during the day, comparing it to an earlier implanted DBS technology known as constant DBS, or cDBS. 

To ensure the treatment provided the maximum relief to each participant, the researchers asked them to identify their most bothersome symptom. The new technology reduced those symptoms by 50%. Results appear August 19 in Nature Medicine.

“This is the future of deep brain stimulation for Parkinson’s disease,” said senior author Philip Starr, MD, PhD, the Dolores Cakebread Professor of Neurological Surgery and co-director of the UCSF Movement Disorders and Neuromodulation Clinic.

Starr has been laying the groundwork for this technology for more than a decade. In 2013, he developed a way to detect and then record the abnormal brain rhythms associated with Parkinson’s. In 2021, his team identified specific patterns in those brain rhythms that correspond to motor symptoms.

“There’s been a great deal of interest in improving DBS therapy by making it adaptive and self-regulating, but it’s only been recently that the right tools and methods have been available to allow people to use this long-term in their homes,” said Starr, who was recruited by UCSF in 1998 to start its DBS program.

Earlier this year, UCSF researchers led by Simon Little, MBBS, PhD, demonstrated in Nature Communications that adaptive DBS has the potential to alleviate the insomnia that plagues many patients with Parkinson’s. 

“The big shift we’ve made with adaptive DBS is that we’re able to detect, in real time, where a patient is on the symptom spectrum and match it with the exact amount of stimulation they need,” said Little, associate professor of neurology and a senior author of both studies. Both Little and Starr are members of the UCSF Weill Institute for Neurosciences.

Restoring movement

Parkinson’s disease affects about 10 million people around the world. It arises from the loss of dopamine-producing neurons in deep regions of the brain that are responsible for controlling movement. The lack of those cells can also cause non-motor symptoms, affecting mood, motivation and sleep.

Treatment usually begins with levodopa, a drug that replaces the dopamine these cells are no longer able to make. However, excess dopamine in the brain as the drug takes effect can cause uncontrolled movements, called dyskinesia. As the medication wears off, tremor and stiffness set in again.  

Some patients then opt to have a standard cDBS device implanted, which provides a constant level of electrical stimulation. Constant DBS may reduce the amount of medication needed and partially reduce swings in symptoms. But the device also can over- or undercompensate, causing symptoms to veer from one extreme to the other during the day.

Closing the loop

To develop a DBS system that could adapt to a person’s changing dopamine levels, Starr and Little needed to make the DBS capable of recognising the brain signals that accompany different symptoms. 

Previous research had identified patterns of brain activity related to those symptoms in the subthalamic nucleus, or STN, the deep brain region that coordinates movement. This is the same area that cDBS stimulates, and Starr suspected that stimulation would mute the signals they needed to pick up.

So, he found alternative signals elsewhere in the brain – the motor cortex – that wouldn’t be weakened by the DBS stimulation. 

The next challenge was to work out how to develop a system that could use these dynamic signals to control DBS in an environment outside the lab. 

Building on findings from adaptive DBS studies that he had run at Oxford University a decade earlier, Little worked with Starr and the team to develop an approach for detecting these highly variable signals across different medication and stimulation levels.  

Over the course of many months, postdoctoral scholars Carina Oerhn, PhD, Lauren Hammer, PhD, and Stephanie Cernera, PhD, created a data analysis pipeline that could turn all of this into personalised algorithms to record, analyse and respond to the unique brain activity associated with each patient’s symptom state.

John Ngai, PhD, who directs the Brain Research Through Advancing Innovative Neurotechnologies® initiative (The BRAIN Initiative®) at the National Institutes of Health, said the study promises a marked improvement over current Parkinson’s treatment. 

“This personalised, adaptive DBS embodies The BRAIN Initiative’s core mission to revolutionise our understanding of the human brain,” he said. 

A better night’s sleep

Continuous DBS is aimed at mitigating daytime movement symptoms and doesn’t usually alleviate insomnia.

But in the last decade, there has been a growing recognition of the impact that insomnia, mood disorders and memory problems have on Parkinson’s patients. 

To help fill that gap, Little conducted a separate trial that included four patients with Parkinson’s and one patient with dystonia, a related movement disorder. In their paper published in Nature Communications, first author Fahim Anjum, PhD, a postdoctoral scholar in the Department of Neurology at UCSF, demonstrated that the device could recognise brain activity associated with various states of sleep. He also showed it could recognise other patterns that indicate a person is likely to wake up in the middle of the night. 

Little and Starr’s research teams, including their graduate student Clay Smyth, have started testing new algorithms to help people sleep. Their first sleep aDBS study was published last year in Brain Stimulation.  

Scientists are now developing similar closed-loop DBS treatments for a range of neurological disorders. 

“We see that it has a profound impact on patients, with potential not just in Parkinson’s but probably for psychiatric conditions like depression and obsessive-compulsive disorder as well,” Starr said. “We’re at the beginning of a new era of neurostimulation therapies.”

Source: University of California San Francisco