Year: 2024

How Human Brain Functional Networks Emerge and Develop during the Birth Transition

Shedding light on the growth trajectory of global functional neural networks before and after birth

Photo by Christian Bowen on Unsplash

Brain-imaging data collected from foetuses and infants has revealed a rapid surge in functional connectivity between brain regions on a global scale at birth, possibly reflecting neural processes that support the brain’s ability to adapt to the external world, according to a study published November 19th in the open-access journal PLOS Biology, led by Lanxin Ji and Moriah Thomason from the New York University School of Medicine, USA.

Understanding the sequence and timing of brain functional network development at the beginning of human life is critical. Yet many questions remain regarding how human brain functional networks emerge and develop during the birth transition. To fill this knowledge gap, Thomason and colleagues leveraged a large functional magnetic resonance imaging dataset to model developmental trajectories of brain functional networks spanning 25 to 55 weeks of post-conceptual gestational age. The final sample included 126 foetal scans and 58 infant scans from 140 subjects.

The researchers observed distinct growth patterns in different regions, showing that the neural changes accompanying the birth transition are not uniform across the brain. Some areas exhibited minimal changes in resting-state functional connectivity (RSFC) – the correlation between blood oxygen level-dependent (BOLD) signals from different brain regions when no explicit task is being performed – while other areas showed dramatic changes in RSFC at birth. The subcortical network, sensorimotor network, and superior frontal network stand out as regions that undergo rapid reorganisation during this developmental stage.
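To make the RSFC measure concrete, the sketch below computes it on simulated data: each region's BOLD signal is a time series, and RSFC is simply the matrix of pairwise correlations between them. The scan length, number of regions and the data themselves are invented for illustration and are not taken from the study.

```python
import numpy as np

# Minimal illustration of resting-state functional connectivity (RSFC):
# the correlation between BOLD time series from different brain regions,
# recorded with no explicit task. All data here are simulated.
rng = np.random.default_rng(0)
n_timepoints, n_regions = 200, 4                 # hypothetical scan length and parcellation
bold = rng.standard_normal((n_timepoints, n_regions))
bold[:, 1] += 0.5 * bold[:, 0]                   # give two regions a shared signal

# Pairwise Pearson correlations form the region-by-region connectivity matrix
rsfc = np.corrcoef(bold, rowvar=False)
print(np.round(rsfc, 2))
```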

Additional analysis highlighted the subcortical network as the only region that exhibited a significant increase in communication efficiency among neighbouring nodes. The subcortical network represents a central hub, relaying nearly all incoming and outgoing information to and from the cortex and mediating communication between cortical areas. On the other hand, there was a gradual increase in global efficiency in sensorimotor and parietal-frontal regions throughout the foetal to neonatal period, possibly reflecting the establishment or strengthening of connections as well as the elimination of redundant connections.
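The efficiency measures referred to here are standard graph metrics: global efficiency averages the inverse shortest-path length over all pairs of nodes, while local efficiency captures how well each node's neighbours communicate with one another. A toy example on a generic small-world graph (not the study's connectome) is sketched below.

```python
import networkx as nx

# Toy illustration of the graph metrics mentioned above, computed on a
# generic small-world network rather than the study's actual connectome.
G = nx.watts_strogatz_graph(n=20, k=4, p=0.2, seed=1)

# Global efficiency: mean inverse shortest-path length over all node pairs.
print("global efficiency:", round(nx.global_efficiency(G), 3))
# Local efficiency: average efficiency of each node's neighbourhood subgraph.
print("local efficiency:", round(nx.local_efficiency(G), 3))
```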

According to the authors, this work unveils fundamental aspects of early brain development and lays the foundation for future research on the influence of environmental factors on this process. In particular, further studies could reveal how factors such as sex, prematurity, and prenatal adversity interact with the timing and growth patterns of children’s brain network development.

The authors add, “This study for the first time documents the significant change of brain functional networks over the birth transition. We observe that growth patterns are regionally specific, with some areas of the functional connectome showing minimal changes, while others exhibit a dramatic increase at birth.”

Provided by PLOS

Genetically Tailored Diets for IBS may Soon be Possible

Irritable bowel syndrome. Credit: Scientific Animations CC4.0

An international study has found that genetic variations in human carbohydrate-active enzymes may affect how people with irritable bowel syndrome (IBS) respond to a carbohydrate-reduced diet.

The research, which is published in Clinical Gastroenterology & Hepatology, shows that IBS patients with genetic defects in carbohydrate digestion had a better response to certain dietary interventions. This could lead to tailored treatments for IBS, using genetic markers to predict which patients benefit from specific diets.

Irritable bowel syndrome (IBS) is a digestive disorder affecting up to 10% of the global population. It is characterised by abdominal pain, bloating, diarrhoea, or constipation. Despite its prevalence, treating IBS remains a challenge as symptoms and responses to dietary or pharmacological interventions vary significantly.

Patients often connect their symptoms to eating certain foods, especially carbohydrates, and dietary elimination or reduction has emerged as an effective treatment option, though not all patients experience the same benefits.

Nutrigenetics (the science investigating the combined action of our genes and nutrition on human health) has highlighted how changes in the DNA can affect the way we process food. A well-known example is lactose intolerance, where the loss of function in the lactase enzyme hinders the digestion of dairy products.

Now, this pioneering new study suggests that genetic variations in human carbohydrate-active enzymes (hCAZymes) may similarly affect how IBS patients respond to a carbohydrate-reduced (low-FODMAP) diet.

The team have now revealed that individuals with hypomorphic (defective) variants in hCAZyme genes are more likely to benefit from a carbohydrate-reduced diet.

The study, involving 250 IBS patients, compared two treatments: a diet low in fermentable carbohydrates (FODMAPs) and the antispasmodic medication otilonium bromide. Strikingly, of the 196 patients on the diet, those carrying defective hCAZyme genes showed marked improvement compared to non-carriers, and the effect was particularly pronounced in patients with diarrhoea-predominant IBS (IBS-D), who were six times more likely to respond to the diet. In contrast, this difference was not observed in patients receiving medication, underscoring the specificity of genetic predisposition in dietary treatment efficacy.
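The "six times more likely" figure is the kind of result usually expressed as an odds ratio comparing responders and non-responders among carriers versus non-carriers. The sketch below shows that calculation on an invented 2×2 table; only the roughly six-fold effect in IBS-D patients comes from the article, and the counts themselves are hypothetical.

```python
# Hedged illustration of an odds-ratio comparison of diet response in carriers
# vs non-carriers of hypomorphic hCAZyme variants. The counts are invented for
# illustration; only the ~6-fold effect in IBS-D patients is from the article.
carriers     = {"responders": 18, "non_responders": 6}    # hypothetical
non_carriers = {"responders": 30, "non_responders": 60}   # hypothetical

odds_carriers     = carriers["responders"] / carriers["non_responders"]
odds_non_carriers = non_carriers["responders"] / non_carriers["non_responders"]
print(f"odds ratio: {odds_carriers / odds_non_carriers:.1f}")  # 6.0 with these made-up counts
```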

These findings suggest that genetic variations in hCAZyme enzymes, which play a key role in digesting carbohydrates, could become critical markers for designing personalised dietary treatments for IBS. The ability to predict which patients respond best to a carbohydrate-reduced diet has the potential to strongly impact IBS management, leading to better adherence and improved outcomes.

The study was led by Dr D’Amato of the Gastrointestinal Genetics Research group at CIC bioGUNE and the Department of Medicine and Surgery at LUM University in Italy.

In the future, incorporating knowledge of hCAZyme genotype into clinical practice could enable clinicians to identify in advance which patients are most likely to benefit from specific dietary interventions. This would not only avoid unnecessary restrictive diets for those unlikely to benefit but also open the door to personalised medicine in IBS.

Source: University of Nottingham

New Minimally Invasive Neural Interface is a Revolutionary Development

The researchers’ experiments showed that the catheter electrodes could be successfully delivered and guided into the ventricular spaces and brain surface for electrical stimulation. Image courtesy of Rice University.

A team of researchers has developed a technique for diagnosing, managing and treating neurological disorders with minimal surgical risks. The team’s findings were published in Nature Biomedical Engineering.

While traditional approaches for interfacing with the nervous system often require creating a hole in the skull to interface with the brain, the researchers have developed an innovative method known as endocisternal interfaces (ECI), allowing for electrical recording and stimulation of neural structures, including the brain and spinal cord, through cerebrospinal fluid (CSF).

“Using ECI, we can access multiple brain and spinal cord structures simultaneously without ever opening up the skull, reducing the risk of complications associated with traditional surgical techniques,” said study leader Jacob Robinson, professor of electrical and computer engineering and bioengineering at Rice University.

ECI uses CSF, which surrounds the nervous system, as a pathway to deliver targeted devices. By performing a simple lumbar puncture in the lower back, researchers can navigate a flexible catheter to access the brain and spinal cord.

Using miniature magnetoelectric-powered bioelectronics, the entire wireless system can be deployed through a small percutaneous procedure. The flexible catheter electrodes can be navigated freely from the spinal subarachnoid space to the brain ventricles.

“This is the first reported technique that enables a neural interface to simultaneously access the brain and spinal cord through a simple and minimally invasive lumbar puncture,” said University of Texas Medical Branch’s Peter Kan, professor and Chair of Neurosurgery, who also led the study. “It introduces new possibilities for therapies in stroke rehabilitation, epilepsy monitoring and other neurological applications.”

To test the hypothesis, the research team characterised the endocisternal space and measured the width of the subarachnoid, or fluid-filled space, in human patients using MRI. The researchers then conducted experiments in large animal models, specifically sheep, to validate the feasibility of the new neural interface.

Their experiments showed that the catheter electrodes could be successfully delivered and guided into the ventricular spaces and brain surface for electrical stimulation. By using the magnetoelectric implant, the researchers were able to record electrophysiologic signals such as muscle activation and spinal cord potentials.

Preliminary safety results showed that the ECI remained functional with minimal damage up to 30 days after the electronic device was implanted chronically into the brain.

Moreover, the study revealed that unlike endovascular neural interfaces that require antithrombotic medication and are limited by the small size and location of blood vessels, ECI offers broader access to neural targets without the medication.

“This technology creates a new paradigm for minimally invasive neural interfaces and could lower the risk of implantable neurotechnologies, enabling access to wider patient populations,” said Josh Chen, lead author of the study.

Source: Rice University

Adding Vitamin C to Chemotherapy Doubles Pancreatic Cancer Survival Time

Pancreatic cancer. Credit: Scientific Animations CC BY-SA 4.0

Results from a randomised, phase 2 clinical trial show that adding high-dose, intravenous (IV) vitamin C to chemotherapy doubles the overall survival of patients with late-stage metastatic pancreatic cancer from eight months to 16 months. 

“This is a deadly disease with very poor outcomes for patients. The median survival is eight months with treatment, probably less without treatment, and the five-year survival is tiny,” says Joe Cullen, MD, University of Iowa professor of surgery and radiation oncology, and senior author of the study. “When we started the trial, we thought it would be a success if we got to 12 months survival, but we doubled overall survival to 16 months. The results were so strong in showing the benefit of this therapy for patient survival that we were able to stop the trial early.”

The findings, published in Redox Biology, mark another success for high-dose, intravenous vitamin C, which has overcome many hurdles in the almost 20 years UI researchers have persevered to demonstrate its benefit for cancer patients. 

“We’ve had ups and downs of course, but this is a culmination of a lot of people’s hard work,” says Cullen who also is a member of UI Health Care Holden Comprehensive Cancer Center. “It’s really a positive thing for patients and for the University of Iowa.”

Increased survival, improved quality of life

In the study, 34 patients with stage 4 metastatic pancreatic cancer were randomized to receive either standard chemotherapy (gemcitabine and nab-paclitaxel), or the chemotherapy plus infusions of high-dose vitamin C. The results showed that average overall survival was 16 months for the patients receiving the chemotherapy plus vitamin C, compared to eight months for the patients getting just chemotherapy. In addition, progression free survival was extended from four months to six months. 

“Not only does it increase overall survival, but the patients seem to feel better with the treatment,” Cullen says. “They have less side effects, and appear to be able to tolerate more treatment, and we’ve seen that in other trials, too.” 

The new study is not the only evidence of the benefit of including IV vitamin C as part of cancer treatment. Earlier this year, the results of another UI phase 2 clinical trial in patients with glioblastoma, a deadly form of brain cancer, were published. That study also showed a significant increase in survival when high-dose, IV vitamin C was added to standard of care chemotherapy and radiation. Cullen was also part of that trial along with his colleague Bryan Allen, MD, PhD, UI professor and head of radiation oncology. 

A third phase 2 trial in non-small cell lung cancer is still underway, with results expected within the year. All three trials were funded by a 2018 grant from the National Cancer Institute (NCI).

“This NCI funding was incredibly important for us to conduct these phase 2 trials and obtain these really encouraging results. Our aim is to show that adding high-dose, IV vitamin C, which is very inexpensive and very well tolerated, can improve treatment for these cancers that are among the deadliest affecting the U.S. population,” Cullen adds. 

A long journey to clinical trials

Cullen, Allen, and their colleagues at UI Health Care have been researching the anti-cancer effect of high-dose, IV vitamin C for decades. Their work revealed a critical difference between intravenous and oral vitamin C. Intravenous vitamin C administration produces very high levels in the blood, which cannot be achieved with oral delivery. These high concentrations result in unique chemical reactions within cancer cells that render the cell more vulnerable to chemo- and radiation therapies. 

Cullen notes that despite scepticism towards vitamin C as a cancer therapy, the results he and his colleagues have obtained, from the basic science work that established the biological mechanisms at play through to the various clinical trials, have been highly encouraging and robust.

“Through every step of the process, it continued to improve. We did it in cells, it worked great. We did it in mice, it worked great. Then our phase one trials looked very promising. So, the progression has just been phenomenal, really,” Cullen says. “For example, in one of our phase 1 trials for pancreatic cancer, where we combine high-dose, IV vitamin C with radiation, we still have three long-term survivors. They’re out nine years at this point, which is far beyond the typical survival range.” 

Source: University of Iowa Health Care

A Multiple Sclerosis Drug may Help with Poor Working Memory

This is a pseudo-coloured image of high-resolution gradient-echo MRI scan of a fixed cerebral hemisphere from a person with multiple sclerosis. Credit: Govind Bhagavatheeshwaran, Daniel Reich, National Institute of Neurological Disorders and Stroke, National Institutes of Health

Fampridine is currently used to improve walking ability in multiple sclerosis. A new study shows that it could also help individuals with reduced working memory, as seen in mental health conditions like schizophrenia or depression.

Working memory allows information to be actively held in mind for a few seconds, for cognitive tasks such as remembering an email address long enough to save it, or following a conversation. Certain conditions, such as schizophrenia or depression, as well as ADHD, impair working memory. Those affected lose track in conversations and struggle to organise their thoughts.

Fampridine is a drug that could help in such cases, as shown in a study led by Professors Andreas Papassotiropoulos and Dominique de Quervain at the University of Basel. The team has reported their findings in the journal Molecular Psychiatry.

Effective only if working memory is poor

In their study, the researchers tested the effectiveness of fampridine on working memory in 43 healthy adults. It was in those participants whose baseline working memory was at a low level that fampridine showed a more pronounced effect: after taking the active substance for three days, they scored better in the relevant tests than those who took the placebo. In contrast, in people who already had good baseline working memory, the drug showed no effect.

The researchers also observed that fampridine increased brain excitability in all participants, thus enabling faster processing of stimuli. The study was randomized and double-blind.

Established drug, new application

“Fampridine doesn’t improve working memory in everyone. But it could be a treatment option for those with reduced working memory,” explains Andreas Papassotiropoulos. Dominique de Quervain adds: “That’s why, together with researchers from the University Psychiatric Clinics Basel (UPK), we’re planning studies to test the efficacy of fampridine in schizophrenia and depression.”

The drug is currently used to improve walking ability in multiple sclerosis (MS). Particularly in capsule form, which releases the active ingredient slowly in the body, fampridine has shown effects on cognitive performance in MS patients: for some, it alleviates the mental fatigue that can accompany MS.

The researchers did not select the drug at random: this study followed comprehensive analyses of genome data in order to find starting points for repurposing established drugs. Fampridine acts on specific ion channels in nerve cells that, according to the researchers’ analyses, also play a role in mental disorders such as schizophrenia.

Source: University of Basel

A Link between Heart Shape and Cardiovascular Disease Risk

Researchers from Queen Mary University of London and other universities have for the first time examined the genetic basis of the heart’s left and right ventricles using advanced 3D imaging and machine learning.

Prior research primarily focused on the heart’s size and volume and specific chambers. By studying both ventricles together, the team was able to capture the more intricate, multi-dimensional aspects of the heart shape.

This new approach of exploring shape has led to the discovery of new heart-associated genes and provided a better understanding of the biological pathways linking heart shape to cardiovascular disease.

Cardiovascular disease is among the leading causes of death in the UK and globally. The findings of this study could change how cardiac disease risk is evaluated. Genetic information related to heart shape can provide a risk score for heart disease, offering potentially earlier and more tailored assessment in clinical settings.

“This study provides new information on how we think about heart disease risk,” said Patricia B. Munroe, Professor of Molecular Medicine at Queen Mary and co-author of the study. “We’ve long known that size and volume of the heart matter, but by examining shape, we’re uncovering new insights into genetic risks. This discovery could provide valuable additional tools for clinicians to predict disease earlier and with more precision.”

The team used cardiovascular MRI images from over 40 000 individuals from the UK Biobank to create 3D models of the ventricles. Through statistical analysis, they identified 11 shape dimensions that describe the primary variations in heart shape.
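The press release does not name the statistical method used to derive the 11 shape dimensions; dimensionality-reduction approaches such as principal component analysis are typical for this kind of 3D shape data, and the sketch below assumes PCA purely as an illustration, with random stand-ins in place of the aligned ventricle meshes.

```python
import numpy as np

# Illustrative only: derive low-dimensional "shape dimensions" from 3D models.
# PCA (via SVD) is assumed here; the release does not specify the method, and
# the data are random stand-ins for aligned ventricular surface meshes.
rng = np.random.default_rng(42)
n_subjects, n_mesh_coords = 500, 3000            # hypothetical cohort and mesh size
shapes = rng.standard_normal((n_subjects, n_mesh_coords))

centred = shapes - shapes.mean(axis=0)
_, _, modes = np.linalg.svd(centred, full_matrices=False)
shape_dimensions = centred @ modes[:11].T        # per-subject scores on 11 leading modes
print(shape_dimensions.shape)                    # (500, 11)
```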

Subsequent genetic analysis found 45 specific areas in the human genome linked to different heart shapes. Fourteen of these areas had not been previously known to influence heart traits.

“This study sets an important foundation for the exploration of genetics in both ventricles”, said Dr Richard Burns, Statistical Geneticist at Queen Mary. “The study confirms that combined cardiac shape is influenced by genetics, and demonstrates the usefulness of cardiac shape analysis in both ventricles for predicting individual risk of cardiometabolic diseases alongside established clinical measures.”

This research marks an exciting new chapter in understanding how genetics influence the heart and opens the doors to further studies on how these findings could be integrated into clinical practice, ultimately benefiting millions at risk of heart disease.

Source: Queen Mary University of London

Dr Jessica Voerman Highlights Key Healthcare Trends to Watch for in 2025 

Source: Pixabay CC0

The healthcare landscape is rapidly evolving, and 2025 is poised to bring significant changes driven by technological advancements and shifting patient needs. As the sector faces ongoing challenges such as rising costs, limited access, and increasing demand for mental health services, innovative solutions will be key to addressing these issues. From the rise of virtual healthcare and wearable technologies to the growing influence of artificial intelligence, these trends are reshaping how care is delivered and experienced.

“The healthcare sector must embrace innovation to address challenges like affordability and accessibility while leveraging technologies such as AI, virtual healthcare, and wearables to reshape how we deliver care,” said Dr Jessica Voerman, Chief Clinical Officer at SH Inc. Healthcare.

KEY TRENDS POISED TO DEFINE HEALTHCARE IN 2025

  1. RISING HEALTHCARE COSTS AND ACCESS CHALLENGES
    As we approach 2025, the escalation of healthcare costs is expected to persist, with medical aid contributions outpacing inflation and the general expense of healthcare services becoming increasingly burdensome. This growing financial pressure is placing significant strain not only on patients, but also on healthcare providers and the broader healthcare system. In response, identifying and implementing innovative solutions to alleviate this looming financial crisis remains a critical priority for healthcare businesses nationwide. For many South Africans, the rising cost of healthcare is exacerbating issues of accessibility and affordability, with an increasing number of individuals unable to access necessary medical care. In light of this, we anticipate a strong focus on policy reform aimed at addressing these inequalities. As such, addressing healthcare disparities will continue to be a central theme in the ongoing development of healthcare policies and initiatives in the coming years. 
  2. INCREASING DEMAND FOR MENTAL HEALTHCARE SERVICES
    One of the most prominent shifts anticipated in the healthcare landscape by 2025 is the significant rise in demand for mental healthcare services. The recognition that mental health is integral to overall well-being has led to a growing push to integrate mental health services into primary healthcare systems. Such integration is proving to be both preventative and curative, as early intervention can improve long-term outcomes. Furthermore, mental healthcare is particularly well-suited for the adoption of digital health tools, such as virtual consultations, which can enhance access to care, particularly in underserved or rural areas. The increased focus on mental health will likely continue to drive growth in this sector, as more individuals seek professional support to manage mental health challenges. 
  3. EXPANSION OF VIRTUAL HEALTHCARE
    The trend towards virtual healthcare is expected to continue its upward trajectory in 2025, as more patients turn to telemedicine as either a primary or supplementary means of accessing healthcare services. According to a McKinsey report, telemedicine is projected to account for more than 20% of outpatient consultations by 2025. This shift is expected to be particularly pronounced in areas such as primary healthcare, chronic disease management, dermatology, and mental healthcare. Virtual consultations offer patients the convenience of receiving care remotely, which can help to reduce barriers related to distance, time, and accessibility. For healthcare providers, virtual healthcare offers opportunities to streamline services, increase operational efficiency, and reach a broader patient population. 
  4. THE ROLE OF WEARABLES AND HEALTH DATA COLLECTION
    Wearable health technologies, including biosensors capable of monitoring, transmitting, and analysing vital signs, represent another exciting frontier in digital health. These devices have the potential to revolutionise the management of both acute and chronic conditions by providing continuous, real-time data that can inform clinical decision-making. With their ability to track everything from heart rate and blood glucose levels to oxygen saturation and sleep patterns, wearables offer unprecedented insights into an individual’s health status. This wealth of data has the potential to improve patient outcomes, empower individuals to take a more proactive role in managing their health, and help healthcare providers tailor interventions more precisely. As these technologies evolve, they will become an increasingly important tool in both disease prevention and management. 
  5. THE GROWING IMPACT OF ARTIFICIAL INTELLIGENCE (AI)
    Artificial intelligence (AI) continues to make significant strides in healthcare, particularly in areas such as clinical decision-making, diagnostics, and operational efficiency. AI algorithms have demonstrated their ability to improve the speed, accuracy, and reliability of diagnoses, enabling healthcare professionals to make more informed decisions. Furthermore, AI-driven tools are improving clinical workflows, optimizing resource allocation, and enhancing the overall patient experience. In the realm of surgery, robotic-assisted technologies are increasingly being used to improve the precision of procedures, reduce the risk of human error, and shorten recovery times for patients. Additionally, the use of virtual and augmented reality technologies in medical training and physical rehabilitation is gaining traction, offering immersive, interactive experiences that improve learning outcomes and accelerate recovery for patients.

Looking ahead to 2025, healthcare is set to evolve rapidly, driven by technological advancements and growing demand for accessible, affordable care. Key trends such as rising costs, expanded mental health access, virtual healthcare, wearable technologies, and artificial intelligence are reshaping the sector.

For businesses and policymakers, staying ahead of these changes is crucial to ensuring sustainable, equitable, and effective care. By embracing digital tools, AI, and data-driven solutions, the healthcare system can improve both patient outcomes and overall efficiency. Collaboration and innovation across all sectors will be essential to meeting the evolving needs of patients and society.

Resistance to Artemisinin Found in African Children with Severe Malaria

Photo by Ekamelev on Unsplash

Indiana University School of Medicine researchers, in collaboration with colleagues at Makerere University in Uganda, have uncovered evidence of partial resistance to artemisinin derivatives – the primary treatment for malaria – in young children with severe, or “complicated,” malaria. 

Earlier studies have shown partial resistance to artemisinin in children with uncomplicated malaria, but the new study, published in the Journal of the American Medical Association (JAMA), is the first to document such resistance in African children with well-defined signs of severe disease from malaria. 

“Artemisinin-based therapies have been quintessential in the fight against malaria for the past 20 years,” said corresponding author Chandy C. John, MD, professor of paediatrics at the IU School of Medicine. “Growing evidence of artemisinin partial resistance in African children with uncomplicated malaria has led to concerns that new therapies, like triple artemisinin combination therapies, may be needed in uncomplicated malaria. The findings of artemisinin partial resistance in children with severe or complicated malaria, as well as the findings of a high rate of recurrent malaria with current standard treatment in these areas, raise the question of whether new treatments are needed for severe malaria as well.”

Led by John and co-authors Ruth Namazzi, MBChB, MMEd, and Robert Opoka, MD, MPH, of Makerere University; Ryan Henrici, MD, PhD, of the University of Pennsylvania; and Colin Sutherland, PhD, MPH, of the London School of Hygiene & Tropical Medicine, the study examined 100 Ugandan children aged 6 months to 12 years who were undergoing treatment for severe malaria complications caused by Plasmodium falciparum, the deadly malaria parasite transmitted by mosquitoes.

In the study, 10 children had parasites with genetic mutations previously associated with artemisinin partial resistance. The most common mutation, which was seen in eight of these children, was associated with a longer parasite clearance half-life — the time it takes the parasite’s burden in the body to reach half of its initial level. The study also showed that 10% of children returned within 28 days of treatment with an infection from the same malaria strain they had during their original admission. These were all children who had received complete intravenous and then oral treatment for severe malaria, and all had cleared the parasite by microscopic examination. John said these findings suggest that the standard intravenous and oral treatment lowers the parasite level to where it cannot be detected by microscopy, but it does not completely eliminate the parasite in some children.   
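The parasite clearance half-life mentioned above can be estimated by fitting an exponential decay to serial parasite counts after treatment and taking t½ = ln(2)/k. The sketch below uses invented measurements; the estimators used in resistance studies are more elaborate, but the principle is the same.

```python
import numpy as np

# Minimal sketch of a parasite clearance half-life estimate: fit an exponential
# decay to serial parasite densities, then t_half = ln(2) / k.
# The measurements below are invented for illustration.
hours     = np.array([0, 6, 12, 18, 24])
parasites = np.array([100_000, 52_000, 26_500, 13_800, 7_100])  # parasites/uL, hypothetical

# Linear fit on the log scale: log(P) = log(P0) - k * t
slope, _ = np.polyfit(hours, np.log(parasites), 1)
half_life = np.log(2) / -slope
print(f"clearance half-life ~ {half_life:.1f} hours")
```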

Reports of artemisinin resistance first surfaced in Southeast Asia in 2008 before emerging in East Africa, a trend the IU research team unexpectedly observed through their ongoing work in Uganda. While studying why severe malaria develops in children, the researchers noticed slower responses to artemisinin in some of their Ugandan study participants, prompting the present study. 

“The study findings point to a need for more data on artemisinin resistance and recurrence of clinical malaria in children with severe malaria,” John said. “If our study findings are confirmed in other areas, that would suggest that treatment guidelines for severe malaria may require revision.”  

John presented the study’s results at the Annual Meeting of the American Society of Tropical Medicine and Hygiene on Nov. 14 in New Orleans, Louisiana.

Source: Indiana University

Antiseizure Drugs during Pregnancy may Affect Neurodevelopment

Photo by SHVETS production

Children whose mothers have taken antiseizure drugs during pregnancy are more likely than others to receive a neuropsychiatric diagnosis. This is according to a comprehensive study by researchers at Karolinska Institutet and elsewhere, published in Nature Communications. However, the researchers emphasise that the absolute risk is low.

Antiseizure drugs are used to treat epilepsy and to stabilise mood in certain psychiatric conditions. However, some of these drugs, such as valproate, are known to affect the foetus if used during pregnancy. 

The current study included data from over three million children in the UK and Sweden, 17 495 of whom had been exposed to antiseizure drugs during pregnancy. 

As expected, children exposed to valproate were more likely to be diagnosed with autism, intellectual disability or ADHD compared to children not exposed to antiseizure drugs. Children exposed to topiramate had a 2.5-fold increased risk of intellectual disability, while those exposed to carbamazepine had a 25 per cent increased risk of being diagnosed with autism and a 30 per cent increased risk of intellectual disability. 

No increased risk with lamotrigine 

However, the researchers found no evidence that taking the antiseizure drug lamotrigine during pregnancy increases the risk of neuropsychiatric diagnoses in the child. 

“Our findings suggest that while certain medications may pose some risk, lamotrigine may be a less risky option, but active monitoring of any antiseizure medication is critical to ensure safety and effectiveness, particularly during pregnancy,” says Brian K. Lee, Professor at Drexel University Dornsife School of Public Health, USA, and affiliated researcher at the Department of Global Public Health, Karolinska Institutet, Sweden. 

The researchers emphasise that the absolute risk of the child receiving a neuropsychiatric diagnosis is low and that there may also be risks associated with not taking antiseizure medication during pregnancy. 

“If you’re pregnant or trying to become pregnant, and taking one of these medications, it may be worth talking with your physician to make sure you’re taking the best medicine for your needs, while minimising risk to future children,” says Viktor H. Ahlqvist, researcher at the Institute of Environmental Medicine, Karolinska Institutet, and joint first author with Paul Madley-Dowd at the University of Bristol, UK. 

The results support previous findings from smaller studies that found links between antiseizure drugs during pregnancy and the risk of neuropsychiatric diagnoses in the child. One difference is that the new study found no statistically significant association between topiramate or levetiracetam and ADHD in the child. 

Source: Karolinska Institutet

Telltale Chemical in the Breath can Warn of Lung Cancer

Credit: Scientific Animations CC4.0

Exhaled breath contains chemical clues to what’s going on inside the body, including diseases like lung cancer. And devising ways to sense these compounds could help doctors provide early diagnoses — and improve patients’ prospects. In a study in ACS Sensors, researchers report the development of ultrasensitive, nanoscale sensors that in small-scale tests distinguished a key change in the chemistry of the breath of people with lung cancer.

Besides carbon dioxide, people also exhale other airborne compounds. Researchers have determined that declines in one exhaled chemical — isoprene — can indicate the presence of lung cancer. However, to detect such small shifts, a sensor would need to be highly sensitive, capable of detecting isoprene levels in the parts-per-billion (ppb) range. It would also need to differentiate isoprene from other volatile chemicals and withstand breath’s natural humidity. Previous attempts to engineer gas sensors with characteristics like these have focused on metal oxides, including one particularly promising compound made with indium oxide. A team led by Pingwei Liu and Qingyue Wang set out to refine indium oxide-based sensors to detect isoprene at the level at which it naturally occurs in breath.

The researchers developed a series of indium(III) oxide (In2O3)-based nanoflake sensors. In experiments, they found that one type, which they called Pt@InNiOx for the platinum (Pt), indium (In) and nickel (Ni) it contains, performed best. These Pt@InNiOx sensors:

  • Detected isoprene levels as low as 2 ppb, a sensitivity that far surpassed earlier sensors.
  • Responded to isoprene more than other volatile compounds commonly found in breath.
  • Performed consistently during nine simulated uses.

More importantly, the authors’ real-time analysis of the nanoflakes’ structure and electrochemical properties revealed that Pt nanoclusters uniformly anchored on the nanoflakes catalyzed the activation of isoprene sensing, leading to the ultrasensitive performance.

Finally, to showcase the potential medical use of these sensors, the researchers incorporated the Pt@InNiOx nanoflakes into a portable sensing device. Into this device they introduced breath collected earlier from 13 people, five of whom had lung cancer. The device detected isoprene levels lower than 40 ppb in samples from participants with cancer and more than 60 ppb in samples from cancer-free participants. This sensing technology could provide a breakthrough in non-invasive lung cancer screening and has the potential to improve outcomes and even save lives, the researchers say.
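Taken at face value, those readings suggest a simple screening rule. The toy sketch below applies the 40 ppb and 60 ppb figures reported above to some invented breath readings, purely to illustrate the decision logic rather than any validated diagnostic threshold.

```python
# Toy illustration of the screening logic implied by the reported readings:
# below ~40 ppb of exhaled isoprene was seen in participants with lung cancer,
# above ~60 ppb in cancer-free participants. Thresholds are from the article;
# the sample readings are invented, and this is not a validated diagnostic rule.
def classify_isoprene(ppb: float) -> str:
    if ppb < 40:
        return "flagged: isoprene in the range seen with lung cancer"
    if ppb > 60:
        return "isoprene in the cancer-free range"
    return "indeterminate"

for reading in (32.5, 48.0, 71.2):              # hypothetical breath samples, ppb
    print(f"{reading:5.1f} ppb -> {classify_isoprene(reading)}")
```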

Source: American Chemical Society