Category: Neurology

‘Healthy’ Vitamin B12 Levels not Enough to Ward off Neuro Decline

Created with Gencraft. CC4.0

Meeting the minimum requirement for vitamin B12, needed to make DNA, red blood cells and nerve tissue, may not actually be enough – particularly for older adults. It may even put them at risk for cognitive impairment, according to a study published in Annals of Neurology.

The research found that older, healthy volunteers with lower concentrations of B12 – still within the normal range – showed signs of neurological and cognitive deficiency. These levels were associated with more damage to the brain’s white matter – the nerve fibres that enable communication between areas of the brain – and with test scores indicating slower cognitive and visual processing speeds, compared to those with higher B12.

The UC San Francisco researchers, led by senior author Ari J. Green, MD, of the Departments of Neurology and Ophthalmology and the Weill Institute for Neurosciences, said that the results raise questions about current B12 requirements and suggest the recommendations need updating.

“Previous studies that defined healthy amounts of B12 may have missed subtle functional manifestations of high or low levels that can affect people without causing overt symptoms,” said Green, noting that clear deficiencies of the vitamin are commonly associated with a type of anaemia. “Revisiting the definition of B12 deficiency to incorporate functional biomarkers could lead to earlier intervention and prevention of cognitive decline.”

Lower B12 correlates with slower processing speeds, brain lesions

In the study, researchers enrolled 231 healthy participants without dementia or mild cognitive impairment, whose average age was 71. They were recruited through the Brain Aging Network for Cognitive Health (BrANCH) study at UCSF.

Their blood B12 amounts averaged 414.8pmol/L, well above the U.S. minimum of 148pmol/L. Adjusting for factors like age, sex, education and cardiovascular risks, the researchers looked at the biologically active component of B12, which provides a more accurate measure of the amount of the vitamin that the body can utilize. In cognitive testing, participants with lower active B12 showed slower processing speed, indicating subtle cognitive decline, and this effect was amplified by older age. They also showed significant delays in responding to visual stimuli, indicating slower visual processing speeds and generally slower brain conductivity.

MRIs revealed a higher volume of lesions in the participants’ white matter, which may be associated with cognitive decline, dementia or stroke.

While the study volunteers were older adults, who may have a specific vulnerability to lower levels of B12, co-first author Alexandra Beaudry-Richard, MSc, said that these lower levels could “impact cognition to a greater extent than what we previously thought, and may affect a much larger proportion of the population than we realize.” Beaudry-Richard is currently completing her doctorate in research and medicine at the UCSF Department of Neurology and the Department of Microbiology and Immunology at the University of Ottawa.

“In addition to redefining B12 deficiency, clinicians should consider supplementation in older patients with neurological symptoms even if their levels are within normal limits,” she said. “Ultimately, we need to invest in more research about the underlying biology of B12 insufficiency, since it may be a preventable cause of cognitive decline.”

Source: University of California – San Francisco

Scientists Discover Brain Mechanism that Helps Override Fear

Coronal brain slice showing projections from different visual areas in the cerebral cortex to the ventrolateral geniculate nucleus (vLGN). These pathways are part of the circuit identified as mediating the suppression of instinctive fear responses.

Researchers at the Sainsbury Wellcome Centre (SWC) at UCL have unveiled the precise brain mechanisms that enable animals to overcome instinctive fears. Published today in Science, the study in mice could have implications for developing therapeutics for fear-related disorders such as phobias, anxiety and post-traumatic stress disorder (PTSD).

The research team, led by Dr Sara Mederos and Professor Sonja Hofer, mapped out how the brain learns to suppress responses to perceived threats that prove harmless over time. 

“Humans are born with instinctive fear reactions, such as responses to loud noises or fast-approaching objects,” explains Dr Mederos, Research Fellow in the Hofer Lab at SWC. “However, we can override these instinctive responses through experience – like children learning to enjoy fireworks rather than fear their loud bangs. We wanted to understand the brain mechanisms that underlie such forms of learning.”

Using an innovative experimental approach, the team studied mice presented with an overhead expanding shadow that mimicked an approaching aerial predator. Initially, the mice sought shelter when encountering this visual threat. However, with repeated exposure and no actual danger, the mice learned to remain calm instead of escaping, providing researchers with a model to study the suppression of fear responses. 

Based on the lab’s previous work, the team knew that the ventrolateral geniculate nucleus (vLGN) could suppress fear reactions when active and could track an animal’s previous experience of threat. The vLGN also receives strong input from visual areas in the cerebral cortex, so the researchers explored whether this neural pathway plays a role in learning not to fear a visual threat. 

The study revealed two key components in this learning process: (1) specific regions of the visual cortex proved essential for learning, and (2) the vLGN stores these learning-induced memories.

“We found that animals failed to learn to suppress their fear responses when specific cortical visual areas were inactivated. However, once the animals had already learned to stop escaping, the cerebral cortex was no longer necessary,” explained Dr Mederos.

“Our results challenge traditional views about learning and memory,” notes Professor Hofer, senior author of the study. “While the cerebral cortex has long been considered the brain’s primary centre for learning, memory and behavioural flexibility, we found the subcortical vLGN and not the visual cortex actually stores these crucial memories. This neural pathway can provide a link between cognitive neocortical processes and ‘hard-wired’ brainstem-mediated behaviours, enabling animals to adapt instinctive behaviours.”

The researchers also uncovered the cellular and molecular mechanisms behind this process. Learning occurs through increased neural activity in specific vLGN neurons, triggered by the release of endocannabinoids – known to regulate mood and memory. This release decreases inhibitory input to vLGN neurons, resulting in heightened activity in this brain area when the visual threat stimulus is encountered, which suppresses fear responses. 

The implications of this discovery extend beyond the laboratory. “Our findings could also help advance our understanding of what is going wrong in the brain when fear response regulation is impaired in conditions such as phobias, anxiety and PTSD. While instinctive fear reactions to predators may be less relevant for modern humans, the brain pathway we discovered exists in humans too,” explains Professor Hofer. “This could open new avenues for treating fear disorders by targeting vLGN circuits or localised endocannabinoid systems.”

The research team is now planning to collaborate with clinical researchers to study these brain circuits in humans, with the hope of someday developing new, targeted treatments for maladaptive fear responses and anxiety disorders.

Source: Sainsbury Wellcome Centre

Epidural Steroid Injections for Chronic Back Pain

Photo by Cottonbro on Pexels

The American Academy of Neurology (AAN) has developed a new systematic review to summarise for neurologists and other clinicians the evidence for epidural steroid injections and whether they reduce pain and disability for people with certain kinds of chronic back pain. The systematic review is published online in Neurology®.

It updates a 2007 assessment by the AAN. With an epidural steroid injection, a steroid or corticosteroid medication is injected into the epidural space with the aim of helping reduce certain kinds of back pain.

“Chronic back pain is common and can negatively impact a person’s quality of life, making it difficult to move, sleep and participate in daily activities,” said author Carmel Armon, MD, of Loma Linda University School of Medicine in California and a Fellow of the American Academy of Neurology. “In our review, studies show epidural steroid injections may have limited efficacy. They may modestly reduce pain in some situations for up to three months and reduce disability for some people for up to six months or more.”

For the review, researchers analysed all available studies over a 16-year period. A total of 90 studies were examined. The review focused on the use of epidural steroid injections to reduce pain for people with radiculopathy and spinal stenosis. Radiculopathy is a condition caused by a pinched nerve in your spine. Spinal stenosis is a condition where the spinal cord or nerves have become compressed because the space around the spinal cord has become too small. For people with radiculopathy, the review says studies show epidural steroid injections may be effective at modestly reducing pain and disability for up to three months after the procedure.

When compared to people not receiving the treatment, 24% more people receiving the treatment reported reduced pain, and 16% more reported reduced disability for up to 3 months. The treatment may also reduce disability for up to six months or more, with 11% more of those treated reporting reduced disability. Most of the reviewed studies looked at people with radiculopathy in their lower backs, so it is unclear how effective the treatment is for those with radiculopathy in their necks. For people with spinal stenosis, studies show epidural steroid injections might modestly reduce disability for up to six months or more after the procedure.

When compared to people not receiving the treatment, 26% more people receiving the treatment reported reduced disability up to three months, and 12% more for up to six months or more. The treatment was not found to reduce pain for up to three months. All studies looked at people with stenosis in their lower backs, so researchers do not know how effective the treatment is for people with stenosis in their necks.

“Our review affirms the limited effectiveness of epidural steroid injections in the short term for some forms of chronic back pain,” said author Pushpa Narayanaswami, MD, of Beth Israel Deaconess Medical Center in Boston and a Fellow of the American Academy of Neurology. “We found no studies looking at whether repeated treatments are effective or examining the effect of treatment on daily living and returning to work. Future studies should address these gaps.”

Source: American Academy of Neurology

The Surprising Link between Muscle Signalling and Brain Memory

New research shows how a network of subcellular structures is responsible for transmitting signals in neurons. This movie shows 3D renderings of these structures in high-resolution 3D electron microscopy images of fruit fly neurons. The endoplasmic reticulum (green), plasma membrane (blue), mitochondria (pink), microtubules (tan), and ER-plasma membrane contacts (magenta) are segmented from FIB-SEM datasets of a Drosophila melanogaster MBON1 neuron. Credit: Benedetti et al.

New research led by the Lippincott-Schwartz Lab shows that a network of subcellular structures similar to those responsible for propagating molecular signals that make muscles contract are also responsible for transmitting signals in the brain that may facilitate learning and memory.

“Einstein said that when he uses his brain, it is like he is using a muscle, and in that respect, there is some parallel here,” says Janelia Senior Group Leader Jennifer Lippincott-Schwartz. “The same machinery is operating in both cases but with different readouts.” The research appears in the journal Cell.

The first clue about the possible connection between brain and muscle cells came when Janelia scientists noticed something strange about the endoplasmic reticulum, or ER – the membranous sheets and folds inside cells that are crucial for many cellular functions.

Research scientist Lorena Benedetti was tracking molecules at high resolution along the surface of the ER in mammalian neurons when she saw that the molecules were tracing a repeating, ladder-like pattern along the entire length of the dendrites.

Around the same time, Senior Group Leader Stephan Saalfeld alerted Lippincott-Schwartz to high-resolution 3D electron microscopy images of neurons in the fly brain where the ER was also forming regularly spaced, transversal structures.

This movie shows time-lapse high-resolution imaging in neurons, revealing the dynamic behavior of ER tubules contrasted with the persistence of ER-PM junctional sites over time. Time-lapse acquired using 2D lattice-SIM in burst mode of HaloTag-Sec61β (labeled with JF585 HaloTag-ligand) expressing neurons. Scale bars: 0.5 μm. Credit: Benedetti et al.

The ER normally appears like a huge, dynamic net, so as soon as Lippincott-Schwartz saw the structures, she knew her lab needed to figure out what they were for.

“In science, structure is function,” says Lippincott-Schwartz, who also heads Janelia’s 4D Cellular Physiology research area. “This is an unusual, beautiful structure that we are seeing throughout the whole dendrite, so we just had this feeling that it must have some important function.”

The researchers, led by Benedetti, started by looking at the only other area of the body known to have similar, ladder-like ER structures: muscle tissue. In muscle cells, the ER and the plasma membrane – the outer membrane of the cell – meet at periodic contact sites, an arrangement controlled by a molecule called junctophilin.

Using high-resolution imaging, the researchers discovered that dendrites also contain a form of junctophilin that controls contact sites between their ER and plasma membrane. Further, the team found that the same molecular machinery controlling calcium release at muscle cells’ contact sites – where calcium drives muscle contraction – was also present at dendrite contact sites – where calcium regulates neuronal signalling.

Because of these clues, the researchers had a hunch that the molecular machinery at the dendritic contact sites must also be important for transmitting calcium signals, which cells use to communicate. They suspected that the contact sites along the dendrites might act like a repeater on a telegraph machine: receiving, amplifying, and propagating signals over long distances. In neurons, this could explain how signals received at specific sites on dendrites are relayed to the cell body hundreds of micrometres away.  

“How that information travels over long distances and how the calcium signal gets specifically amplified was not known,” says Benedetti. “We thought that ER could play that role, and that these regularly distributed contact sites are spatially and temporally localised amplifiers: they can receive this calcium signal, locally amplify this calcium signal, and relay this calcium signal over a distance.”

The researchers found that this process is triggered when a neuronal signal causes calcium to enter the dendrite through voltage-gated ion channel proteins, which are positioned at the contact sites. Although this initial calcium signal dissipates quickly, it triggers the release of additional calcium from the ER at the contact site.

Source: Howard Hughes Medical Institute

Thrombolytic Drug Still Effective up to 24 Hours after Ischaemic Stroke Onset

Credit: American Heart Association

The thrombolytic medication, alteplase, improved stroke patients’ recovery by more than 50% when given up to 24 hours after the beginning of an ischaemic stroke, according to preliminary late-breaking science presented at the American Stroke Association’s International Stroke Conference 2025.

These results give hope to stroke patients worldwide who may not be able to access thrombolytic medications within the approved time window, which in China is within 4.5 hours, said the trial’s principal investigator Min Lou, MD, PhD, a professor at the Second Affiliated Hospital of Zhejiang University’s School of Medicine in China.

In the US, alteplase is approved to treat stroke within three hours of symptom onset and is recommended for use up to 4.5 hours for select patients. Other research has indicated it may also work well in some patients 4.5 to 9 hours after stroke onset.

The American Heart Association/American Stroke Association 2019 Guidelines for the Early Management of Patients with Acute Ischemic Stroke note that IV alteplase within 4.5 hours of stroke onset is the standard of care for most ischaemic stroke patients in the United States.

Researchers enrolled 372 stroke patients whose symptoms had begun 4.5 to 24 hours earlier. They used widely available CT perfusion imaging (advanced brain scanning) to confirm that these patients still had brain tissue that could recover with treatment. Participants were randomised to receive either alteplase or standard stroke care of antiplatelet therapy at the discretion of the investigator, based on the Chinese Guidelines for Diagnosis and Treatment of Acute Ischemic Stroke 2018. Functional recovery was assessed at 90 days.

“We believe these findings mean more people may return to normal or near-normal lives after a stroke, even if they receive treatment later than originally thought beneficial,” Lou said. “This method of treatment could become the new standard, especially in hospitals that use CT perfusion imaging. This technology helps health care professionals see how blood flows in different parts of the brain after an ischemic stroke. This could extend treatment eligibility to millions more patients across the globe.”

The study found:

  • 40% of participants treated with alteplase had little to no disability after 90 days, compared to 26% of those who received standard care – a 54% higher chance of functional recovery.
  • Less than 3% of participants in either group received rescue mechanical clot removal as an additional treatment.
  • Rates of death were the same (10.8%) for both groups.
  • The risk of brain bleeding was higher among those who received alteplase than among participants who did not (3.8% vs. 0.5%), but researchers believe this is a manageable risk.
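The “54% higher chance” in the first bullet is the relative difference between the two recovery rates; a quick arithmetic check of how the figures fit together (plain calculation, not additional study data):

```python
# Proportions with little to no disability at 90 days, from the bullets above.
alteplase_recovery = 0.40   # alteplase group
standard_recovery = 0.26    # standard-care group

# Relative increase in the chance of functional recovery.
relative_increase = alteplase_recovery / standard_recovery - 1
print(f"{relative_increase:.0%}")  # ~54% higher chance of recovery
```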

“We also need to look more closely at how safe and effective other clot-dissolving medications, like tenecteplase, are when given after a stroke, especially beyond the usual time frames. It’s also important to learn if our findings apply to other groups of people, especially in areas with different stroke risks and health care resources,” Lou explained.

Study limitations include that both participants and researchers knew which treatment was being given, which could have introduced bias, and that results may not be generalizable to patients outside of China.

Study design, background and details:

  • The study enrolled 372 stroke patients in a multicenter, prospective, randomized trial at 26 stroke centers in China.
  • The patients’ average age was 72 years, and 43% were women.
  • The trial used widely available CT perfusion imaging software to gauge salvageable brain tissue, making the findings more applicable to real-world clinical settings.
  • Enrolled patients were assigned to the alteplase group or a standard medical treatment group.
  • The primary outcome was a score of 0 or 1 on the modified Rankin scale, which scores disability from 0 (no symptoms) to 6 (death) at 90 days.
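The primary outcome above is a dichotomised modified Rankin scale score. As a minimal sketch of how such an endpoint is computed (the scores below are invented for illustration, not trial data):

```python
# Modified Rankin Scale (mRS): 0 = no symptoms ... 6 = death, assessed at 90 days.
# Hypothetical scores for a small group of patients.
mrs_scores = [0, 1, 2, 1, 4, 0, 3, 1, 6, 2]

# Primary outcome: proportion of patients scoring 0 or 1 (little to no disability).
good_outcome = sum(1 for s in mrs_scores if s <= 1) / len(mrs_scores)
print(f"{good_outcome:.0%}")  # proportion achieving the primary outcome
```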

Study co-authors, funding and disclosures are available in the abstract.

Source: American Heart Association

Progress and Challenges in the Development of Brain Implants

Pixabay CC0

In a paper recently published in The Lancet Digital Health, a scientific team led by Stanisa Raspopovic from MedUni Vienna looks at the progress and challenges in the research and development of brain implants. New achievements in the field of this technology are seen as a source of hope for many patients with neurological disorders and have been making headlines recently. As neural implants have an effect not only on a physical but also on a psychological level, researchers are calling for particular ethical and scientific care when conducting clinical trials.

The research and development of neuroprostheses has entered a phase in which experiments on animal models are being followed by tests on humans. Only recently, reports of a paraplegic patient in the USA who was implanted with a brain chip as part of a clinical trial caused a stir. With the help of the implant, the man can control his wheelchair, operate the keyboard on his computer and use the cursor in such a way that he can even play chess. About a month after the implantation, however, the patient realised that the precision of the cursor control was decreasing and that the delay between his thoughts and the computer’s actions was increasing.

“The problem could be partially, but not completely, resolved – and illustrates just one of the potential challenges for research into this technology,” explains study author Stanisa Raspopovic from MedUni Vienna’s Center for Medical Physics and Biomedical Engineering, who published the paper together with Marcello Ienca (Technical University of Munich) and Giacomo Valle (ETH Zurich). “The questions of who will take care of the technical maintenance after the end of the study and whether the device will still be available to the patient at all after the study has been cancelled or completed are among the many aspects that need to be clarified in advance in neuroprosthesis research and development, which is predominantly industry-led.”

Protection of highly sensitive data

Neuroprostheses establish a direct connection between the nervous system and external devices and are considered a promising approach in the treatment of neurological impairments such as paraplegia, chronic pain, Parkinson’s disease and epilepsy. The implants can restore mobility, alleviate pain or improve sensory functions. However, as they form an interface to the human nervous system, they also have an effect on a psychological level: “They can influence consciousness, cognition and affective states and even free will. This means that conventional approaches to safety and efficacy assessment, such as those used in clinical drug trials, are not suitable for researching these complex systems. New models are needed to comprehensively evaluate the subjective patient experience and protect the psychological privacy of the test subjects,” Raspopovic points out.

The special technological features of neuroimplants, in particular the ability to collect and process neuronal data, pose further challenges for clinical validation and ethical oversight. Neural data is considered particularly sensitive and requires an even higher level of protection than other health information. Unsecured data transmission, inadequate data protection guidelines and the risk of hacker attacks are just some of the potential vulnerabilities that require special precautions in this context. “The use of neural implants cannot be reduced to medical risks,” summarises Stanisa Raspopovic. “We are only in the initial phase of clinical studies on these technological innovations. But questions of ethical and scientific diligence in dealing with this highly sensitive topic should be clarified now and not only after problems have arisen in test subjects or patients.”

Source: Medical University of Vienna

Could the Key to IBS Treatment Lie in the Brain?

Irritable bowel syndrome. Credit: Scientific Animations CC4.0

Although irritable bowel syndrome (IBS) affects about a tenth of the global population, the underlying causes and mechanisms of IBS remain unclear and thus treatments focus on symptom management. At Tokyo University of Science (TUS), Japan, Professor Akiyoshi Saitoh and his research group have spent the past decade exploring this topic. This study, published in the British Journal of Pharmacology, discovered that a class of drugs called opioid delta-receptor (DOP) agonists may help alleviate IBS symptoms by targeting the central nervous system rather than acting directly on the intestine.

One of the main motivations for this study was the growing evidence closely linking IBS to psychological stress. Saitoh’s group aimed to address this potential root cause by developing a novel animal model of the condition. In a 2022 study, they created a mouse model of repeated psychological stress – using a method called chronic vicarious social defeat stress (cVSDS) – that developed symptoms similar to diarrhoea-predominant IBS (IBS-D). These symptoms included overly active intestines and heightened sensitivity to abdominal pain, even though the organs showed no physical damage. The cVSDS model involved having the subject mouse repeatedly witness a territorial, aggressive mouse defeating a cage mate, inducing indirect chronic stress.

Using the cVSDS model, the researchers sought to determine whether DOP in the brain, which is closely linked to pain and mood regulation, could serve as promising drug targets for treating stress-induced IBS. To achieve this, they performed a series of detailed experiments to observe the effects of DOP agonists on IBS symptoms and chemical signaling in the brain. Some experiments involved measuring the speed of a charcoal meal through the intestine to assess gastrointestinal motility and evaluate the impact of stress or treatments on bowel movement speed, along with directly measuring neurotransmitter concentrations using in vivo brain microdialysis. This revealed that re-exposure to VSDS increased glutamate levels in the insular cortex, but these elevated levels were normalised with DOP agonists.

According to the results, the administration of DOP agonists helped relieve abdominal pain and regulated bowel movements in cVSDS mice. Interestingly, applying the DOP agonists directly to a specific brain region called the insular cortex had similar effects on IBS symptoms as systemic treatment. “Our findings demonstrated that DOP agonists acted directly on the central nervous system to improve diarrhoea-predominant IBS symptoms in mice, and suggest that the mechanism of action involves the regulation of glutamate neurotransmission in the insular cortex,” highlights Saitoh.

Taken together, the continued research by Saitoh’s group on this topic could pave the way for effective treatments for IBS. “DOP agonists could represent a groundbreaking new IBS treatment that not only improves IBS-like symptoms but also provides anti-stress and emotional regulation effects. In the future, we would like to conduct clinical developments with the goal of expanding the indication of DOP agonists for IBS, in addition to depression,” remarks Saitoh.

Compared to currently available IBS treatments, such as laxatives, antidiarrhoeals, analgesics, and antispasmodics, targeting the underlying stress with DOP agonists may offer a more definitive solution with minimal adverse effects. Further clarification of the roles of stress and brain chemistry in the development of IBS will be essential to achieving this much-needed medical breakthrough. Future studies will aim to translate Saitoh’s group’s findings to humans, potentially bringing relief to those affected by IBS.

Source: Tokyo University of Science

New Tech could Cut Epilepsy Misdiagnoses by up to 70% Using Routine EEGs

Source: Pixabay

Doctors could soon reduce epilepsy misdiagnoses by up to 70% using a new tool that turns routine electroencephalogram, or EEG, tests that appear normal into highly accurate epilepsy predictors, a Johns Hopkins University study has found.

By uncovering hidden epilepsy signatures in seemingly normal EEGs, the tool could significantly reduce false positives, seen in around 30% of cases globally, and spare patients from medication side effects, driving restrictions, and other quality-of-life challenges linked to misdiagnoses.

“Even when EEGs appear completely normal, our tool provides insights that make them actionable,” said Sridevi V. Sarma, a Johns Hopkins biomedical engineering professor who led the work. “We can get to the right diagnosis three times faster because patients often need multiple EEGs before abnormalities are detected, even if they have epilepsy. Accurate early diagnosis means a quicker path to effective treatment.”

A report of the study is newly published in Annals of Neurology.

Epilepsy causes recurrent, unprovoked seizures triggered by bursts of abnormal electrical activity in the brain. Standard care involves scalp EEG recordings during initial evaluations. These tests track brainwave patterns using small electrodes placed on the scalp.

Clinicians partly rely on EEGs to diagnose epilepsy and decide whether patients need anti-seizure medications. However, EEGs can be challenging to interpret because they capture noisy signals and because seizures rarely occur during the typical 20 to 40 minutes of an EEG recording. These characteristics make diagnosing epilepsy subjective and prone to error, even for specialists, Sarma explained.

To improve reliability, Sarma’s team studied what happens in the brains of patients when they are not experiencing seizures. Their tool, called EpiScalp, uses algorithms trained on dynamic network models to map brainwave patterns and identify hidden signs of epilepsy from a single routine EEG.

“If you have epilepsy, why don’t you have seizures all the time? We hypothesized that some brain regions act as natural inhibitors, suppressing seizures. It’s like the brain’s immune response to the disease,” Sarma said.

The new study analyzed 198 patients from five major medical centers: Johns Hopkins Hospital, Johns Hopkins Bayview Medical Center, University of Pittsburgh Medical Center, University of Maryland Medical Center, and Thomas Jefferson University Hospital. Of these 198 patients, 91 had epilepsy, while the rest had non-epileptic conditions mimicking epilepsy.

When Sarma’s team reanalysed the initial EEGs using EpiScalp, the tool ruled out 96% of those false positives, cutting potential misdiagnoses among these cases from 54% to 17%.
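The headline figure of “up to 70%” follows from this drop in the misdiagnosis rate; a quick arithmetic check using the percentages reported here:

```python
# Misdiagnosis rates among the reanalysed cases, before and after EpiScalp.
before, after = 0.54, 0.17

# Relative reduction in misdiagnoses.
relative_reduction = (before - after) / before
print(f"{relative_reduction:.0%}")  # ~69%, i.e. "up to 70%"
```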

“This is where our tool makes a difference because it can help us uncover markers of epilepsy in EEGs that appear uninformative, reducing the risk of patients being misdiagnosed and treated for a condition they don’t have,” said Khalil Husari, co-senior author and assistant professor of neurology at Johns Hopkins. “These patients experienced side effects of the anti-seizure medication without any benefit because they didn’t have epilepsy. Without the correct diagnosis, we can’t find out what’s actually causing their symptoms.”

In certain cases, misdiagnosis happens due to misinterpretation of EEGs, Husari explained, as doctors may overdiagnose epilepsy to prevent the dangers of a second seizure. But in some cases, patients experience nonepileptic seizures, which mimic epilepsy. These conditions can often be treated with therapies that do not involve epilepsy medication.

In earlier work, the team studied epileptic brain networks using intracranial EEGs to demonstrate that the seizure onset zone is being inhibited by neighboring regions in the brain when patients are not seizing. EpiScalp builds on this research, identifying these patterns from routine scalp EEGs.

Traditional approaches to improve EEG interpretation often focus on individual signals or electrodes. Instead, EpiScalp analyses how different regions of the brain interact and influence one another through a complex network of neural pathways, said Patrick Myers, first author and doctoral student in biomedical engineering at Johns Hopkins.

“If you just look at how nodes are interacting with each other within the brain network, you can find this pattern of independent nodes trying to cause a lot of activity and the suppression from nodes in a second region, and they’re not interacting with the rest of the brain,” Myers said. “We check whether we can see this pattern anywhere. Do we see a region in your EEG that has been decoupled from the rest of the brain’s network? A healthy person shouldn’t have that.”

Source: Johns Hopkins University

Elevated Opioid Neurotransmitter Activity Seen in Patients with Anorexia

Photo from Freepik.

A study conducted at Turku PET Centre in Finland and published in Molecular Psychiatry showed that changes in the functioning of opioid neurotransmitters in the brain may underlie anorexia.

Anorexia nervosa is a serious psychiatric disorder characterised by restricted eating, fear of gaining weight, and body image disturbances, which may lead to severe malnutrition, depression and anxiety. This new study from Turku PET Centre, published in Molecular Psychiatry, shows how changes in neurotransmitter function in the brain may underlie anorexia.

“Opioid neurotransmission regulates appetite and pleasure in the brain. In patients with anorexia nervosa, the brain’s opioidergic tone was elevated in comparison with healthy control subjects. Previously we have shown that in obese patients the tone of this system is lowered. It is likely that the actions of these molecules regulate both the loss and increase in appetite,” says Professor Pirjo Nuutila from the University of Turku.

Number of opioid receptors in the brain (top row) and sugar intake (bottom row) in patients with anorexia nervosa. Credit: University of Turku

In addition, the researchers measured the brain’s glucose uptake. The brain accounts for about 20% of the body’s total energy consumption, so the researchers were interested in how a reduction in energy intake affects the brain’s energy balance in anorexia.

“The brains of patients with anorexia nervosa used a similar amount of glucose as the brains of the healthy control subjects. Although being underweight burdens physiology in many ways, the brain tries to protect itself and maintain its ability to function for as long as possible,” says Professor Lauri Nummenmaa from Turku PET Centre and continues:

“The brain regulates appetite and feeding, and changes in brain function are associated with both obesity and low body weight. Since changes in opioid activity in the brain are also connected to anxiety and depression, our findings may explain the emotional symptoms and mood changes associated with anorexia nervosa.”

Source: University of Turku

Nerve Stimulation Fails When the Brain is not ‘Listening’

A small device worn on the body can stimulate the nervous system via electrodes on the ear. Credit: Vienna University of Technology.

Various diseases can be treated by stimulating the vagus nerve in the ear with electrical signals, but the results can be ‘hit or miss’. A study recently published in Frontiers in Physiology has now shown that the electrical signals must be synchronised with the body’s natural rhythms – heartbeat and breathing.

Health problems ranging from chronic pain and inflammation to neurological diseases can be treated by nerve stimulation, for example with electrodes attached to the ear that activate the vagus nerve. This method is sometimes referred to as an ‘electric pill’.

However, vagus nerve stimulation does not always work the way it is supposed to. A study conducted by TU Wien (Vienna) in cooperation with the Vienna Private Clinic now shows how this can be improved: Experiments demonstrate that the effect is very good when the electrical stimulation is synchronised with the body’s natural rhythms – the actual heartbeat and breathing.

The ‘electric pill’ for the parasympathetic nervous system

The vagus nerve plays an important role in the body: it is the longest nerve of the parasympathetic nervous system, the part of the nervous system that precisely controls the internal organs and blood circulation and is responsible for recovery and for building up the body’s reserves. A branch of the vagus nerve also leads from the brain directly into the ear, which is why small electrodes in the ear can be used to activate the vagus nerve, stimulate the brain and thus influence various functions of the body.

“However, it turns out that this stimulation does not always produce the expected results,” says Prof Eugenijus Kaniusas from the Institute of Biomedical Electronics at TU Wien. “The electrical stimulation does not have an effect on the nervous system at all times. You could say that the brain is just not always listening. It’s as if there is a gate to the control centre of the nervous system that is sometimes open and then closed again, and this can change in less than a second.”

In a pilot study, five people were examined: their vagus nerves were electrically stimulated to lower their heart rate. Previous studies have already shown that heart rate is a potential indicator of whether stimulation therapy is beneficial.

The timing of stimulation relative to the heartbeat proved decisive. If the vagus nerve is stimulated at a rhythm that is not synchronised with the heartbeat, hardly any effect is observed. If, however, the stimulation pulses are applied while the heart is contracting (during systole), the effect is strong – much stronger than when stimulation is applied during the heart’s relaxation phase, diastole.

Breathing is also important in this context: the stimulation was significantly more effective during the inhalation phase than during the exhalation phase.
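The timing rule the study found can be sketched as a toy gate (a hypothetical helper for illustration, not the researchers’ actual control software): deliver a stimulation pulse only when the cardiac phase, e.g. from an ECG R-wave detector, indicates systole and the respiratory phase, e.g. from a chest belt, indicates inhalation.

```python
def should_stimulate(cardiac_phase, respiratory_phase):
    """Gate stimulation pulses to the windows the study found most effective.

    cardiac_phase: 'systole' or 'diastole'
    respiratory_phase: 'inhale' or 'exhale'
    """
    return cardiac_phase == "systole" and respiratory_phase == "inhale"

# A pulse generator would poll this gate before emitting each pulse:
print(should_stimulate("systole", "inhale"))   # True: deliver pulse
print(should_stimulate("diastole", "inhale"))  # False: withhold pulse
```

In a real device the phases would have to be estimated continuously from ECG and respiration sensors, and the gate re-evaluated on a sub-second timescale, since, as Kaniusas notes, the brain’s “open gate” can close again in less than a second.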

“Our results show that synchronising vagus nerve stimulation with the heartbeat and breathing rhythm significantly increases effectiveness. This could help to improve the success of treatment for chronic illnesses, especially for those who have not previously responded to this therapy for reasons that are as yet unexplained,” says Eugenijus Kaniusas.

Larger clinical studies to follow

If nerve stimulation can be electronically customised to the body’s own rhythms at any given moment, it should be possible to achieve significantly greater success than has been possible to date. Future studies should examine larger, clinically relevant patient groups and develop even more precise algorithms to tailor the stimulation even more closely to individual needs.

“This technology could be an effective and non-invasive way of modulating the autonomic nervous system in a targeted and gentle manner – a potential milestone in the neuromodulatory treatment of various chronic diseases,” believes Dr Joszef Constantin Szeles from the Vienna Private Clinic.

Source: Vienna University of Technology