Tag: cognition

Why Do Some Parkinson’s Disease Treatments Affect Decision-Making?

Created with Gencraft. CC4.0

Parkinson’s disease, a debilitating nervous system disorder, is treated with medications that sometimes cause impaired decision-making and poor impulse control. Now, researchers from Fujita Health University in Japan have identified a structure in the brain called the external globus pallidus, which may be responsible for this side effect, paving the way for new treatments.

Parkinson’s disease (PD), also known simply as Parkinson’s, is a disorder of the nervous system that affects millions of people worldwide. The nerve cell damage associated with Parkinson’s can cause tremors, slowed movements, problems with balance, and many other symptoms which worsen gradually over time. Although there is no cure, there are medications available that can treat PD symptoms. Some of these medications, however, have previously unexplained side effects – including impaired decision-making that leads to potentially harmful behaviours such as pathological gambling, binge eating and compulsive shopping.

Now, in a study published in the International Journal of Molecular Sciences, researchers at Fujita Health University in Japan, led by Assistant Professor Hisayoshi Kubota from the Division of Behavioral Neuropharmacology, International Center for Brain Science (ICBS), have investigated the mechanism by which a drug called pramipexole (PPX) impairs decision-making in mice with Parkinson’s disease. The research was co-authored by Professor Taku Nagai from the same division and Professor Hirohisa Watanabe from the Department of Neurology, School of Medicine, both at Fujita Health University.

To take a closer look at the findings of this study, we first need to understand how PPX works to alleviate PD symptoms. PD mainly results from a loss of nerve cells, or neurons, that produce a compound called dopamine. Some neurons depend on dopamine for their regular functioning – they have structures called ‘dopamine receptors’, which can be thought of as locks that dopamine opens as the ‘key’. Drugs like PPX can imitate dopamine and bind to these receptors instead, which is especially useful in patients with PD who have lost dopamine-producing neurons.

To study the effects of PPX on PD, the researchers injected the brains of mice with a toxin called 6-hydroxydopamine (6-OHDA), which damages neurons in a manner very similar to that observed in the brains of patients with PD. The mice were treated with PPX and then subjected to a touchscreen-based ‘gambling task’ to test their decision-making skills. Interestingly, these mice picked the high-risk/high-reward option much more often – they opted for a disadvantageous outcome in which they received a large reward (of strawberry milkshake) that also came with an increased risk of a large punishment: exposure to flashing lights.
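To make the term ‘disadvantageous’ concrete, here is a quick expected-value comparison sketched in Python. All payoffs, probabilities, and time-out durations are hypothetical placeholders, not the parameters of the actual touchscreen task; the point is only that a larger reward can still yield a lower overall reward rate once the punishment is factored in.

```python
# Illustrative expected-value comparison for a rodent gambling task.
# All numbers are hypothetical; they are not the study's task parameters.

def expected_reward_rate(reward, p_win, punishment_timeout, trial_time):
    """Average reward per second, counting punishment time-outs as lost time."""
    avg_reward = reward * p_win
    avg_time = trial_time + (1 - p_win) * punishment_timeout
    return avg_reward / avg_time

# Safe option: small reward, rare and short punishment.
safe = expected_reward_rate(reward=1, p_win=0.9, punishment_timeout=5, trial_time=10)
# Risky option: large reward, but frequent and long punishment (e.g. a flashing-light time-out).
risky = expected_reward_rate(reward=4, p_win=0.4, punishment_timeout=40, trial_time=10)

print(f"safe option:  {safe:.3f} rewards/s")   # ~0.086
print(f"risky option: {risky:.3f} rewards/s")  # ~0.047 - lower despite the bigger payoff
```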

But which part of the brain is responsible for this behaviour? Investigating the brains of mice treated with PPX revealed that a region deep inside the brain called the external globus pallidus (GPe) was hyperactivated, or showed a much higher level of neuron activity. The researchers then chemically inhibited the neurons in the GPe, which actually reduced disadvantageous risk-taking activity in the mice. This proved that hyperactivation of the GPe was indeed responsible for poor decision-making in the mice treated with PPX.

This study has huge implications for treating patients with Parkinson’s disease. “Our findings could lead to the development of new medications or interventions that specifically target the external globus pallidus,” explains Dr. Kubota. “This would help to prevent or reduce decision-making impairments in Parkinson’s disease patients.”

Besides helping medical professionals develop better treatments for Parkinson’s disease, these findings can also help improve awareness among affected patients, their families, and the general public. Dr. Kubota explains that “Investigating how Parkinson’s disease medications affect decision-making will help the public to better understand the complexity of the disease and its treatment.” He adds, “This will benefit patients, their families and carers, and motivate them to consider early care and preventive strategies.”

These findings shed new light on the complex processes in the brain that aid our everyday decision-making skills, and promise to improve quality of life for patients affected by Parkinson’s disease. Maybe we can take away some important lessons from this study as well, and think twice before we indulge in poor decision-making in our daily lives!

Intracranial EEG Captures Neurons Resonating as They Turn Words into Thoughts

The lines on this diagram of the brain represent connections between various areas of the cerebral cortex involved in language processing. When we read, the neurons in these areas fire in precise synchronicity, a phenomenon known as “co-rippling.” Photo credit: UC San Diego Health Sciences

Researchers at University of California San Diego School of Medicine have brought us closer to understanding how the brain integrates information from specialised areas into a coherent whole. By delving into the brain with intracranial electroencephalography, they observed how neurons synchronise across the human brain while reading. The findings are published in Nature Human Behaviour and are also the basis of a thesis by UC San Diego School of Medicine doctoral candidate Jacob Garrett.

“How the activity of the brain relates to the subjective experience of consciousness is one of the fundamental unanswered questions in modern neuroscience,” said study senior author Eric Halgren, Ph.D., professor in the Departments of Neurosciences and Radiology at UC San Diego School of Medicine. “If you think about what happens when you read text, something in the brain has to turn that series of lines into a word and then associate it with an idea or an object. Our findings support the theory that this is accomplished by many different areas of the brain activating in sync.”

This synchronisation of different brain areas, called “co-rippling”, is thought to be essential for binding different pieces of information together to form a coherent whole. In rodents, co-rippling has been observed in the hippocampus, the part of the brain that encodes memories. In humans, Halgren and his colleagues previously observed that co-rippling also occurs across the entire cerebral cortex.

To examine co-rippling at the mechanistic level, Ilya Verzhbinsky, an MD/PhD candidate completing his research in Halgren’s lab, led a study published in PNAS that looked at what happens to single neurons firing in different cortical areas during ripples. The present study looks at the phenomenon with a wider lens, asking how the many billions of neurons in the cortex are able to coordinate this firing to process information.

“There are 16 billion neurons in the cortex – double the number of people on Earth,” said Halgren. “In the same way a large chorus needs to be organised to sound as a single entity, our brain neurons need to be coordinated to produce a single thought or action. Co-rippling is like neurons singing on pitch and in rhythm, allowing us to integrate information and make sense of the world. Unless they’re co-rippling, these neurons have virtually no effect on each other, but once ripples are present about two thirds of neuron pairs in the cortex become synchronised. We were surprised by how powerful the effect was.”

Co-rippling in the cortex has been difficult to observe in humans due to the limitations of noninvasive brain scanning. To work around this problem, the researchers used intracranial electroencephalography (EEG), which measures the electrical activity of the brain from inside the skull. The team studied a group of 13 patients with drug-resistant epilepsy who were already undergoing EEG monitoring as part of their care.

Participants were shown a series of animal names interspersed with strings of random consonants or nonsense fonts and then asked to press a button to indicate the animal whose name they saw. The researchers observed three stages of cognition during these tests: an initial hierarchical phase in visual areas of the cortex in which the participant could see the word without conscious understanding of it; a second stage in which this information was “seeded” with co-ripples into other areas of the cortex involved in more complex cognitive functions; and a final phase, again with co-ripples, where the information across the cortex is integrated into conscious knowledge and a behavioural response – pressing the button.

The researchers found that throughout the exercise, co-rippling (~100 ms-long, ~90 Hz oscillations) occurred between the various parts of the brain engaged in these cognitive stages, but the rippling was stronger when the participants were reading real words.
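As a rough computational picture of what a ‘co-ripple’ is, the sketch below band-pass filters two synthetic channels around 90 Hz, flags samples where the filtered envelope exceeds a threshold, and counts the time both channels ripple at once. The sampling rate, frequency band, and threshold here are assumptions for illustration only, not the detection criteria used in the published study.

```python
# Minimal co-ripple detection sketch on two synthetic iEEG channels.
# Assumed parameters (not the study's): 1000 Hz sampling, 70-110 Hz band,
# envelope threshold of 3 standard deviations above the mean.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

fs = 1000  # sampling rate in Hz
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(0)

# Two noisy channels sharing a brief ~90 Hz burst around t = 5 s (a "co-ripple").
burst = np.exp(-((t - 5.0) ** 2) / (2 * 0.05 ** 2)) * np.sin(2 * np.pi * 90 * t)
ch1 = rng.normal(0, 1, t.size) + 5 * burst
ch2 = rng.normal(0, 1, t.size) + 5 * burst

def ripple_mask(x, fs, band=(70, 110), n_sd=3):
    """Boolean mask of samples where the band-limited envelope exceeds n_sd SDs."""
    b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    envelope = np.abs(hilbert(filtfilt(b, a, x)))
    return envelope > envelope.mean() + n_sd * envelope.std()

m1, m2 = ripple_mask(ch1, fs), ripple_mask(ch2, fs)
co = m1 & m2  # samples where both channels ripple simultaneously
print(f"co-rippling time: {co.sum() / fs * 1000:.0f} ms")
```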

The study’s findings have potential long-term implications for the treatment of neurological and psychiatric disorders, such as schizophrenia, which are characterised by disruptions in these information integration processes.

“It will be easier to find ways to reintegrate the mind in people with these disorders if we can better understand how minds are integrated in typical, healthy cases,” added Halgren.

More broadly, the study’s findings have significant implications for our understanding of the link between brain function and human experience.

“This is a fundamental question of human existence and gets at the heart of the relationship between mind and brain,” said Halgren. “By understanding how our brain’s neurons work together, we can gain new insights into the nature of consciousness itself.”

Source: University of California San Diego

What Happens When the Brain Loses a Hub?

Photo by Jafar Ahmed on Unsplash

A University of Iowa-led team of international neuroscientists have obtained the first direct recordings of the human brain in the minutes before and after a brain hub crucial for language meaning was surgically disconnected. The results reveal the importance of brain hubs in neural networks and the remarkable way in which the human brain attempts to compensate when a hub is lost, with immediacy not previously observed. The findings were reported recently in the journal Nature Communications.

Hubs are critical for connectivity

The human brain has hubs – the intersection of many neuronal pathways that help coordinate brain activity required for complex functions like understanding and responding to speech. But debate has raged as to whether highly interconnected brain hubs are irreplaceable for certain brain functions. By some accounts the brain, as an already highly interconnected neural network, can in principle immediately compensate for the loss of a hub, in the same way that traffic can be redirected around a blocked-off city centre.

With a rare experimental opportunity, the UI neurosurgical and research teams led by Matthew Howard III, MD, professor and DEO of neurosurgery, and Christopher Petkov, PhD, professor and vice chair for research in neurosurgery, have achieved a breakthrough in understanding the necessity of a single hub. By obtaining evidence for what happens when a hub required for language meaning is lost, the researchers showed both the intrinsic importance of the hub as well as the remarkable and rapid ability of the brain to adapt and at least partially attempt to immediately compensate for its loss.

Evaluating the impact of losing a brain hub

The study was conducted during surgical treatment of two patients with epilepsy. Both patients were undergoing procedures that required surgical removal of the anterior temporal lobe – a brain hub for language meaning – to allow the neurosurgeons access to a deeper brain area causing the patients’ debilitating epileptic seizures. Before this type of surgery, neurosurgery teams often ask the patients to perform speech and language tasks in the operating room as the team uses implanted electrodes to record activity from parts of the brain close to and distant from the planned surgery area. These recordings help the clinical team effectively treat the seizures while limiting the impact of the surgery on the patient’s speech and language abilities.

Typically, the recording electrodes are no longer needed after the surgical resection and are removed. The innovation in this study was that the neurosurgery team was able to safely complete the procedure with the recording electrodes left in place, or returned to the same locations afterwards. This made it possible to obtain rare pre- and post-operative recordings, allowing the researchers to evaluate signals from brain areas far away from the hub, including speech and language areas distant from the surgery site. Analysis of the change in responses to speech sounds before and after the loss of the hub revealed a rapid disruption of signaling and a subsequent partial compensation by the broader brain network.

“The rapid impact on the speech and language processing regions well removed from the surgical treatment site was surprising, but what was even more surprising was how the brain was working to compensate, albeit incompletely within this short timeframe,” says Petkov, who also holds an appointment at Newcastle University Medical School in the UK.

The findings disprove theories that challenge the necessity of specific brain hubs, showing that the hub was essential for maintaining normal language processing in the brain.

“Neurosurgical treatment and new technologies continue to improve the treatment options provided to patients,” says Howard, who also is a member of the Iowa Neuroscience Institute.

“Research such as this underscores the importance of safely obtaining and comparing electrical recordings pre and post operatively, particularly when a brain hub might be affected.”

According to the researchers, the observation of the immediate impact on a neural network and its rapid attempt to compensate provides evidence in support of a brain theory proposed by Professor Karl Friston at University College London, which posits that any self-organising system at equilibrium works towards orderliness by minimising its free energy, resisting the universal tendency towards disorder.

These neurobiological results following human brain hub disconnection were consistent with several predictions of this and related neurobiological theories, showing how the brain works to try to regain order after the loss of one of its hubs.

Source: University of Iowa Health Care

Treatment Can Prevent Brain Impacts of Neonatal Hypoglycaemia

Photo by Jonathan Borba on Unsplash

Long-term brain damage resulting from neonatal hypoglycaemia can be warded off with proper treatment, such as dextrose gel after birth, and supported by rich educational experiences later in childhood, new studies have found.

The study is the first of its kind to show that stabilising blood sugar levels in neonatal hypoglycaemia prevents brain damage.

Hypoglycaemia is very common, affecting more than one in six babies. Since glucose is the main energy source for the brain and the body, untreated low blood sugar can cause adverse effects on a child’s neurodevelopment up to the age of 4.5 years.

While hypoglycaemia is known to alter early development, there has been a significant gap in our understanding of how hypoglycaemia can alter a child’s development after early childhood. A study in JAMA investigated the long-term impact on brain development in mid-childhood – ages 9 to 10 – and found that, compared to peers, there was no significant difference in academic outcomes for children exposed to hypoglycaemia as newborns.

“Rich pre-school and school experiences may help a child’s brain to re-organise and improve their academic abilities up to the developmental milestones of their peers,” said Professor Ben Thompson, who is part of the research team.

Following 480 children born at risk of neonatal hypoglycaemia, researchers assessed each child at age nine to 10 in five key areas: academic achievement, executive function, visual-motor function, psychosocial adaptation, and general health. All child participants were involved in previous studies, providing researchers with information on their neurodevelopmental outcomes at ages two and 4.5 years.

This ability to catch up in neurocognitive function could be due to the brain’s plasticity, the researchers suggest.

“It’s a big relief to know that babies who are born with and treated for a condition as common as hypoglycaemia are not likely to suffer long-term brain damage,” Prof Thompson said.

The researchers have also continued studying the efficacy of dextrose gel to treat low blood sugar in the first 48 hours of a newborn’s life, avoiding the need for babies to go to newborn intensive care units immediately after delivery.

In an additional study published in JAMA, the team assessed the later risks of dextrose gel as a treatment for hypoglycaemia in infancy, and found no change to the risk of neurosensory impairment at age two. This treatment continues to be widely used in a growing number of countries, including Canada, Australia, the United Kingdom and the United States.

Source: University of Waterloo

Brain Surgeons versus Rocket Scientists: Who’s Brainier?

Source: Sammy Williams on Unsplash

A light-hearted research article published in the Christmas edition of the BMJ sought to settle once and for all who is ‘brainier’: brain surgeons or rocket scientists.

Brain surgeons and rocket scientists are often put on a pedestal as the exemplars of intellectual endeavour. But which of them is smarter and deserves the accolade more? Or at all? A group of neurosurgeons – who were, of course, totally unbiased – decided to resolve this conundrum.

Delving into the background of the phrases, the research team, led by University College London neuroscientist Inga Usher, explained in their paper: “The phrase ‘It’s not rocket science’ is thought to have originated in America in the 1950s when German rocket scientists were brought over to support the developing space program and design of military rockets.”

“The origin of ‘It’s not brain surgery’ is less clear. It is tempting to speculate that the pioneering techniques of the polymath and neurosurgeon Harvey Cushing captured the attention of the public and promulgated the phrase.”

Their study aimed to settle the debate once and for all, and to “provide rocket scientists and brain surgeons with evidence to support their self-assuredness in the company of the other party.” The researchers tested participants across cognitive domains such as emotional discrimination and motor control. Eschewing an overall winner, they assessed the cognitive characteristics of each specialty using a validated online test, the Great British Intelligence Test (GBIT). This test had been used to measure distinct aspects of human cognition, spanning planning and reasoning, working memory, attention, and emotion processing abilities in more than 250 000 members of the British public. Rather than being an IQ test, it is intended to more finely discriminate aspects of cognitive ability. The dataset also let the researchers benchmark both specialties against the general population.

The neurosurgeons showed significantly higher scores than the aerospace engineers in semantic problem solving (possibly attributable to their familiarity with Latin and Greek scientific terminology). Aerospace engineers showed significantly higher scores in mental manipulation and attention. Domain scores for memory, spatial problem solving, problem solving speed, and memory recall speed were similar for both groups. When each group’s scores for the six domains were compared with those in the general population, only two differences were significant: the neurosurgeons’ problem solving speed was quicker and their memory recall speed was slower. No significant difference was found between aerospace engineers and the control population in any of the domains. 
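The benchmarking step can be pictured, in simplified form, as converting each group member’s domain score to a z-score against general-population norms and testing whether the group mean differs from zero. The scores and norms below are invented purely for illustration; the study’s actual statistical analysis was more involved.

```python
# Illustrative benchmarking of a group's domain scores against population norms.
# All numbers are hypothetical; this is not the study's data or exact analysis.
import numpy as np
from scipy import stats

pop_mean, pop_sd = 100.0, 15.0  # assumed population norms for one cognitive domain
group_scores = np.array([112, 105, 98, 120, 108, 101, 95, 117])  # fake group data

z = (group_scores - pop_mean) / pop_sd      # each member's z-score vs the population
t_stat, p_val = stats.ttest_1samp(z, 0.0)   # does the group differ from the norm?

print(f"mean z = {z.mean():.2f}, t = {t_stat:.2f}, p = {p_val:.3f}")
```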

The researchers observed that, “despite the stereotypes depicted by the phrases ‘It’s not rocket science’ and ‘It’s not brain surgery’, all three groups showed a wide range of cognitive abilities. In the original GBIT, 90% of Britons scored above average on at least one aspect of intelligence, illustrating the importance of studying multiple domains that make up a concept of intelligence rather than a single measure.”

The researchers came to the conclusion that, based on the findings, in situations that do not require rapid problem solving, it might be more correct to use the phrase “It’s not brain surgery”. It is possible that both neurosurgeons and aerospace engineers are unnecessarily placed on a pedestal and that “It’s a walk in the park” or another phrase unrelated to careers might be more appropriate. Other specialties might deserve to be on that pedestal, and future work should aim to determine the most deserving profession.

On a more serious note, they also considered that fewer young people are choosing surgery or engineering as a career path, and that such pursuits are commonly seen as ‘masculine’, deterring many females at an early stage. Their results, however, showed that neither field differed significantly in cognitive aspects from the general public, which should help reassure future candidates that there is no ‘requirement’ for any type of personality trait.

Source: The British Medical Journal

Just Ten Minutes of Running Boosts Cognitive Function

Photo by Ketut Subiyanto on Pexels

Researchers have found that a mere ten minutes of running at moderate intensity boosts blood flow to the bilateral prefrontal cortex, improving cognitive function and mood. These findings, published in Scientific Reports, may contribute to the development of a wider range of treatment recommendations to benefit mental health.

A large body of evidence shows that physical activity has many benefits, such as the ability to lift mood, but previous studies have often examined cycling as the form of exercise. Running, however, has always played an important role in human well-being. The unique form and efficiency of human running, including the ability to sustain this form of exertion (ie, by jogging as opposed to sprinting), is closely linked to human evolutionary success.

Despite this fact, researchers had not yet looked closely at the effects of running on brain regions that control mood and executive functions. “Given the extent of executive control required in coordinating balance, movement, and propulsion during running, it is logical that there would be increased neuronal activation in the prefrontal cortex and that other functions in this region would benefit from this increase in brain resources,” explained senior author Professor Hideaki Soya at the University of Tsukuba, Japan.

To test their hypothesis, the research team used the well-established Stroop Colour–Word Test and measured haemodynamic changes associated with brain activity while participants were engaged in each task. For example, in one task, incongruent information is shown, eg the word ‘red’ written in green, and the participant must name the colour rather than read out the word. To do so, the brain must process both sets of information and inhibit the extraneous information. The Stroop interference effect was quantified as the difference between response times for this task and those for a simpler version of the task – stating the names of colour swatches.
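In code, the interference measure described above is simply the difference between mean response times in the incongruent colour-word condition and the plain colour-naming condition. The response times below are made up purely for illustration.

```python
# Stroop interference = mean RT (incongruent colour-word) - mean RT (colour naming).
# Response times (in ms) are hypothetical, for illustration only.
import statistics

rt_incongruent = [712, 698, 745, 730, 688]    # e.g. the word "red" printed in green
rt_colour_naming = [531, 550, 512, 540, 525]  # naming plain colour swatches

interference = statistics.mean(rt_incongruent) - statistics.mean(rt_colour_naming)
print(f"Stroop interference effect: {interference:.1f} ms")
```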

The results show that, after ten minutes of moderate-intensity running, there was a significant reduction in Stroop interference effect time. Furthermore, bilateral prefrontal activation had significantly increased during the Stroop task and participants also reported being in a better mood. “This was supported by findings of coincident activations in the prefrontal cortical regions involved in mood regulation,” noted first author Chorphaka Damrongthai.

Given that many characteristics of the human prefrontal cortex are uniquely human, this study not only sheds light on the present benefits of running but also on the possible role that these benefits may have played in the evolutionary past of humans.

Source: EurekAlert!