Category: Lab Tests and Imaging

Low Power Mode for MRIs Could Cut Energy Use by Half

Credit: Pixabay CC0

Medical centres could save energy and reduce expenses by turning MRIs off and putting them into their lowest power mode, rather than leaving them idling, when not in use, according to a new Radiology study.

Health care is responsible for up to 4.4% of global carbon emissions, and imaging contributes an outsized share due to its energy-intensive devices, especially MRI. A 2020 study found that three CTs and four MRIs used the same amount of energy per year as a town of 852 people, for example.

Though turning a machine off is better than idling, a substantial amount of MRI energy consumption occurs in “off” mode, which still draws a constant level of power for cooling. To address this, a new “power save” mode was developed that saves even more energy than the “off” mode by cycling cooling components on and off.

UC San Francisco researchers sought to compare energy consumption in the “idle,” “off” and “power save” modes. They found that turning off MRIs overnight for 12 hours reduced their energy use by 25–33%, and that enabling the additional “power save” mode while the machine was off lowered power use by a further 22–28%. Switching from idle directly to “power save” decreased energy use by 46–51%.

While just one company currently offers the “power save” mode while machines are off, it’s a design strategy worth replicating, the study noted.

“The results of this study demonstrate the potential energy and cost savings any radiology practice can obtain by using these simple power-down methods,” said assistant professor Sean Woolen, MD, first author on the study. “Our goal was to find ways for radiology departments worldwide to reduce their collective environmental footprint.”

Imaging has become increasingly central to medical decision-making, so it’s imperative to evaluate the design and operations of these machines in order to decarbonise health care, added Woolen.

Health Care Industry Would Save Millions

The study was made possible thanks to an academic-industry partnership comprising UCSF, Siemens Healthineers, Siemens USA, and Siemens Smart Infrastructure. Siemens provided technology and funding to equip MRI machines with power meters and install power monitoring software, and UCSF performed data collection and analysis.

The researchers equipped four outpatient MRI scanners from three different vendors with power meters and examined data over 39 days. They calculated energy consumption, costs (assuming a mean cost of $0.14 per kilowatt hour), and carbon emissions.

On an annual basis, switching a scanner from “idle” mode to “off” mode for 12 hours saved 12.3 to 21 megawatt hours (MWh) of electricity, where one megawatt hour equals 1000 kilowatt hours. This translated to annual savings of $1717 to $2943 and 8.7 to 14.9 metric tonnes of CO2 equivalent (MTCO2eq), a metric used to compare emissions of greenhouse gases based on their potential to contribute to global warming.

Switching from “off” to “power save” mode reduced energy use by an additional 8.8 to 11.4 MWh, saving a further $1226 to $1594 and 6.2 to 8.1 MTCO2eq per year.
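The dollar figures follow directly from the energy savings and the assumed electricity rate. A minimal sketch of the arithmetic (an illustration, not code from the study; it differs from the published dollar figures by a few dollars because the paper rounds its reported values):

```python
# Mean electricity cost assumed in the study
RATE_USD_PER_KWH = 0.14

def annual_cost_savings(mwh_saved: float) -> float:
    """Convert annual energy savings in MWh to US dollars (1 MWh = 1000 kWh)."""
    return mwh_saved * 1000 * RATE_USD_PER_KWH

# "Idle" -> "off" savings of 12.3 to 21 MWh per year:
print(annual_cost_savings(12.3))  # 1722.0, close to the reported $1717
print(annual_cost_savings(21.0))  # 2940.0, close to the reported $2943
```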

“Often when we talk about how to decarbonise, solutions seem out of reach, but this initiative is proof that innovators everywhere can have impact,” added Barbara Humpton, CEO of Siemens USA. “The technology to decarbonise is here and ours is hard at work, helping industries like health care uncover ways to be more efficient and take concrete action to meet their carbon-reduction targets.”

Adopting this technique as an industry standard would not affect patient care and would be an effective strategy to reduce cost and carbon emissions in health care, added Woolen.

Measuring Tissue Stiffness with Ultrasound Yields Sharper Images

Researchers have developed a new ultrasound method that for the first time can measure the level of tension in human tissue – a key indicator of disease. The breakthrough, published in the journal Science Advances, could be used to build new ultrasound machines that are able to better discriminate between abnormal tissue, scarring, and cancer.

Images produced by the ultrasound techniques currently used in healthcare aren’t usually enough to diagnose whether tissues are abnormal. To improve diagnosis, the researchers developed a way to measure forces such as tension by using an ultrasound machine. Tension is generated in all living tissue, so measuring it can indicate whether tissue is functioning properly or if it’s affected by disease.

The researchers harnessed a technique from a rail project at the University of Sheffield, which uses sound waves to measure tension along railway lines. The technique, used both for rail and medical ultrasound, relies on a simple principle: the greater the tension, the faster sound waves propagate. Using this principle, the researchers developed a method that sends two sound waves in different directions. The tension is then related to the speed of the waves by using mathematical theories developed by the researchers.
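The same principle appears in its simplest form in the classical taut-string relation, v = √(T/μ): waves travel faster along a string under greater tension, so measuring the speed recovers the tension. The sketch below illustrates only this textbook analogue, not the researchers’ acoustoelastic theory for soft tissue; the function names and numbers are ours.

```python
import math

def wave_speed(tension: float, linear_density: float) -> float:
    """Taut-string relation: wave speed v = sqrt(T / mu)."""
    return math.sqrt(tension / linear_density)

def tension_from_speed(speed: float, linear_density: float) -> float:
    """Inverting the relation: measuring speed recovers tension, T = mu * v**2."""
    return linear_density * speed ** 2

# Greater tension -> faster waves, and the measurement inverts cleanly:
v = wave_speed(tension=100.0, linear_density=0.01)  # 100.0 m/s
assert abs(tension_from_speed(v, 0.01) - 100.0) < 1e-9
```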

Previous ultrasound methods have struggled to distinguish stiff tissue from tissue under tension. The new technique is the first capable of measuring tension in any type of soft tissue, without prior knowledge of the tissue’s properties. In this new paper, the researchers describe the method and demonstrate how they used it to measure tension inside a muscle.

Study leader Dr Artur Gower, Lecturer in Dynamics at the University of Sheffield, said: “When you go to the hospital, a doctor might use an ultrasound device to create an image of an organ, such as your liver, or another part of your body, such as the gut, to help them explore what the cause of a problem might be. One of the limitations of ultrasounds used in healthcare now is that the image alone is not enough to diagnose whether any of your tissues are abnormal.

“What we’ve done in our research is develop a new way of using ultrasound to measure the level of tension in tissue. This level of detail can tell us whether tissues are abnormal or if they are affected by scarring or disease. This technique is the first time that ultrasound can be used to measure forces inside tissue, and it could now be used to build new ultrasound machines capable of diagnosing abnormal tissue and disease earlier.”

Source: University of Sheffield

Sex Differences in Brain Glycogen After a Stroke may Yield New Treatments

Credit: American Heart Association

Although males and females are equally impacted by stroke, there are differences in recovery. Since oestrogen and progesterone have known neuroprotective effects, it is important to gauge their effects on stroke recovery. In a paper published in IBRO Neuroscience Reports, researchers have discovered differences between biomarkers such as glycogen levels in the brains of male and female mice.

“A stroke is caused by a loss of blood flow to brain cells. Without urgent intervention this may cause those cells to die because they constantly need energy and nutrients from the blood,” said Prof Nicole Sylvain, clinical research coordinator and lab manager at the University of Saskatchewan.

Sylvain and her colleagues are looking at treatments for post-stroke recovery that help supplement these energy losses. Using the Canadian Light Source (CLS) at the University of Saskatchewan (USask), the team was able to identify energy biomarkers in the brain, which could eventually inform clinicians about the effects of potential stroke treatments on brain recovery after a stroke.

The group’s recent study examined post-stroke differences between male and female mice, and found that female mice have higher amounts of glycogen in their brains. When the supply of glycogen is disrupted by stroke, the brain is severely impacted.

Most pre-clinical stroke research has been performed using male lab animals, with results usually generalised to both sexes. In clinical stroke cases, females have a higher incidence of ischaemic stroke and poorer outcomes compared to males.

“We found that, for the most part, male data can be generalised for females, however, some of the metabolic markers we measured were actually different,” Sylvain said. “It’s really important to do the research on both sexes.”

It would have been impossible for the team to detect the biomarkers without access to the Mid-IR beamline.

“The only way to detect them in such an accurate way across the brain is with infrared imaging, so the CLS has been absolutely vital to our research.”

Source: University of Saskatchewan

A Stool Sample Could Detect Some Parkinson’s Cases Early

Old man with magnifying glass
Image by Mar Lezhava on Unsplash

One early indicator of Parkinson’s disease (PD) is isolated REM-sleep behaviour disorder. Researchers have shown that a greater concentration of α-synuclein aggregates can be detected in the stool samples of patients. In the scientific journal npj Parkinson’s Disease, they now present a method for detecting these aggregates.

There are two forms of PD. In 70% of cases, it originates in the central nervous system. However, in around 30% of cases it originates in the nervous system of the intestine (“enteric nervous system”). The latter form is referred to as “body-first Parkinson’s disease” (for short: body-first PD) and the characteristic deposits of aggregates of the body’s own α-synuclein protein are formed in the neurons in the intestine.

A preliminary form of body-first PD is the so-called isolated REM-sleep behaviour disorder (iRBD for short), in which patients experiencing vivid and disturbing dreams act them out during REM sleep, sometimes with complex movements. These movements can endanger the sufferers themselves or others.

A research team headed by Professor Erdem Gültekin Tamgüney from the Institute of Physical Biology at HHU now reports that it is possible to detect an elevated level of α-synuclein aggregates in the stool samples of patients. To achieve this, the team used a new surface-based fluorescence intensity distribution analysis (sFIDA) to detect and quantify individual particles of α-synuclein aggregates.

Professor Tamgüney: “We are the first to prove the presence of α-synuclein aggregates in stool samples. Our results show a significantly higher level of α-synuclein aggregates in iRBD patients compared with healthy individuals or patients with Parkinson’s. These findings could lead to a non-invasive diagnostic tool for prodromal synucleinopathies — including Parkinson’s — which could in turn enable therapies to be initiated at an early stage before symptoms occur.” However, more research is required before the process can find its way into clinical practice, for example investigation into why the level is lower in Parkinson’s patients.

The study was conducted as a collaboration to establish a biobank of stool samples from patients and control subjects, develop the test procedure, conduct the tests on the samples, and eventually commercialise the technique.

Background

In body-first PD, the deposits of fibrils of the body’s own α-synuclein protein, which are characteristic of Parkinson’s, are first formed in the neurons of the enteric nervous system, which serves the gastrointestinal tract. The aggregates then spread to the central nervous system in a way similar to prions, i.e. an existing aggregate combines individual α-synuclein proteins in its vicinity into further aggregates in a nucleation process; these aggregates then spread further through the body.

The influence of what happens in the gastrointestinal tract on the brain is referred to as the “gut-brain axis.” The gastrointestinal tract is exposed to the environment and it is possible that harmful substances such as chemicals, bacteria or viruses ingested directly with food or via interaction with the microbiome of the gastrointestinal tract may trigger the pathological formation of α-synuclein aggregates.

Source: Heinrich-Heine University Duesseldorf

Patients ‘Don’t Need to be Checked for Everything’, Recommendation Says

Blood samples
Photo by National Cancer Institute on Unsplash

Commonly ordered tests can provide early warning of underlying disease, but they can also create unnecessary risks: false positive results, patient anxiety, wasted time and money, and the hazards of invasive follow-up testing.

Therefore, to combat commonly ordered – but not always necessary – procedures and tests, the Society of General Internal Medicine (SGIM) on Tuesday released its revised list of recommendations on five primary care procedures and tests that patients and physicians should question.

Northwestern University’s Dr Jeffrey A. Linder and David Liss, who have previously published research on the benefits of primary care checkups, helped revise the list.

For instance, the age-old idea of getting an annual physical exam with “routine blood tests” from a primary care doctor is a misconception because a person’s age and other risk factors should influence how frequently they should see their doctor, Linder said.

“We often have patients come in asking us to ‘check me for everything,’ but this is a potentially anxiety-provoking, dangerous thing for patients because the more testing we do, the more stuff we find, and the more we need to follow up,” said Linder, chief of the division of general internal medicine at Northwestern University Feinberg School of Medicine and a Northwestern Medicine physician. “In someone who is asymptomatic, an ‘abnormality’ is much more likely to be a false positive or of no clinical significance than for us to catch early disease.

“False positives can expose patients to all of the anxiety, costs, hassle and time commitment, and danger from sometimes invasive testing, with a very low likelihood that it is going to improve their health.”

This isn’t to say nobody should get a checkup every year. For instance, patients who have overdue preventive services, rarely see their primary care physician, have low self-rated health and/or are aged 65 or older should get an annual checkup, the scientists said.

The newly revised list is part of SGIM’s Choosing Wisely campaign, which is an initiative of the American Board of Internal Medicine Foundation. SGIM members originally selected the topics in 2013 and later updated the list in 2017.

The list generated controversy when it was first developed in 2013, recalls Linder.

“The list was widely misinterpreted as ‘specialty society says you don’t need to see your doctor,’ but that was not what it said,” Linder said.

Time and downstream financial costs also are issues of these commonly ordered but oftentimes unnecessary tests and procedures, Liss said.

“Patients and care teams often spend valuable time on low-value checkups that could have been devoted to high-need patients,” said Liss, research associate professor of general internal medicine at Feinberg. “There also is the overall increase in costs to the health system. And even if annual checkups are covered by most insurance, patients often have copays for services like blood draws and other diagnostic tests.”

The revised list was developed after months of careful consideration and review, using the most current evidence about management and treatment options. Linder and Liss served as ad hoc members of the SGIM’s Choosing Wisely Working Group.

Here are the five recommendations, based on a review of the most recent studies in the field:

  1. Don’t recommend daily home glucose monitoring in patients with Type 2 diabetes mellitus not using insulin.
  2. Don’t perform routine annual checkups unless patients are likely to benefit; the frequency of checkups should be based on individual risk factors and preferences. During checkups, don’t conduct comprehensive physical exams or routine lab testing.
  3. Don’t perform routine pre-operative testing before low-risk surgical procedures.
  4. Don’t recommend cancer screening in adults with life expectancy of less than 10 years.
  5. Don’t place, or leave in place, peripherally inserted central catheters for patient or provider convenience.

Source: Northwestern University

A Severe Form of Dementia may in Fact be Caused by a Cerebrospinal Fluid Leak

MRI images of the brain
Photo by Anna Shvets on Pexels

A new study suggests that some patients diagnosed with behavioural-variant frontotemporal dementia (bvFTD) – a presently incurable, mentally debilitating condition – may instead have a cerebrospinal fluid leak, which is detectable on MRI scans and often treatable. The researchers say these findings, published in the peer-reviewed journal Alzheimer’s & Dementia: Translational Research and Clinical Interventions, could lead to a cure.

“Many of these patients experience cognitive, behavioural and personality changes so severe that they are arrested or placed in nursing homes,” said Wouter Schievink, MD, professor of Neurosurgery at Cedars-Sinai. “If they have behavioural-variant frontotemporal dementia with an unknown cause, then no treatment is available. But our study shows that patients with cerebrospinal fluid leaks can be cured if we can find the source of the leak.”

When cerebrospinal fluid (CSF) leaks into the body, the brain can sag, causing dementia symptoms. Schievink said many patients with brain sagging, detectable on MRI, go undiagnosed, and he advises clinicians to take a second look at patients with telltale symptoms.

“A knowledgeable radiologist, neurosurgeon or neurologist should check the patient’s MRI again to make sure there is no evidence for brain sagging,” Schievink said.

Clinicians can also ask about a history of severe headaches that improve when the patient lies down, significant sleepiness even after adequate night-time sleep, and whether the patient has ever been diagnosed with a Chiari brain malformation, a condition in which brain tissue extends into the spinal canal. Brain sagging, Schievink said, is often mistaken for a Chiari malformation.

Even when brain sagging is detected, the source of a CSF leak can be difficult to locate. When the fluid leaks through a tear or cyst in the surrounding membrane, it is visible on CT myelogram imaging with the aid of contrast medium.

Schievink and his team recently discovered an additional cause of CSF leak: the CSF-venous fistula. In these cases, the fluid leaks into a vein, making it difficult to see on a routine CT myelogram. To detect these leaks, technicians must use a specialized CT scan and observe the contrast medium in motion as it flows through the cerebrospinal fluid.

In this study, investigators used this imaging technique on 21 patients with brain sagging and symptoms of bvFTD, and they discovered CSF-venous fistulas in nine of those patients. All nine patients had their fistulas surgically closed, and their brain sagging and accompanying symptoms were completely reversed.

“This is a rapidly evolving field of study, and advances in imaging technology have greatly improved our ability to detect sources of CSF leak, especially CSF-venous fistula,” said Keith L. Black, MD, chair of the department of Neurosurgery at Cedars-Sinai. “This specialised imaging is not widely available, and this study suggests the need for further research to improve detection and cure rates for patients.”

The remaining 12 study participants, whose leaks could not be identified, were treated with nontargeted therapies designed to relieve brain sagging, such as implantable systems for infusing the patient with CSF. However, only three of these patients experienced relief from their symptoms.

“Great efforts need to be made to improve the detection rate of CSF leak in these patients,” Schievink said. “We have developed nontargeted treatments for patients where no leak can be detected, but as our study shows, these treatments are much less effective than targeted, surgical correction of the leak.”

Source: Cedars-Sinai Medical Center

A Quick Scan Can Pinpoint Hypertension-causing Adrenal Nodules

Stethoscope
Photo by Hush Naidoo on Unsplash

Doctors have demonstrated a new type of CT scan that lights up the tiny adrenal gland nodules which give rise to hypertension in about 5% of hypertensive patients, enabling the hypertension to be cured by removing the nodules.

Published in The Journal of Hypertension, this work solves a 60-year problem of how to detect the hormone-producing nodules without a difficult and failure-prone catheter study that is available in only a few hospitals. The research also found that, when combined with a urine test, the scan detects a group of patients who come off all their blood pressure medicines after treatment.

The study, led by doctors at Queen Mary University of London and Barts Hospital, and Cambridge University Hospital, involved 128 participants for whom hypertension was found to be caused by aldosterone. The scan found that in two thirds of patients with elevated aldosterone secretion, this is coming from a benign nodule in just one of the adrenal glands, which can then be safely removed. The scan uses a very short-acting dose of metomidate, a radioactive dye that sticks only to the aldosterone-producing nodule.

The scan was as accurate as the old catheter test, but quick, painless and technically successful in every patient. Until now, the catheter test was unable to predict which patients would be completely cured of hypertension by surgical removal of the gland. By contrast, the combination of a ‘hot nodule’ on the scan and a urine steroid test detected 18 of the 24 patients who achieved a normal blood pressure off all their drugs.

Professor Morris Brown, co-senior author of the study and Professor of Endocrine Hypertension at Queen Mary University of London, said: “These aldosterone-producing nodules are very small and easily overlooked on a regular CT scan. When they glow for a few minutes after our injection, they are revealed as the obvious cause of hypertension, which can often then be cured. Until now, 99% are never diagnosed because of the difficulty and unavailability of tests. Hopefully this is about to change.”

In most people with hypertension, the cause is unknown, and the condition requires life-long treatment by drugs. Previous research by the group at Queen Mary University discovered that in 5–10% of people with hypertension the cause is a gene mutation in the adrenal glands, which results in excessive amounts of the steroid hormone, aldosterone, being produced. Aldosterone causes salt retention, driving up blood pressure. Patients with excessive aldosterone levels in the blood are resistant to treatment with standard antihypertensives, and at increased risk of cardiovascular disease.

Source: Queen Mary University of London

Global Medical Isotope Shortage to Ease with Renewed Production

Radiation warning sign
Photo by Vladyslav Cherkasenko on Unsplash

Amid the ongoing global shortage of medical isotopes, there is at least some good news: two European research reactors have been fired up again and will be delivering molybdenum-99 and iodine-131 isotopes. In addition, a new reactor to produce Mo-99 through a new method has also been completed in the US and is awaiting testing and certification.

Mo-99 is the world’s most important medical diagnostic radioisotope precursor, and is the parent isotope of technetium-99m (Tc-99m). Tc-99m is used in more than 40 million diagnostic procedures each year. The production of this isotope is acutely vulnerable to supply chain disruption, and much of the machinery used to produce it is ageing. South African nuclear corporation NTP also produces a small amount of the isotope locally at its Pelindaba facility.

Nuclear Medicine Europe (NMEU) was notified that the LVR-15 reactor resumed operations on the morning of Friday November 18, and the first irradiated targets from it were processed on November 23. In addition, NMEU was notified that the HFR reactor resumed operations on November 23 and achieved full power operation at 14:30 CET.

The Mo-99 global supply situation will largely return to normal within the next 7-10 days with the I-131 supply situation returning to normal within two weeks, according to NMEU’s predictions. NMEU will provide further communication to the nuclear medicine community as developments warrant.

At the new production facility in the US, the isotope manufacturer NorthStar will produce Mo-99 through a new method, based on irradiation of molybdenum-100 targets using electron accelerators. This will be the first facility in the world to produce commercial-scale Mo-99 using this technology. The facility also includes new, high-capacity equipment for processing and packaging Mo-99 for distribution to radiopharmacies and hospitals.

What’s Really in that Tattoo Ink?

Photo by benjamin lehman on Unsplash

After testing nearly 100 tattoo inks, researchers reported that, even when the ink bottles had ingredient labels, the ingredients listed on them were often inaccurate. The team also detected small particles that could be harmful to cells.

In the US, the Food and Drug Administration regulates tattoo ink, but in South Africa, tattoo ink is imported largely unregulated.

The researchers presented their findings at a meeting of the American Chemical Society (ACS). 

“The idea for this project initially came about because I was interested in what happens when laser light is used to remove tattoos,” said lead researcher John Swierk, PhD. “But then I realised that very little is actually known about the composition of tattoo inks, so we started analysing popular brands.”

Tattoo artists interviewed about the inks they use on their customers could quickly identify a brand they preferred, but they didn’t know much about its contents. “Surprisingly, no dye shop makes pigment specific for tattoo ink,” Dr Swierk explained. “Big companies manufacture pigments for everything, such as paint and textiles. These same pigments are used in tattoo inks.” He also notes that tattoo artists must be licensed in the locales where they operate for safety reasons, yet no federal or local agency regulates the contents of the inks themselves.

Tattoo inks are made up of a pigment and a carrier solution. The pigment could be a molecular compound such as a blue pigment; a solid compound such as titanium dioxide, which is white; or a combination of the two compound types such as light blue ink, which contains both the molecular blue pigment and titanium dioxide. The carrier solution transports the pigment to the middle layer of skin and typically helps make the pigment more soluble, sometimes controlling the viscosity of the ink solution and perhaps containing an anti-inflammatory ingredient.

Dr Swierk’s team has been studying the particle size and molecular composition of tattoo pigments. From these analyses, they have confirmed the presence of ingredients that aren’t listed on some labels. For example, in one case ethanol was not listed, but the chemical analysis showed it was present in the ink. The team has also been able to identify what specific pigments are present in some inks.

“Every time we looked at one of the inks, we found something that gave me pause,” Dr Swierk said. “For example, 23 of 56 different inks analysed to date suggest an azo-containing dye is present.” Although many azo pigments are not health concerns when they are chemically intact, bacteria or UV light can degrade them into another nitrogen-based compound that is a potential carcinogen, according to the Joint Research Centre, which provides independent scientific advice to the European Union.

In addition, the team has analysed 16 inks using electron microscopy, and about half contained particles under 100nm. “That’s a concerning size range,” said Dr Swierk. “Particles of this size can get through the cell membrane and potentially cause harm.”

After the researchers run a few more tests and have the data peer reviewed, they will add the information to their website “What’s in My Ink?” “With these data, we want consumers and artists to make informed decisions and understand how accurate the provided information is,” said Dr Swierk.

Source: American Chemical Society

Low Serum Urate Increases Sarcopenia Risk

Blood sample being drawn
Photo by Hush Naidoo Jade Photography on Unsplash

Adults with low blood levels of urate, the end-product of the purine metabolism in humans, may be at higher risk of sarcopenia and may face a higher risk of early death, according to a new study published in Arthritis & Rheumatology.

Whether or not low serum urate (SU) levels contribute to adverse outcomes has been the subject of controversy. The study involved 13 979 participants aged 20 years and older, sourced from the National Health and Nutrition Examination Survey from 1999–2006.

Low serum urate concentrations (<2.5 mg/dL in women; <3.5 mg/dL in men) were associated with low lean mass, underweight BMI (<18.5 kg/m2), and higher rates of weight loss. While low SU was associated with increased mortality (61%) before adjusting for body composition, its effect was reduced and non-significant after adjustment for body composition and weight loss.
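The study’s sex-specific cutoffs can be expressed as a simple check (a sketch for illustration; the function name is ours, while the thresholds come from the study):

```python
def is_low_serum_urate(su_mg_dl: float, sex: str) -> bool:
    """Low serum urate per the study's cutoffs:
    <2.5 mg/dL for women, <3.5 mg/dL for men."""
    threshold = 2.5 if sex == "female" else 3.5
    return su_mg_dl < threshold

# Example: 3.0 mg/dL is classed as low for a man but not for a woman.
assert is_low_serum_urate(3.0, "male")
assert not is_low_serum_urate(3.0, "female")
```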

“These observations support what many have intuited, namely that people with low serum urate levels have higher mortality and worse outcomes not because low urate is bad for health, but rather that low urate levels tend to occur among sicker people, who have lost weight and have adverse body composition,” explained lead author Joshua F. Baker, MD, MSCE, of the University of Pennsylvania. “While this observational study doesn’t disprove a causal association, it does suggest that great care is needed in interpreting epidemiologic associations between urate levels and health outcomes.”

Source: Wiley