
Gravity-powered Biomedical Devices Pull Droplets Through a Maze


Biomedical engineers at Duke University have developed an entirely new approach to building point-of-care diagnostic devices that only use gravity to transport, mix and otherwise manipulate the liquid droplets involved. The demonstration, in the journal Device, requires only commercially available materials and very little power to read results, making it a potentially attractive option for applications in low-resource settings.

“The elegance in this approach is all in its simplicity – you can use whatever tools you happen to have to make it work,” said Hamed Vahabi, a former postdoctoral researcher at Duke. “You could theoretically even just use a handsaw and cut the channels needed for the test into a piece of wood.”

The study was conducted in the laboratory of Ashutosh Chilkoti, the Alan L. Kaganov Distinguished Professor of Biomedical Engineering at Duke.

There is no shortage of need for simple, easy-to-use, point-of-care devices. Many demonstrations and commercial devices seek to make diagnoses or measure important biomarkers using only a few drops of liquid with as little power and expertise required as possible. Their goal is to improve health care for the billions of people living in low-resource settings far from traditional hospitals and trained clinicians.

All of these tests have the same basic requirements: they must move, mix and measure small droplets containing biological samples and the active ingredients that make measuring specific biomarkers possible. More expensive examples use tiny electrical pumps to drive these reactions. Others use the physics of liquids within microchannels (microfluidics) to create a sort of suction effect.

Each approach offers uniquely useful abilities as well as drawbacks; the new device is the first demonstration to rely on gravity alone.

“Most microfluidic devices need more than just capillary forces to operate,” Chilkoti said. “This approach is much simpler and also allows very complex fluid paths to be designed and operated, which is not easy or cheap to do with microfluidics.”

The new gravity-driven approach relies on a set of nine commercially available surface coatings that can tweak the wettability and slipperiness at any given point on the device. That is, they can adjust how much droplets flatten down into pancakes or remain spherical while making it easier or harder for them to slide down an incline.

Used together in clever combinations, these surface coatings can create all the microfluidic elements needed in a point-of-care test. For example, if a droplet is placed at an extremely slippery intersection where one side pulls liquid flat and the other pushes it into a ball, the junction acts like a pump, accelerating the droplet toward the flattening side.

“We came up with many different elements to control the motion, interaction, timing and sequence of multiple droplets in the device,” Vahabi said. “All of these phenomena are well-known in the field, but nobody thought of using them to control the motion of droplets in a systematic way before.”

By combining these elements, the researchers created a prototype test to measure the levels of lactate dehydrogenase (LDH) in a sample of human serum. They carved channels within the test platform to create specific pathways for droplets to travel, each coated with a substance that stops the droplets from sticking along their journey. They also primed specific locations with dried reagents needed for the test, which are soaked up by droplets of simple buffer solution as they travel through.

The whole maze-like test is then capped with a lid containing a couple of holes where the sample and buffer solution are dripped in. Once loaded, the test is placed inside a box-like device with a handle that turns the test 90° to allow gravity to do its work. This device is also equipped with a simple LED and light detector that can quickly and easily detect the amount of blue, red, or green in the test results. This means that the researchers can tag three different biomarkers with different colours for various tests to measure.

In the case of this prototype LDH test, the biomarker is tagged with a blue molecule. A simple microcontroller measures how deep a blue hue the test results take on and how quickly the colour changes, which together indicate the amount and concentration of LDH in the sample.
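In outline, that readout amounts to tracking the blue channel over time and converting the rate of colour change into a concentration. The Python sketch below illustrates the idea with a simulated sensor; the calibration constants and signal model are assumptions for illustration, not the study's implementation.

```python
import random

# Assumed calibration mapping rate of colour change to LDH activity; these
# constants are illustrative, not taken from the paper.
RATE_TO_LDH = 1500.0   # (U/L) per (intensity unit per second), assumed
BLANK_OFFSET = 5.0     # assumed blank reading, in U/L

def read_blue_intensity(t_s: float) -> float:
    """Stand-in for a microcontroller read of the detector's blue channel."""
    return 0.02 * t_s + random.gauss(0.0, 0.001)  # simulated deepening blue signal

def estimate_ldh(duration_s: float = 60.0, step_s: float = 1.0) -> float:
    """Estimate LDH from the rate at which the droplet's blue hue deepens."""
    times = [i * step_s for i in range(int(duration_s / step_s))]
    readings = [read_blue_intensity(t) for t in times]
    # Least-squares slope of intensity vs time, i.e. the rate of colour change.
    n = len(times)
    mt, mi = sum(times) / n, sum(readings) / n
    slope = sum((t - mt) * (r - mi) for t, r in zip(times, readings)) / \
            sum((t - mt) ** 2 for t in times)
    return RATE_TO_LDH * slope + BLANK_OFFSET

print(f"Estimated LDH: {estimate_ldh():.0f} U/L")
```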

“We could eventually also use a smartphone down the line to measure results, but that’s not something we explored in this specific paper,” said Jason Liu, a PhD candidate in the Chilkoti lab.

The demonstration provides a new approach for consideration when engineering inexpensive, low-power, point-of-care diagnostic devices. While the group plans to continue developing their idea, they also hope others will take notice and work on similar tests.

“While a well-designed microfluidic system can be fully automated and easy-to-use by passive means, the timing of discrete steps is usually programmed into the design of the device itself, making modifications to protocol more difficult,” added David Kinnamon, a PhD candidate in the Chilkoti group. “In this work, the user retains more control of the timing of steps while only modestly sacrificing ease-of-operation. Again, this is an advantage for more complex protocols.”

Source: Duke University

Low Serum Urate Increases Sarcopenia Risk

Blood sample being drawn

Adults with low blood levels of urate, the end-product of purine metabolism in humans, may be at higher risk of sarcopenia and of early death, according to a new study published in Arthritis & Rheumatology.

Whether or not low serum urate (SU) levels contribute to adverse outcomes has been the subject of controversy. The study involved 13 979 participants aged 20 years and older, sourced from the National Health and Nutrition Examination Survey from 1999–2006.

Low serum urate concentrations (<2.5 mg/dL in women; <3.5 mg/dL in men) were associated with low lean mass, underweight BMI (<18.5 kg/m²), and higher rates of weight loss. Low SU was associated with a 61% increase in mortality before adjusting for body composition, but the effect was attenuated and non-significant after adjustment for body composition and weight loss.
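For concreteness, the study's definition of low serum urate reduces to a sex-specific threshold check, as in the Python sketch below; the function name and interface are invented for the example.

```python
def has_low_serum_urate(su_mg_dl: float, sex: str) -> bool:
    """Apply the study's sex-specific cutoffs for low serum urate."""
    threshold = 2.5 if sex.lower() == "female" else 3.5  # mg/dL, per the study
    return su_mg_dl < threshold

print(has_low_serum_urate(2.3, "female"))  # True: below the 2.5 mg/dL cutoff
print(has_low_serum_urate(3.8, "male"))    # False: above the 3.5 mg/dL cutoff
```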

“These observations support what many have intuited, namely that people with low serum urate levels have higher mortality and worse outcomes not because low urate is bad for health, but rather that low urate levels tend to occur among sicker people, who have lost weight and have adverse body composition,” explained lead author Joshua F. Baker, MD, MSCE, of the University of Pennsylvania. “While this observational study doesn’t disprove a causal association, it does suggest that great care is needed in interpreting epidemiologic associations between urate levels and health outcomes.”

Source: Wiley

New Biosensor Rapidly Measures ATP and Lactate in Blood Samples

The prototype of the ATP and lactate sensor developed in the study (left); and the integrated sensor chip that detects ATP and lactate levels (right). Credit: Akihiko Ishida, Hokkaido University

Scientists at Hokkaido University have developed a prototype sensor that could help doctors rapidly measure levels of adenosine triphosphate (ATP) and lactate in blood samples from patients, aiding in the rapid assessment of the severity of conditions such as sepsis.

The scientists detailed their prototype biosensor in the journal Biosensors and Bioelectronics.

ATP is a molecule found in every living cell that stores and carries energy. In red blood cells, ATP is produced by a biochemical pathway called the Embden–Meyerhof pathway. Severe illnesses such as multiple organ failure, sepsis and influenza reduce the amounts of ATP produced by red blood cells.

As such, the severity of these illnesses could be gauged by monitoring the amounts of ATP and lactates in a patient’s blood. “In 2013, our co-authors at Tokushima University proposed the ATP-lactate energy risk score (A-LES) for measuring ATP and lactate blood levels to assess acute influenza severity in patients,” explained Akihiko Ishida, an applied chemist at Hokkaido University. “However, current methods to measure these levels and other approaches for measuring disease severity can be cumbersome, lengthy or not sensitive enough. We wanted to develop a rapid, sensitive test to help doctors better triage their patients.”

The researchers developed a biosensor that can detect levels of ATP and lactate in blood with high sensitivity in as little as five minutes. The process is straightforward. Chemicals are added to a blood sample to extract ATP from red blood cells. Enzymes and substrates are then added to convert ATP and lactate to the same product, which can be detected by specially modified electrodes on a sensor chip; the more of this product present in the sample, the larger the electrical current measured.

Schematic representation of the proposed sensor for sequentially detecting ATP and lactate levels in the blood. Through a series of chemical reactions, ATP and lactate are converted to hydrogen peroxide, the breakdown of which into water (H2O) causes the sensor chip to generate a signal that is detected by the sensor.
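Amperometric sensors of this kind are typically read out against a calibration curve: known standards establish how current scales with concentration, and an unknown sample is then mapped back through that fit. The Python sketch below shows the idea; the standards and fitted values are invented for illustration and are not the study's data.

```python
import numpy as np

# Hypothetical calibration standards: concentration (mmol/L) vs current (µA).
conc = np.array([0.0, 0.5, 1.0, 2.0, 4.0])
current = np.array([0.02, 0.55, 1.04, 2.10, 4.05])

# Fit the inverse relation directly: current -> concentration.
slope, intercept = np.polyfit(current, conc, 1)

def current_to_concentration(i_ua: float) -> float:
    """Convert a measured electrode current (µA) to concentration (mmol/L)."""
    return slope * i_ua + intercept

print(round(current_to_concentration(1.5), 2))  # concentration for a 1.5 µA reading
```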

The team conducted parallel tests and found that other components present in blood, such as ascorbic acid, pyruvic acid, adenosine diphosphate (ADP), urate and potassium ions, don’t interfere with the ability of the electrodes to accurately detect ATP and lactate. They also compared their sensor with those currently available and found it allowed for the relatively simple and rapid measurement of the two molecules.

“We hope our sensor will enable disease severity monitoring and serve as a tool for diagnosing and treating patients admitted to intensive care units,” said Ishida.

The researchers plan to further simplify the measurement process by integrating an ATP extraction method into the chip itself, as well as reducing the size of the sensor system.

Source: Hokkaido University

Lab Results are Influenced by Ambient Daily Temperatures


Ambient temperature influences many common lab tests, and these distortions likely affect medical decision making, such as whether to prescribe medications, according to new research published in the journal Med.

To account for this, the researchers suggest that laboratories could statistically adjust for ambient temperature on test days when reporting lab results.

“When a doctor orders a laboratory test, she uses it to shed light on what’s going on inside your body, but we wondered if the results of those tests could also reflect something that’s going on outside of your body,” said study co-author Ziad Obermeyer of the University of California, Berkeley. “This is exactly the kind of pattern that doctors might miss. We’re not looking for it, and lab tests are noisy.”

Delving into this problem, Obermeyer and Devin Pope of the University of Chicago analysed a large dataset of test results from different climates. In a sample of more than four million patients, they modelled more than two million test results based on temperature, measuring how day-to-day temperature fluctuations influenced results over and above patients’ average values and seasonal variation.

Temperature was found to affect more than 90% of individual tests and 51 of 75 assays, including measures of kidney function, cellular blood components, and lipids such as cholesterol and triglycerides. “It’s important to note that these changes were small: less than one percent differences in most tests under normal temperature conditions,” Obermeyer said.

These small fluctuations likely did not reflect long-term physiological trends. For example, lipid panels checked on cooler days appeared to suggest a lower cardiovascular risk: patients tested on the coolest days received almost 10% fewer prescriptions for cholesterol-lowering drugs called statins than patients tested on the warmest days, even though the results likely did not reflect stable changes in cardiovascular risk.

Since the study wasn’t an experiment, the exact mechanisms underlying the fluctuations in lab results could not be pinpointed. However, blood volume, specific assay performance, specimen transport, or changes in lab equipment might explain them. “Whatever their cause, temperature produces undesirable variability in at least some tests, which in turn leads to distortions in important medical decisions,” Pope said.

Laboratories could get around this by statistically adjusting for ambient temperature on the test day when reporting lab results. This could be a way to reduce weather-related variability without expensive temperature control equipment. 
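As a rough illustration of what such an adjustment could look like, the Python sketch below fits the relationship between test-day temperature and a lab value on synthetic data, then re-centres each result to a reference temperature. The data, the linear model and the reference temperature are all assumptions for illustration, not the authors' method.

```python
import numpy as np

rng = np.random.default_rng(0)
temp_c = rng.uniform(-5, 35, 1000)                    # ambient temperature on test day
ldl = 130 - 0.08 * temp_c + rng.normal(0, 10, 1000)   # synthetic lab values with a small temperature effect

beta, alpha = np.polyfit(temp_c, ldl, 1)              # estimated per-degree temperature effect

REFERENCE_TEMP_C = 20.0

def adjust(value: float, temp: float) -> float:
    """Report the result as if it had been measured at the reference temperature."""
    return value - beta * (temp - REFERENCE_TEMP_C)

print(round(adjust(128.0, 2.0), 1))  # a result from a cold day, re-centred
```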

In practice, decisions on adjustment would need to be at the discretion of the laboratory staff and the treating physician, potentially on a case-by-case basis.

According to the authors, the study may also have broader clinical implications. “The textbook way of thinking about medical research is bench to bedside. First, we come up with a hypothesis, based on theory, then we test it with data,” Obermeyer said. “As more and more big data comes online, like the massive dataset of lab tests we used, we can flip that process on its head: discover fascinating new patterns and then use bench science to get to the bottom of it. I think this bedside-to-bench model is just as important as its better-known cousin because it can open up totally new questions in human physiology.”

Source: Science Daily

New HPV Test Enables Precision Treatment


Researchers have made advances in improving detection of the human papillomavirus (HPV) in the bloodstream, which could further hone precision treatment of HPV-related cancers.

The team sequenced circulating tumour DNA, which can lead to the detection of HPV in a person’s blood. Previous science in the field has proven that the virus, which causes cancers in the throat, mouth, and genital areas, can be found in the bloodstream, but tests have had limited sensitivity. The new study enables ‘ultrasensitive’ detection, which could pave the way toward greater use of precision medicine for patients with cancers affecting these vulnerable areas of the body.

In a cohort of patients with advanced cervical cancer, the new sequencing method detected 20-fold lower levels of HPV circulating tumour DNA than previous approaches, making it a promising new method to monitor the disease.

The results come from the laboratory of Senior Scientist Dr Scott Bratman at Princess Margaret Cancer Centre and are published in Clinical Cancer Research. “Increasingly, as clinicians we’re focused on precision medicine and making sure we’re not over-treating people while still curing them; that’s a very difficult balance to strike,” Dr Bratman said.

One way is to use liquid biopsy approaches or blood-based biomarkers, such as circulating tumour DNA, in order to monitor how the treatment is progressing, he added.

“We’re really at the cusp of a revolution from a technology, clinical implementation and standard of care standpoint, where five to 10 years from now we will not be treating everybody with the same dose of radiation and chemotherapy, and then waiting months to see if the treatment was effective,” he said. “I’m confident we will be giving much more tailored doses.”

When physicians scale back on these treatments, there is a risk of the cancer recurring. With more sensitive tests, recurrences can be detected early and patients returned to treatment.

“Patients who need more treatment will then be able to continue on, or different treatments can be added,” Dr Bratman said. “We can spare the vast majority of patients who will not need those interventions and provide them with a greater quality of life once they’re cured of the cancer.”

The work will enable further study in the field, refining the approach using larger study groups, and eventually, practice-changing clinical trials. The technique could also be applied to cancers caused by other viruses, such as certain types of stomach cancer and lymphoma.

Source: Princess Margaret Cancer Centre

New Test Makes Prostate Cancer Screening More Affordable


Researchers have found that the novel Stockholm3 blood test, coupled with MRI, could greatly cut overdiagnosis and thereby improve prostate cancer screening. The same research group previously showed that magnetic resonance imaging (MRI) could reduce overdiagnoses, and the Stockholm3 test can reduce the number of MRIs performed by a third while further preventing the detection of minor, low-risk tumours.

The research group published the findings of their study in The Lancet Oncology.

“Overall, our studies show that we have identified the tools needed to be able to carry out effective and safe screening for prostate cancer. After many years of debate and research, it feels fantastic to be able to present knowledge that can improve healthcare for men,” said Tobias Nordström, associate professor of urology at the Department of Clinical Sciences, Danderyd Hospital at Karolinska Institutet, who is responsible for the STHLM3MRI study.

The disease is currently screened for using PSA (prostate-specific antigen) tests combined with traditional biopsies, which results in unnecessary biopsies and overdiagnosis through the detection of numerous minor, low-risk tumours. Because these costs outweigh the benefits, no country save Lithuania has implemented a nationwide screening programme.

Results from the STHLM3MRI study published in NEJM indicated that overdiagnosis could be reduced by substituting traditional prostate biopsies with magnetic resonance imaging (MRI) and targeted biopsies. The new results, now published in The Lancet Oncology, show that the addition of the Stockholm3 test, which was developed by researchers at Karolinska Institutet, can be an important complement. It is a blood test that uses an algorithm to analyse a combination of protein markers, genetic markers and clinical data.
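The Stockholm3 algorithm itself is not public, but the general pattern of combining markers and clinical data into a single risk score is often a logistic model along the lines of the Python sketch below. Every feature name and weight here is invented for illustration.

```python
import math

def risk_score(features: dict, weights: dict, intercept: float) -> float:
    """Combine marker and clinical features into a 0-1 risk via a logistic model."""
    z = intercept + sum(weights[name] * value for name, value in features.items())
    return 1 / (1 + math.exp(-z))

# Hypothetical weights and patient values, for illustration only.
weights = {"psa": 0.9, "protein_marker": 0.6, "genetic_score": 0.4, "age": 0.03}
patient = {"psa": 1.2, "protein_marker": 0.8, "genetic_score": -0.5, "age": 63}
print(round(risk_score(patient, weights, intercept=-4.0), 3))
```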

Fewer biopsies needed
“The availability of MRI in healthcare will be a limiting factor. We now show that a novel blood test as adjunct to MRI can reduce the number of MRIs performed by a third. Compared with traditional screening, overdiagnosis is reduced by as much as 69 percent. At the same time, the number of biopsies is halved, while we can find just as many clinically significant tumours,” said Martin Eklund, associate professor at the Department of Medical Epidemiology and Biostatistics, Karolinska Institutet.

In the STHLM3MRI study, 12 750 male participants provided an initial blood sample for PSA analysis and analysis using the new Stockholm3 test. Men with test results showing elevated PSA levels were then randomly assigned to traditional biopsies or MRI. In the MRI group, biopsies were conducted strictly on suspected tumours identified by MRI.

“Separate use of the Stockholm3 test and MRI has previously been shown to be cost-effective. We have now analysed the cost-effectiveness when these tools are combined and will shortly report exciting results from that analysis,” Tobias Nordström concluded.

Source: Karolinska Institute

‘Vast Majority’ of Urine Tests Before Planned Surgery Unnecessary


“The vast majority” of urine tests conducted prior to scheduled surgeries to check for infections “were not plausibly indicated,” according to US researchers in a study of claims data.

Though the individual tests were inexpensive at $17 each, over the study’s 11-year duration they came to $50 million, plus another $5 million for antibiotics prescribed to patients with no clinical signs of infection.

“Patients and society bear the risk of inappropriate antibiotic use, which can result in adverse drug reactions, increased risk of infections such as Clostridioides difficile, and emergence of antibiotic resistance,” wrote authors Erica Shenoy, MD, PhD, of Massachusetts General Hospital in Boston, and two colleagues in a JAMA Internal Medicine research letter, published in the journal’s ‘Less Is More’ series which highlights overused tests and treatments.

Preprocedural urinalyses were once routinely done to check for infections that could increase complication risk. However, studies have since shown that such testing rarely improves outcomes or changes clinical management. Organisations such as the Infectious Diseases Society of America and the US Preventive Services Task Force have recommended against testing and prescribing for asymptomatic infections except in certain narrow indications.

To see just how common the practice has been, the researchers used data on some 13 million procedures performed from 2007 to 2017 from Medicare and the IBM Watson Marketscan database of commercial insurance claims, spanning 14 specialties. The researchers did not count kidney and urological surgeries since urinalysis is recommended by guidelines for most such procedures.

For the remaining procedures, urinalysis was deemed appropriate when urinary tract symptoms, fever, or altered mental state were mentioned in the claims codes. Without those codes, the tests were “not plausibly indicated.”
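In claims-data terms, the rule reduces to checking each urinalysis claim for an accompanying indication code, as in the Python sketch below; the code set and table layout are illustrative stand-ins, not the study's actual definitions.

```python
import pandas as pd

# Example indication codes (assumed): urinary tract symptoms, fever, altered mental state.
INDICATION_CODES = {"R30.0", "R50.9", "R41.82"}

claims = pd.DataFrame({
    "procedure_id": [1, 2, 3],
    "had_urinalysis": [True, True, False],
    "dx_codes": [{"R50.9"}, set(), {"I10"}],
})

claims["indicated"] = claims["dx_codes"].apply(lambda codes: bool(codes & INDICATION_CODES))
not_plausibly_indicated = claims[claims["had_urinalysis"] & ~claims["indicated"]]
print(not_plausibly_indicated["procedure_id"].tolist())  # [2]
```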

While 75% of surgeries in the data did not involve preprocedural urinalysis, suggesting good adherence, in the 25% that did, fully 89% across all types of surgery had no apparent indication, with the lowest non-indicated testing rate being 84%.

The results show that traditional practice patterns “remain entrenched”, according to the researchers, who called on insurers to be more aggressive in denying claims for unneeded testing.

Limitations included incomplete patient data as patients may have had legitimate indications for testing and antibiotic prescriptions that were not recorded with the relevant diagnostic codes. Also, about half of the 11-year study period preceded the movement to limit ‘low-value’ testing.

Source: MedPage Today

Protein Markers Distinguish Between Stable and Progressive Leukaemia


Scientists have identified protein markers which are related to the most common form of leukaemia.

Chronic lymphocytic leukaemia (CLL) is the most common leukaemia in the Western world. A new study published in the Journal of Leukocyte Biology shows that certain protein markers may indicate which patients have stable forms of CLL and which have more aggressive types.

Identifying these proteins may not only help determine patients’ prognoses but also point to potential therapeutic targets for investigators who are searching for new CLL treatments.

The study examined the proteomic profile of CLL B-cells from untreated CLL patients to see which biologic processes are affected early on and during disease evolution to stable or progressive forms. Of the 11 patients included in the study, six evolved to progressive and five to stable disease. Purified B cells from the patients were tested at two time points by liquid chromatography–tandem mass spectrometry.

First, at an early stage of the disease (Binet stage A), samples were separated into stable and progressive clusters based on the relative abundance levels of 389 differentially expressed proteins (DEPs), with the main differentiating factor being the RNA splicing pathway.
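The clustering step can be pictured as unsupervised grouping of samples by their protein-abundance profiles, as in the Python sketch below on synthetic data; the matrix, group shifts and method choice are assumptions for illustration, not the study's pipeline.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(1)
# 11 samples x 389 proteins: two synthetic groups with shifted mean abundances.
stable = rng.normal(0.0, 1.0, size=(5, 389))
progressive = rng.normal(0.8, 1.0, size=(6, 389))
abundances = np.vstack([stable, progressive])

tree = linkage(abundances, method="ward")             # hierarchical clustering
clusters = fcluster(tree, t=2, criterion="maxclust")  # cut into two clusters
print(clusters)  # expected to mirror the stable vs progressive split
```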

An RNA-Seq study was then conducted, which showed 4217 differentially spliced genes between the two clusters. Distinct longitudinal evolutions were observed, with predominantly proteomic modifications in the stable CLL group and spliced genes in the progressive CLL group. Splicing events were shown to be six times more frequent in the progressive CLL group.

The main aberrant biologic processes controlled by DEPs and spliced genes in the progressive group were cytoskeletal organisation, Wnt/β-catenin signaling, and mitochondrial and inositol phosphate metabolism with a downstream impact on CLL B-cell survival and migration. 

The study suggests that proteomic profiles of early stage CLL can discriminate progressive from stable disease. Furthermore, it appears RNA splicing dysregulation underlies CLL evolution, opening new avenues for biomarkers and therapy.

“The results offer a meaningful biological approach into the protein composition of CLL cells at an early stage of the disease, when the clinical characteristics of patients are similar and the course of the disease is difficult to predict. Our results showed that the protein profile can however predict how the disease will further evolve,” said lead author Cristina Bagacean, PhD, of CHU de Brest, in France. “This approach could identify putative therapeutic targets in order to prevent CLL progression.”

Source: Wiley

Journal information: Bagacean, C., et al. (2021) Identification of altered cell signaling pathways using proteomic profiling in stable and progressive chronic lymphocytic leukemia. Journal of Leukocyte Biology. doi.org/10.1002/JLB.4HI0620-392R.

Chemical Fingerprints Improve Stem Cell Production


Researchers in Japan have developed a new, noninvasive way to monitor the tricky art of stem cell production.

The current era of ethical stem cell research was ushered in by the 2012 Nobel prize-winning discovery that ordinary cells could be coaxed to revert to their earliest pluripotent stage. Suddenly, scientists could have an ethical, near-inexhaustible supply of pluripotent stem cells, the most versatile of stem cells, which can become any type of cell much as embryonic stem cells do.

These reprogrammed cells, called induced pluripotent stem cells (or iPS cells), hold great promise for regenerative medicine, where they can be used to develop tissue or organ replacements to treat life-threatening diseases.

One key challenge is that artificially inducing ordinary cells to reset back to pluripotency is a lengthy and delicate process, so obtaining iPS cells is partly a matter of chance. Knowing all they can about the complex chemical changes happening inside cells during reprogramming can help scientists increase the chances of successfully obtaining viable iPS cells for clinical applications. Current methods that track reprogramming status, however, use destructive and costly techniques.

A study led by Dr Tomonobu Watanabe, professor at Hiroshima University’s Research Institute for Radiation Biology and Medicine, showed that Raman spectroscopy could be a low-cost, simpler, and non-intrusive technique to monitor the cell’s internal environment as it transitions.

Dr Watanabe explained: “The quality evaluation and sorting of existing cells have been carried out by investigating the presence or absence of expression of surface marker genes. However, since this method requires a fluorescent antibody, it is expensive and causes a problem of bringing the antibody into the cells.”

He added that the “solution of these problems can accelerate the spread of safe and low-cost regenerative medicine using artificial tissues. Through our method, we provide a technique for evaluating and sorting the quality of iPS cells inexpensively and safely, based on scattering spectroscopy.”

Raman spectroscopy is an alternative to invasive approaches that require dyes or labels to extract biochemical information. It instead makes use of vibration signatures produced when light beams interact with chemical bonds in the cell. Since each chemical has its own distinct vibration frequency, scientists can use it to identify the cell’s molecular makeup.

The team used this spectroscopic technique to get the “chemical fingerprints” of mouse embryonic stem cells, the neuronal cells they specialised into, and the iPS cells formed from those neuronal cells. These data were then used to train an AI model that can track how the reprogramming is progressing and verify iPS cell quality by checking for a “fingerprint” match with the embryonic stem cell.

To measure the progress, they assigned the “chemical fingerprint” of neuronal cells as the transformation starting point and the embryonic stem cell’s patterns as the desired end goal. Along this axis, they used “fingerprint” samples collected on days 5, 10, and 20 of the neuronal cells’ reprogramming as reference points on how the process is advancing.
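One simple way to picture such an axis is to score how similar a measured spectrum is to each endpoint fingerprint, as in the Python sketch below; the spectra and the similarity-based progress measure are illustrative assumptions, not the study's actual model.

```python
import numpy as np

def cosine(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two spectra."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def reprogramming_progress(spectrum, start_ref, goal_ref) -> float:
    """0 = matches the neuronal start fingerprint, 1 = matches the ES-cell goal."""
    s_start, s_goal = cosine(spectrum, start_ref), cosine(spectrum, goal_ref)
    return s_goal / (s_start + s_goal)

rng = np.random.default_rng(2)
start_ref, goal_ref = rng.normal(size=1024), rng.normal(size=1024)     # synthetic fingerprints
day10 = 0.5 * start_ref + 0.5 * goal_ref + rng.normal(0, 0.01, 1024)   # mid-course sample
print(round(reprogramming_progress(day10, start_ref, goal_ref), 2))    # ~0.5
```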

“The Raman scattering spectrum contains comprehensive information on molecular vibrations, and the amount of information may be sufficient to define cells. If so, unlike gene profiling, it allows for a more expressive definition of cell function,” Dr Watanabe said.

“We aim to study stem cells from a different perspective than traditional life sciences.”

Source: Hiroshima University

Journal information: Germond, A., et al. (2020) Following Embryonic Stem Cells, Their Differentiated Progeny, and Cell-State Changes During iPS Reprogramming by Raman Spectroscopy. Analytical Chemistry. doi.org/10.1021/acs.analchem.0c01800.

T-Cells Could Identify ‘The Bends’ in Divers


A new study investigated genetic changes that occur in a serious condition affecting scuba divers — ‘the bends’ — and found that inflammatory genes and white blood cell activity are upregulated. The findings could lead to biomarkers that will help doctors to diagnose the condition more precisely.

The bends, more formally known as decompression sickness, is a potentially lethal condition that can affect divers. Symptoms include joint pain, a skin rash, and visual disturbances. In some patients, the condition can be severe, potentially leading to paralysis and death. The bends can also affect people working in submarines, flying in unpressurised aircraft, or performing spacewalks.

It has been studied for a long time: a 1908 paper correctly hypothesised that it involves bubbles of gas forming in the blood and tissue due to pressure decrease. Yet even after a century, the precise mechanisms underlying the condition are not well understood. Animal studies have suggested that inflammatory processes may have a role in decompression sickness, but no-one had studied this in humans.

Nowadays, getting ‘the bends’ is rare, as divers have well-established methods to mitigate risk, such as controlled ascents from the depths. Nevertheless, doctors have no means to test for the condition if they do encounter it, and instead rely on observing symptoms and seeing whether patients respond to hyperbaric oxygen therapy.

To investigate decompression sickness, the researchers sampled the blood of divers who had been diagnosed with decompression sickness and also divers who had completed a dive without it. The blood samples were drawn at two times: within 8 hours of the divers emerging from the water, and 48 hours afterwards, when those divers with decompression sickness had undergone hyperbaric oxygen treatment. RNA sequencing analysis was done to measure gene expression changes in white blood cells.
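Conceptually, the comparison boils down to contrasting gene expression between the two time points; the Python sketch below computes log2 fold changes on synthetic counts. Gene names and numbers are invented, and a real RNA-seq analysis would use a dedicated framework rather than this bare-bones calculation.

```python
import numpy as np

genes = ["IL6", "TNF", "CXCL8"]                      # illustrative cytokine genes
acute_counts = np.array([850.0, 420.0, 1300.0])      # within 8 h of surfacing (synthetic)
followup_counts = np.array([210.0, 150.0, 400.0])    # 48 h later, after hyperbaric O2 (synthetic)

# Log2 fold change with a pseudocount to avoid division by zero.
log2_fc = np.log2((acute_counts + 1) / (followup_counts + 1))
for gene, fc in zip(genes, log2_fc):
    print(f"{gene}: log2 fold change {fc:+.2f}")
```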

“We showed that decompression sickness activates genes involved in white blood cell activity, inflammation and the generation of inflammatory proteins called cytokines,” explained Dr Nikolai Pace of the University of Malta, a researcher involved in the study. “Basically, decompression sickness activates some of the most primitive body defense mechanisms that are carried out by certain white blood cells.”

These genetic changes had diminished in samples from 48 hours after the dive, after the patients had been treated with hyperbaric oxygen therapy — an interesting finding. The results provide a first step towards a diagnostic test for decompression sickness, and may also reveal new treatment targets.

“We hope that our findings can aid the development of a blood-based biomarker test for human decompression sickness that can facilitate diagnosis or monitoring of treatment response,” said Prof Ingrid Eftedal of the Norwegian University of Science and Technology, who was also involved in the project. “This will require further evaluation and replication in larger groups of patients.”

Source: EurekAlert!

Journal information: “Acute effects on the human peripheral blood transcriptome of decompression sickness secondary to scuba diving” Frontiers in Physiology, DOI: 10.3389/fphys.2021.660402