Tag: 22/2/22

Injectable Nanoparticles That Could Slow Internal Bleeding

Photo by Camilo Jimenez on Unsplash

Researchers at MIT have found the ideal size for injectable nanoparticles that could slow traumatic internal bleeding, buying more time for a patient to reach a hospital for further treatment.

In a rat study, the researchers showed that polymer nanoparticles in an intermediate size range (about 150nm in diameter) were the most effective at stopping bleeding. These particles were also much less likely to travel to the lungs or other off-target sites, as larger particles often do. The results were published in ACS Nano.

“With nano systems, there is always some accumulation in the liver and the spleen, but we’d like more of the active system to accumulate at the wound than at these filtration sites in the body,” said senior author Paula Hammond, Professor at MIT.

Nanoparticles that can stop bleeding, also called haemostatic nanoparticles, can be made in a variety of ways. One of the most commonly used strategies is to create nanoparticles made of a biocompatible polymer conjugated with a protein or peptide that attracts platelets, the blood cells that initiate blood clotting.

In this study, the researchers used a polymer known as PEG-PLGA, conjugated with a peptide called GRGDS, to make their particles. Most of the previous studies of polymeric particles to stop bleeding have focused on particles ranging in size from 300–500nm. However, few, if any, studies have systematically analysed how size affects the function of the nanoparticles.

“We were really trying to look at how the size of the nanoparticle affects its interactions with the wound, which is an area that hasn’t been explored with the polymer nanoparticles used as haemostats before,” said Celestine Hong, an MIT graduate student and the study’s lead author.

Studies in animals have shown that larger nanoparticles can help to stop bleeding, but those particles also tend to accumulate in the lungs, which can cause unwanted clotting there. In the new study, the MIT team analysed a range of nanoparticles, including small (< 100nm), intermediate (140–220nm), and large (500–650nm).

They first analysed the nanoparticles in the lab to see how well they interacted with and bound to platelets under various conditions. They found that, when flowing through a tube, the smallest particles bound best to platelets, while the largest particles stuck best to surfaces coated with platelets. However, the intermediate-sized particles showed the lowest ratio of particles to platelets.

“If you attract a bunch of nanoparticles and they end up blocking platelet binding because they clump onto each other, that is not very useful. We want platelets to come in,” said Hong. “When we did that experiment, we found that the intermediate particle size was the one that ended up with the greatest platelet content.”

The researchers injected the different size classes of nanoparticles into mice to see how long they would circulate and where they would end up in the body. As in previous studies, the largest nanoparticles tended to accumulate in the lungs or other off-target sites.

The researchers then used a rat model of internal injury to study which particles would be most effective at stopping bleeding. They found that the intermediate-sized particles appeared to work the best, and that those particles also showed the greatest accumulation rate at the wound site.

“This study suggests that the bigger nanoparticles are not necessarily the system that we want to focus on, and I think that was not clear from the previous work. Being able to turn our attention to this medium-size range can open up some new doors,” Prof Hammond said.

The researchers now hope to test these intermediate-sized particles in larger animal models, to get more information on their safety and the most effective doses. They hope that eventually, such particles could be used as a first line of treatment to stop bleeding from traumatic injuries long enough for a patient to reach the hospital.

Source: Massachusetts Institute of Technology

Vegetable Intake Does Not Reduce Cardiovascular Risk, Study Finds

Photo by Daria Shevtsova from Pexels

A long-term study of almost 400 000 people in the UK finds little or no evidence that differences in the amount of vegetables consumed affect the risk of cardiovascular disease.

When known socio-economic and lifestyle confounding factors are corrected for, the small apparent protective effect that remains could likely also be explained by further confounders.

Getting enough vegetables is important for maintaining a balanced diet and avoiding a wide range of diseases. But might a diet rich in vegetables also lower the risk of cardiovascular disease (CVD)? Unfortunately, new results from a powerful, large-scale study in Frontiers in Nutrition found no evidence for this.

The notion of CVD risk being lowered by vegetable consumption might seem plausible at first, as constituents such as carotenoids and alpha-tocopherol (vitamin E) have properties that could protect against CVD. But prior evidence for an overall effect of vegetable consumption on CVD has been inconsistent.

The study, which drew on UK Biobank data, found that higher consumption of cooked or uncooked vegetables is unlikely to affect the risk of CVD. The study authors also explained how confounding factors might account for previous spurious positive findings.

“The UK Biobank is a large-scale prospective study on how genetics and environment contribute to the development of the most common and life-threatening diseases. Here we make use of the UK Biobank’s large sample size, long-term follow-up, and detailed information on social and lifestyle factors, to assess reliably the association of vegetable intake with the risk of subsequent CVD,” said Prof Naomi Allen, UK Biobank’s chief scientist and co-author on the study.

The UK Biobank follows the health of half a million adults in the UK by linking to their healthcare records. Upon their enrolment in 2006–2010, these volunteers were interviewed about their diet, lifestyle, medical and reproductive history, and other factors.

The researchers used the responses at enrolment of 399 586 participants (of whom 4.5% went on to develop CVD) to questions about their daily average consumption of uncooked versus cooked vegetables. They analysed the association with the risk of hospitalisation or death from myocardial infarction, stroke, or major CVD. They controlled for a wide range of possible confounding factors, including socio-economic status, physical activity, and other dietary factors.

Crucially, the researchers also assessed the potential role of ‘residual confounding’, that is, whether unknown additional factors or inaccurate measurement of known factors might lead to a spurious statistical association between CVD risk and vegetable consumption.

The mean daily intake of total, raw, and cooked vegetables was 5.0, 2.3, and 2.8 heaped tablespoons per person, respectively. The risk of dying from CVD was about 15% lower for those with the highest vegetable intake compared to the lowest. However, this effect was greatly weakened when possible confounding factors were taken into account. Controlling for factors such as socio-economic status reduced the predictive statistical power of vegetable intake on CVD by over 80%, suggesting that more precise measures of these confounders would have explained away any residual effect of vegetable intake.
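To illustrate how this kind of attenuation can arise, here is a minimal simulation sketch, assuming a single stand-in confounder and a simple logistic model rather than the study's actual UK Biobank data or Cox analysis (all variable names and numbers below are hypothetical):

```python
# Minimal illustration (simulated data, NOT the UK Biobank analysis):
# a confounder (a stand-in for socio-economic status) drives both
# vegetable intake and CVD risk, so an unadjusted model shows a spurious
# "protective" effect that largely disappears after adjustment.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 100_000

ses = rng.normal(size=n)                      # hypothetical confounder
veg = 5 + ses + rng.normal(size=n)            # intake rises with SES
risk = 1 / (1 + np.exp(-(-3 - 0.5 * ses)))    # CVD risk falls with SES only
cvd = rng.binomial(1, risk)

unadjusted = sm.Logit(cvd, sm.add_constant(veg)).fit(disp=False)
adjusted = sm.Logit(cvd, sm.add_constant(np.column_stack([veg, ses]))).fit(disp=False)

print("veg coefficient, unadjusted:", round(unadjusted.params[1], 3))  # clearly negative
print("veg coefficient, adjusted:  ", round(adjusted.params[1], 3))    # near zero
```

In the simulated data, vegetable intake appears protective until the confounder is included, after which its coefficient shrinks towards zero, mirroring the attenuation the authors describe.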

Dr Qi Feng, the study’s lead author, said: “Our large study did not find evidence for a protective effect of vegetable intake on the occurrence of CVD. Instead, our analyses show that the seemingly protective effect of vegetable intake against CVD risk is very likely to be accounted for by bias from residual confounding factors, related to differences in socioeconomic situation and lifestyle.”

The researchers suggest that subsequent studies should further assess whether particular types of vegetables or their method of preparation might affect the risk of CVD.

Source: Frontiers

No Difference Between Paramedics’ Advanced Airway Management Strategies

Photo by Mat Napo on Unsplash

Similar outcomes were seen for patients with out-of-hospital cardiac arrest (OHCA) regardless of the advanced airway management strategy used by paramedics, results from the Taiwanese SAVE trial showed.

There was generally no difference in clinical outcomes between groups assigned to an initial strategy of endotracheal intubation or supraglottic airway device insertion:

Sustained return of spontaneous circulation (ROSC) two hours after resuscitation: 26.9% vs 25.8%; survival to hospital discharge: 8.5% vs 8.4%; cerebral performance category score ≤ 2: 3.9% vs 4.8%.

Only prehospital ROSC suggested an advantage to standard endotracheal intubation (10.6% vs 6.4%), according to the researchers, whose study was published in JAMA Network Open.

Endotracheal intubation is a difficult procedure to get right. The SAVE paramedics, all experienced in both methods of advanced airway management, employed direct laryngoscopy and achieved a 77% rate of first-attempt airway success with endotracheal intubation (vs 83% with the supraglottic device). Average scene time (18.4 vs 16.9 minutes) and call-to-airway time (15.9 vs 13.9 minutes) were both longer with endotracheal intubation.

“It is unclear whether a stepwise and algorithmic endotracheal intubation training program could reduce the time in the field and the time for advanced airway insertion, and further research is warranted,” the authors said.

For the SAVE trial, conducted from 2016 to 2019, researchers randomly split four EMS teams in Taipei into two clusters, each assigned to initial endotracheal intubation or supraglottic i-gel device insertion when responding to OHCAs, with assignments alternating on a biweekly basis. If the first advanced airway attempt failed, rescue airway management using a number of techniques was allowed.

The 936 OHCA patients in the study had a median age of 77 years, and 60.8% were men.

Subgroup analysis, however, showed that prehospital ROSC rates favoured endotracheal intubation in patients with nonshockable rhythm, nonpublic collapse, witnessed arrest, call-to-airway time under 14 minutes, and age 77 years or older.

Different in-hospital management between the groups could have affected the results. The two study arms were also unequal in size, and the study could have been underpowered because of an inaccurate sample size estimate at the outset. The researchers noted, however, that “even if we had realised that the sample size was inadequate at that time, we would not have been able to recruit more cases because of the outbreak of COVID.”

Source: MedPage Today

Political Factors Drove Hydroxychloroquine and Ivermectin COVID Prescriptions

Photo by Andy Feliciotti on Unsplash

Hydroxychloroquine and ivermectin, two drugs shown to be ineffective as COVID treatments, were more heavily prescribed in the second half of 2020 in parts of the US that voted for the Republican party, according to a new research letter published in JAMA Internal Medicine.

“We’d all like to think of the health care system as basically non-partisan, but the COVID pandemic may have started to chip away at this assumption,” said lead author Michael Barnett, assistant professor of health policy and management.

The study compared prescription rates for hydroxychloroquine and ivermectin with rates for two control medications, methotrexate sodium and albendazole, which are similar drugs that have not been proposed as COVID treatments. Comparing US counties, researchers looked at deidentified medical claims data from January 2019 through December 2020 for roughly 18.5 million adults, as well as census and voting data.
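As a rough sketch of the kind of county-level comparison described above (toy data and hypothetical column names, not the study's actual claims dataset or statistical methods), one might tabulate prescribing rates for a target drug and its control by county vote-share quartile:

```python
# Toy illustration of a county-level comparison: prescribing rates for a
# target drug vs its control medication, grouped by Republican vote share.
import pandas as pd

claims = pd.DataFrame({
    "county": ["A", "A", "B", "B", "C", "C", "D", "D"],
    "drug": ["hydroxychloroquine", "methotrexate"] * 4,
    "rx_per_100k": [25, 12, 40, 13, 55, 11, 70, 12],   # hypothetical rates
})
votes = pd.DataFrame({
    "county": ["A", "B", "C", "D"],
    "republican_vote_share": [0.30, 0.45, 0.60, 0.75],  # hypothetical shares
})

df = claims.merge(votes, on="county")
df["vote_quartile"] = pd.qcut(df["republican_vote_share"], 4, labels=False)

# Mean prescribing rate per drug within each vote-share quartile;
# with real data one would also compare 2019 vs mid-to-late 2020 periods.
summary = df.groupby(["drug", "vote_quartile"])["rx_per_100k"].mean().unstack()
print(summary)
```

If the target drug's rate rises with vote share while the control drug's rate stays flat, that is the pattern the research letter reports for mid-2020 onwards.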

Overall, hydroxychloroquine prescribing volume from June through December 2020 was roughly double what it had been in the previous year, while the volume of ivermectin prescriptions was seven-fold higher in December 2020 than the previous year. In 2019, prescribing of hydroxychloroquine and ivermectin did not differ according to county Republican vote share. However, that changed in 2020.

After June 2020 – coinciding with when the US Food and Drug Administration revoked emergency use authorisation for hydroxychloroquine – prescribing volume for the drug was significantly higher in counties with the highest Republican vote share as compared to counties with the lowest vote share.

As for ivermectin, prescribing volume was significantly higher in the highest versus lowest Republican vote share counties in December 2020, a 964% increase on the overall prescribing volume in 2019. The spike lined up with a number of key events, such as the mid-November 2020 release of a now-retracted manuscript claiming that the drug was highly effective against COVID, and a widely publicised US Senate hearing in early December that included testimony from a doctor promoting ivermectin as a COVID treatment.

Neither of the control drugs showed differences in overall prescribing volume or in prescribing by county Republican vote share.

The authors concluded that the prescribing of hydroxychloroquine and ivermectin may have been influenced by physician or patient political affiliation. “This is the first evidence, to our knowledge, of such a political divide for a basic clinical decision like infection treatment or prevention,” said Barnett.

Source: Harvard T.H. Chan School of Public Health

Endocrine-disrupting Chemicals Present in Many Pregnancies

Photo by Shvets Productions on Pexels

Researchers in Europe have shown that up to 54% of pregnant women in Sweden were exposed to complex mixtures of endocrine-disrupting chemicals that can interfere with brain development.

While current risk assessment tackles chemicals and their allowable exposures on an individual basis, these findings show the need to take mixtures into account for future risk assessment approaches. The study was published in Science.

A growing body of evidence has shown that industrially produced chemicals have endocrine-disrupting properties and can thus be dangerous to human and animal health and development. A huge number of new compounds are released into the environment every year during the production of plastic derivatives and other goods.

While exposure to each individual chemical may fall below its threshold, exposure to the same chemicals in complex mixtures can still affect human health. However, all current exposure thresholds are based on chemicals being examined individually. An alternative strategy therefore needed to be tested, in which the actual mixtures measured in real-life exposures could be assessed as such in both epidemiological and experimental settings. The EDC-MixRisk project set out to tackle this unmet need.
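To see why individually "safe" exposures can still add up, consider a generic hazard-index style calculation, a common screening approach for mixtures. This is only an illustrative sketch, not the risk-assessment principles developed by EDC-MixRisk, and the chemical names and numbers below are hypothetical:

```python
# Generic hazard-index illustration (hypothetical values, not study data):
# each chemical is below its individual reference level (hazard quotient < 1),
# yet the summed hazard index for the mixture exceeds 1.
exposures = {          # measured exposure, e.g. ug per kg body weight per day
    "phthalate_A": 8.0,
    "bisphenol_A": 1.5,
    "PFOS": 0.6,
}
reference_levels = {   # individual "acceptable" levels, same units
    "phthalate_A": 20.0,
    "bisphenol_A": 4.0,
    "PFOS": 1.5,
}

hazard_quotients = {c: exposures[c] / reference_levels[c] for c in exposures}
hazard_index = sum(hazard_quotients.values())

for chem, hq in hazard_quotients.items():
    print(f"{chem}: HQ = {hq:.2f}")                      # each below 1.0
print(f"Mixture hazard index = {hazard_index:.2f}")       # above 1.0
```

Each chemical on its own passes the individual check, yet the combined index exceeds the threshold, which is the gap in single-chemical risk assessment that the project set out to address.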

“The uniqueness of this comprehensive project is that we have linked population data with experimental studies, and then used this information to develop new methods for risk assessment of chemical mixtures,” said Carl-Gustaf Bornehag, professor at Karlstad University, Project Manager of the SELMA study.

The study was conducted in three steps:

  1. A mixture of chemicals measured in the blood and urine of pregnant women in the Swedish pregnancy cohort SELMA was identified as being associated with delayed language development in children at 30 months. This critical mixture included a number of phthalates, bisphenol A, and perfluorinated chemicals.
  2. Experimental studies uncovered the molecular targets through which human-relevant levels of this mixture disrupted the regulation of endocrine circuits and of genes involved in autism and intellectual disability.
  3. The findings from the experimental studies were used to develop new principles for risk assessment of this mixture.

“It is striking that the findings in the experimental systems well reflected what we found in the epidemiological part, and that the effects could be demonstrated at normal exposure levels for humans,” said Joëlle Rüegg, professor of environmental toxicology at Uppsala University.

“Human brain organoids (advanced in vitro cultures that reproduce salient aspects of human brain development) afforded, for the first time, the opportunity to directly probe the molecular effects of this mixture on human brain tissue at stages matching those measured during pregnancy. Alongside other experimental systems and computational methods, we found that the mixture disrupts the regulation of genes linked to autism (one of whose hallmarks is language impairment), hinders the differentiation of neurons and alters thyroid hormone function in neural tissue,” said Giuseppe Testa, principal investigator of the EDC-MixRisk responsible for the human experimental modelling.

“One of the key hormonal pathways affected was thyroid hormone. Optimal levels of maternal thyroid hormone are needed in early pregnancy for brain growth and development, so it’s not surprising that there is an association with language delay as a function of prenatal exposure,” said Barbara Demeneix, professor of physiology and endocrinology at the Natural History Museum in Paris.

By combining these techniques, the researchers were able to show that 54% of children included in the SELMA study were at risk of delayed language development (at age 30 months), as they were prenatally exposed to a mixture of chemicals at levels above those predicted to affect neurodevelopment. Yet the exposure to each individual chemical in the mixture fell below its respective limit.

Source: EURION Cluster