Month: February 2022

An Estimated 70% of South Africans Have Had COVID

Image by Quicknews

Writing for GroundUp, Dr Alex Welte unpacks the results of the latest blood donor survey, which suggests that some 70% of South Africans have had a COVID infection.

The South African National Blood Service (which handles the blood supply for eight provinces) and the Western Cape Blood Service have been testing some donors for Covid antibodies over the last year or so. This has contributed to our understanding of how many people have been infected by SARS-CoV-2 (the virus that causes Covid), and what proportion of infections lead to death. It may help us plan for future waves, though exactly how is complicated.

On the assumption that another wave towards the end of 2021 was nearly inevitable – but before we all heard about omicron – it was decided to perform more such testing in early November. The numbers are now out.

The headline results are:

  • Overall, about 80% of black donors and about 40% of white donors had previously had Covid.
  • There is no meaningful variation across age groups or between the sexes.
  • This latest survey did not include Western Cape data.
  • The test used does not detect the antibodies produced in response to vaccination, so this really is an estimate of people who have been infected.

While blood donors are not perfectly representative of the country’s population, we can adjust for the difference between the racial breakdown of the donor pool and that of the general population. Doing so gives a face-value national estimate that about 70% of people had been infected before the omicron wave hit.
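
In principle the adjustment is straightforward post-stratification: each group’s estimated seroprevalence is weighted by that group’s share of the national population rather than its share of the donor pool. A minimal sketch of the idea in Python (the function and its inputs are illustrative; the survey’s own calculation uses its full provincial and group-level data, which is what produces the roughly 70% figure):

```python
def poststratify(seroprevalence, population_share):
    """Weight each group's seroprevalence by its share of the national population.

    Both arguments are dicts keyed by the same group labels, e.g.
    {"black": 0.80, "white": 0.40, ...}. All values are supplied by the caller;
    nothing here is taken from the survey itself.
    """
    assert abs(sum(population_share.values()) - 1.0) < 1e-6, "population shares must sum to 1"
    return sum(prevalence * population_share[group]
               for group, prevalence in seroprevalence.items())
```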

Since then we’ve had the omicron wave. We would very much like to know how many people are infected now, but there’s really no simple way to derive this number. Researchers are now updating their models with this additional piece of data, and we may see some estimates soon.

With that caution, here is my back-of-the-envelope estimate (the arithmetic is written out after the list):

  • Omicron seems to have little trouble infecting people who have been infected by other variants, though there is some protection from prior infection and vaccination.
  • By late last year, quite a bit more than half the population had already had a prior infection.
  • Hence, I estimate that about half of the omicron wave infections were in previously uninfected individuals.
  • Given the infection detection rate estimates from previous waves, and a number of plausible sources of variation in this rate, I estimate the detection rate at about 1 in 10.
  • Given the roughly 700 000 cases reported between mid November and mid February, we get an estimate of 7 million infections, and therefore about 3.5 million infections in previously uninfected people.
  • Given our population of about 60 million, this is roughly an additional 6%.
  • Bottom line: it’s not crazy to estimate that about three-quarters of South Africans have by now been infected. But I would not be surprised if serious models come up with even higher estimates.
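
The arithmetic behind that chain of assumptions can be written out explicitly. A rough sketch using only the round numbers quoted in the list above (every input is an assumption, not a measured quantity):

```python
# Back-of-the-envelope arithmetic from the list above; all inputs are rough assumptions.
reported_cases = 700_000            # cases reported between mid November and mid February
detection_rate = 0.10               # assumed: roughly 1 in 10 infections detected
share_previously_uninfected = 0.5   # assumed: half of omicron infections hit previously uninfected people
population = 60_000_000
prior_seroprevalence = 0.70         # pre-omicron estimate from the blood donor survey

total_omicron_infections = reported_cases / detection_rate                     # about 7 million
new_first_infections = total_omicron_infections * share_previously_uninfected  # about 3.5 million
additional_share = new_first_infections / population                           # roughly 6%

print(f"Estimated ever infected: {prior_seroprevalence + additional_share:.0%}")  # about three-quarters
```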

A troubling result of the survey is that once more it shows the serious racial disparities in South Africa. I don’t know if this carried over to the omicron wave. Estimating the racial breakdown of infection after omicron depends in a complicated way on variations in housing, lifestyle, access to vaccination, and all the usual factors that shape daily life in our country.

Dr Welte helped design and implement the blood donor survey.

Source: GroundUp

Peptide Discovery Could Halt Nerve Degeneration

A healthy neuron. Credit: NIH

Promising results have been reported in the quest for a treatment to halt nerve cell degeneration in disorders like Parkinson’s disease: a particular peptide prevents the cells’ mitochondria from breaking apart.

The research, published in Brain, examined how the long axons that carry messages between nerve cells in the brain can break down, causing progressively worsening tightening of the leg muscles that leads to imbalance and eventually paralysis, in addition to other symptoms.

Animal studies have shown that it may be a problem with the mitochondria that leads to the axons breaking down or not growing long enough. Since studying human nerve cells is difficult, the researchers used human stem cells, which they modified into nerve cells carrying the genetic defect for a particular type of hereditary spastic paraplegia.

“What we found was that the mitochondria in these cells were breaking apart, what we call mitochondrial fission, and that caused the axons to be shorter and less effective at carrying messages to the brain,” study leader Prof Xue-Jun Li said. “We then looked at whether a particular agent would change the way the nerve cells function — and it did. It inhibited the mitochondrial fission and let the nerve cells grow normally and also stopped further damage.”

What this means for the thousands of people affected by this type of genetic disorder is that the peptide could prove useful as a drug or other therapy to stop the nerve cells from becoming damaged, or possibly even to reverse the course of the damage. Gene therapy could also prevent the mitochondrial damage, the researchers suggested, which would provide another strategy to reverse the nerve damage.

Source: University of Illinois Chicago

Recipients of Bionic Eyes Blindsided by Obsolescence

Source: Daniil Kuzelev on Unsplash

After the manufacturer of a bionic eye ended support for the device, hundreds of recipients of the vision-improving implants have been left stranded – “literally in the dark”, as one of them put it.

IEEE Spectrum, which first broke the story, reported that Second Sight discontinued its retinal implants – the source of artificial vision for their users – in 2019.

The publication wrote that the firm’s focus is currently on developing a brain implant known as the Orion, which also provides artificial vision. However, it offers only very limited support to the 350 or so people who have the now-obsolete Argus II implants.

The system consists of a camera mounted on glasses worn by the user, which transmits video to a video processing unit (VPU) that encodes the images into arrays of black and white pixels. The VPU then relays these pixels to an electrode array behind the retina, which creates flashes of light corresponding to the white pixels. The technology has had a long and costly road from experiment to product, starting with a lab experiment in the 1990s in which stimulation of a single electrode in the retina was found to create a visible flash of light perceived by a blind patient. It is hugely expensive, with an estimated cost of $150,000 (R2.25 million) even before the surgery and post-surgery training.
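
To illustrate the encoding step in concept only: the VPU reduces each camera frame to a coarse grid of on/off values, one per electrode (the Argus II array has 60 electrodes). A minimal downsample-and-threshold sketch of that kind of step follows; it is not Second Sight’s actual, proprietary algorithm, and the 6×10 layout and threshold are assumptions for illustration.

```python
import numpy as np

def frame_to_electrode_pattern(frame: np.ndarray, rows: int = 6, cols: int = 10,
                               threshold: float = 0.5) -> np.ndarray:
    """Reduce a greyscale camera frame (values in 0..1) to a coarse on/off grid.

    Conceptual illustration only. Each cell of the grid corresponds to one
    electrode; True means 'stimulate', which the user perceives as a flash of light.
    """
    h, w = frame.shape
    grid = np.zeros((rows, cols), dtype=bool)
    for r in range(rows):
        for c in range(cols):
            # Average brightness of the image block covered by this electrode.
            block = frame[r * h // rows:(r + 1) * h // rows,
                          c * w // cols:(c + 1) * w // cols]
            grid[r, c] = block.mean() > threshold  # bright region -> white pixel -> flash
    return grid
```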

Implantation surgery typically takes a few hours, followed by training to help users interpret the new optical input from their implants. It is not a replacement for sight; rather, it is more like a new sense. Users of the system see fleeting changes of grey which some can then use to assist with basic locomotion. However, the technology is still crude and not all users benefit to the same degree. While some can make out the stripes on a pedestrian crossing, others never achieve that level of ability.

The technology also comes with some risk: in the postapproval period, 17% of users experienced adverse events, though this was an improvement over the 40% rate in the preapproval period. And since the implant can interfere with MRI scans, some users have had to consider having it removed.

IEEE Spectrum contacted a number of patients, who voiced concern over their future. One patient, Ross Doerr, said he was delighted when Second Sight told him in 2020 he was eligible for software upgrades. Yet he had heard troubling rumours. When he called his Second Sight vision-rehab therapist, “She said, ‘Well, funny you should call. We all just got laid off,’ ” he recalled. “She said, ‘By the way, you’re not getting your upgrades.’ ”

“Those of us with this implant are figuratively and literally in the dark,” he said.

Second Sight, when contacted by the publication, said that it had to reduce its workforce because of financial difficulties, and that although it attempted to provide “virtual support”, it was unable to assist with repairs or replacements.

Benjamin Spencer, one of the six patients to receive the new Orion implant, said that it was “amazing” and that he was able to see his wife for the first time. But knowing what he does now about Second Sight makes him apprehensive, and he plans to have his implant removed at the end of the study period.

Speaking to the BBC, Elizabeth M. Renieris, professor of technology ethics at the University of Notre Dame, in the US, described the development as a cautionary tale.

“This is a prime example of our increasing vulnerability in the face of high-tech, smart and connected devices which are proliferating in the healthcare and biomedical sectors,” she said.

“These are not like off-the-shelf products or services that we can actually own or control. Instead we are dependent on software upgrades, proprietary methods and parts, and the commercial drivers and success or failure of for-profit ventures.”

She added that in future, ethical considerations concerning such technology should include “autonomy, dignity, and accountability”.

Source: IEEE Spectrum

New Blood Thinners from Tick Saliva

Source: Wikimedia CC0

Researchers looking for new anti-clotting drugs have discovered a unique class of medications, derived from genes expressed in tick saliva, that act as blood thinners by inhibiting a key clotting enzyme.

The study, published in Nature Communications, focused on novel direct thrombin inhibitors (DTIs) found in tick salivary transcriptomes – the messenger RNA molecules expressed by an organism. The research opens the way to new anticoagulant medications for treating patients with a variety of coronary problems, including heart attacks.

“Interest in ticks as a model for developing drugs that prevent blood clotting – [often] the cause of heart attacks and strokes – is firmly rooted in evolutionary biology,” said Professor Richard Becker, a co-author of the study.

“Analysis of backbone structures suggest a novel evolutionary pathway by which different blood clot inhibiting properties evolved through a series of gene duplication events. Comparison of naturally occurring blood clot inhibitors of differing tick species suggests an evolutionary divergence approximately 100 million years ago.”

Prof Becker and his international colleagues discovered DTIs in tick salivary transcriptomes and optimised them for pharmaceutical use. The most potent targets a key regulating enzyme in blood clot formation with very high specificity and a binding capacity almost 500 times that of bivalirudin, a drug used during a common nonsurgical procedure to treat narrowing of the coronary arteries. These minimally invasive procedures are performed in roughly 1 million people yearly in the United States.

“Despite their greater ability to reduce the incidence of the formation of blood clots, the drugs demonstrated less bleeding, achieving a wider therapeutic index in nonhuman models,” Becker says. “The higher potency of the drug means it’s not necessary to use a lot of it in treating patients, which holds the cost of goods and manufacturing down.”

According to Prof Becker, tick saliva, like that of other blood-feeding arthropods such as mosquitoes, contains pharmacologically and immunologically active compounds, which modulate immune responses and induce antibody production. This research leveraged an understanding of tick-host interactions and antibody formation.

“The holy grail of anticoagulant therapy has always been specificity, selectivity, efficacy and safety,” said Prof Becker. “Clinician-scientists must have the training and an environment that embraces asking questions and finding solutions, including those potentially found deep within nature. An ability to both measure and adjust the drug dose and rapidly reverse its effects is particularly important for safety purposes. The next step is to complete pharmacology, toxicology, drug stability and other important regulatory steps before conducting clinical trials in humans.”

Source: University of Cincinnati

New Recommendations for Earlier Breast Cancer MRI Screening

This screening MRI detected a very small cancer (circled) in the patient’s breast.
Credit: Dr Kathryn Lowry

Annual MRI screenings starting at ages 30 to 35 may slash breast-cancer mortality by more than 50% among women with genetic changes in three genes, according to a study published in JAMA Oncology.

The pathogenic variants are in the ATM, CHEK2 and PALB2 genes – which collectively are as prevalent as the much-reported BRCA1/2 gene mutations. The study authors state that their findings support earlier MRI screening in these women.

“Screening guidelines have been difficult to develop for these women because there haven’t been clinical trials to inform when to start and how to screen,” said lead author Dr Kathryn Lowry.

The work was a collaboration of the Cancer Intervention and Surveillance Modeling Network (CISNET), the Cancer Risk Estimates Related to Susceptibility (CARRIERS) consortium, and the Breast Cancer Surveillance Consortium.

To arrive at their model, the researchers input age-specific risk estimates from CARRIERS, which involved some 64 000 women, along with recently published data on screening performance.

“For women with pathogenic variants in these genes, our modeling analysis predicted a lifetime risk of developing breast cancer at 21% to 40%, depending on the variant,” Dr Lowry said. “We project that starting annual MRI screening at age 30 to 35, with annual mammography starting at age 40, will reduce cancer mortality for these populations of women by more than 50%.”

The simulations compared the combined performance of mammography and MRI against mammography alone, and projected that annual MRI conferred significant additional benefit to these populations.

“We also found that starting mammograms earlier than age 40 did not have a meaningful benefit but increased false-positive screens,” Dr Lowry added.

Results from CISNET models have informed past guidelines, including the 2009 and 2016 U.S. Preventive Services Task Force recommendations for breast cancer screening in average-risk women.

“Modelling is a powerful tool to synthesise and extend clinical trial and national cohort data to estimate the benefits and harms of different cancer control strategies at population levels,” said senior author Dr Jeanne Mandelblatt.

The study projected about four false-positive screening results and one to two benign biopsies per woman over a 40-year screening span, the authors noted.

To get any benefit from genetic susceptibility-based screening guidelines, a woman would have to know beforehand that she carries one of these variants, yet most often a genetic test panel is done only after a cancer diagnosis – too late for any benefit.

“People understand very well the value of testing for variants in BRCA1 and BRCA2, the most common breast cancer predisposition genes. These results show that testing other genes, like ATM, CHEK2, and PALB2, can also lead to improved outcomes,” said senior author Dr Mark Robson.

The researchers hope their analysis will aid the National Comprehensive Cancer Network (NCCN), the American Cancer Society and other organizations that issue guidance for medical oncologists and radiologists.

“Overall what we’re proposing is slightly earlier screening than what the current guidelines suggest for some women with these variants,” said senior author Professor Allison Kurian. “For example, current NCCN guidelines recommend starting at age 30 for women with PALB2, and at 40 for ATM and CHEK2. Our results suggest that starting MRI at age 30 to 35 appears beneficial for women with any of the three variants.”

Source: University of Washington

Researchers Halt Aspirin Trial to Prevent Breast Cancer Recurrence

Source: National Cancer Institute

A large randomised trial was halted after preliminary analysis found that taking aspirin after treatment for breast cancer did not reduce the risk of disease recurrence.

Laboratory studies had previously shown that aspirin and other nonsteroidal anti-inflammatory drugs (NSAIDs) reduce breast cancer growth and invasion. NSAIDs display anticancer activity through inhibition of the COX-2 enzyme, triggering processes such as apoptosis, reduced proliferation and inhibition of carcinogenesis. Several observational studies have also shown a reduced risk of breast cancer mortality among regular aspirin users.

Patients who took aspirin for a median of 18 months had a 25% higher risk of invasive recurrence, although this was not statistically different from placebo (P = 0.1258). The aspirin group had an excess of all disease-related events, including death, local and distant recurrence/progression, and new primary tumours.

The results are in line with similar trials that ended while the Aspirin after Breast Cancer (ABC) trial was ongoing, Wendy Y. Chen, MD, of Dana-Farber Cancer Institute in Boston, said during a presentation at the American Society of Clinical Oncology (ASCO) Plenary Series.

“In this double-blind, placebo-controlled randomised trial, there was no benefit of aspirin 300 milligrams daily in terms of breast cancer invasive disease-free survival,” reported Dr Chen. “Although follow-up was short, the futility bound was clearly crossed. We had reached 50% of the events, and there was a numerically higher number of events in the aspirin arm. Therefore, it was unlikely that even with further follow-up there would be any benefit associated with aspirin.”

“Although inflammation may still play a key role in cancer, it’s important to remember that aspirin may have different effects in other cancers, such as colon, or in different settings, such as primary versus secondary prevention,” she added.

Though the trial was well designed, enrolled the right population and used adequate dosing, it was stopped early for futility, commented Angela DeMichele, MD, of the Abramson Cancer Center at the University of Pennsylvania.

“The direction and magnitude [of the difference in events] highly preclude the possibility that there would have been a benefit with more follow-up,” said Dr DeMichele. “Although it was not statistically significant, we cannot rule out the possibility of a potential increase in breast cancer recurrence from the use of aspirin.”

“For patients and providers at this time, aspirin should not be used simply to prevent breast cancer recurrence,” she continued. “For those situations in which there are other options, decisions about aspirin use for other indications should definitely include an individualised risk/benefit discussion between physician and patient.”

The results underscore the need for prospective, randomised clinical trials to validate the effects of interventions from observational studies, she concluded.

The ABC trial involved patients under 70 with HER2-negative, high-risk breast cancer. The study randomised 3021 participants to 300 mg of aspirin daily or matching placebo for 5 years, with the primary endpoint being invasive disease-free survival. 

Dr Chen further noted that three clinical trials of aspirin or NSAID treatment ended while the ABC trial was ongoing. The Canadian-led MA.27 trial of an aromatase inhibitor plus celecoxib ended due to toxicity in the celecoxib arm. The randomised REACT trial of celecoxib in HER2-negative breast cancer showed no difference in disease-free survival after more than 6 years of follow-up.

The ASPREE trial tested low-dose aspirin on all-cause mortality in healthy older patients, and results showed a trend to increased all-cause mortality and significantly higher cancer mortality in the aspirin arm. 

During the post-presentation discussion, an audience member asked whether the results definitively ruled out a late benefit of aspirin, given that most patients had HR-positive disease wherein late relapse is not uncommon.

“It’s always frustrating when a study is closed early, and it was done in this case after we had reached 50% of the expected events,” said Chen. “There was an increase [in clinical events]. Not a statistically significant increase, but it was bordering on statistical significance. In order for aspirin to have a benefit, it would mean that in the second half, there would need to be a significantly decreased risk. It would basically need to flip and that would be biologically difficult to imagine.”

“I think it’s fair to say that this study doesn’t say definitively that there’s harm, but as for the likelihood of a benefit of aspirin, that would be extremely unlikely,” she said.

Source: MedPage Today

In MS, Twin Study Reveals Disease-causing T Cells

Source: Pixabay CC0

By studying the immune systems of pairs of monozygotic twins to rule out genetic influences in cases of multiple sclerosis, researchers may have discovered a smoking gun: precursor cells of the disease-causing T cells.

Multiple sclerosis (MS) is a chronic inflammatory disease of the central nervous system (CNS) and the most common cause of neurological impairment in young adults. In MS, the patient’s own immune system attacks the CNS, resulting in cumulative neurological damage. The cause of MS is still unclear, but a variety of genetic risk factors and environmental influences have already been linked to the disease.

Genetics have already been found to be a necessary condition for developing multiple sclerosis. “Based on our study, we were able to show that about half of the composition of our immune system is determined by genetics,” said Florian Ingelfinger, a PhD candidate at the UZH Institute of Experimental Immunology. The study shows that these genetic influences, while always present in MS patients, are not on their own sufficient to trigger the disease. The researchers examined 61 pairs of monozygotic twins in which one twin is affected by MS while the co-twin is healthy. From a genetic point of view, the twins were thus identical. “Although the healthy twins also had the maximum genetic risk for MS, they showed no clinical signs of the disease,” said Lisa Ann Gerdes.

With this cohort of twins, the researchers were able to tease out environmental differences. “We are exploring the central question of how the immune system of two genetically identical individuals leads to significant inflammation and massive nerve damage in one case, and no damage at all in the other,” explained Professor Burkhard Becher, leader of the research team. Using identical twins let the researchers block out the genetic influence and focus on the immune system changes that were ultimately responsible for triggering MS in one twin.

The researchers harnessed state-of-the-art technologies to describe the immune profiles of the twin pairs in great detail. “We use a combination of mass cytometry and the latest methods in genetics paired with machine learning to not only identify characteristic proteins in the immune cells of the sick twin in each case, but also to decode the totality of all the genes that are switched on in these cells,” Florian Ingelfinger explained. 

“Surprisingly, we found the biggest differences in the immune profiles of MS-affected twins to be in the cytokine receptors, ie the way immune cells communicate with one another. The cytokine network is like the language of the immune system,” said Ingelfinger. Increased sensitivity to certain cytokines leads to greater T cell activation in the bloodstreams of patients with multiple sclerosis. These T cells are more likely to migrate into the CNS and cause damage there. The identified cells were found to have the characteristics of recently activated cells, which were in the process of developing into fully functional T cells. “We may have discovered the cellular big bang of MS here – precursor cells that give rise to disease-causing T cells,” said Prof Becher.

“The findings of this study are particularly valuable in comparison to previous studies of MS which do not control for genetic predisposition,” said Prof Becher. “We are thus able to find out which part of the immune dysfunction in MS is influenced by genetic components and which by environmental factors. This is of fundamental importance in understanding the development of the disease.”

The study findings were reported in Nature.

Source: University of Zurich

Regenerating Bone with Messenger RNA

Photo by Cottonbro on Pexels

Researchers have developed a new way to regenerate bone using messenger RNA, which promises to be less expensive and to have fewer side effects than the current treatment.

Although fractures normally heal, there are several circumstances in which bone will not regenerate. When bone does not regenerate, major clinical problems can result, including amputation.

One treatment is recombinant human bone morphogenetic protein-2, or BMP-2. However, it is expensive and only moderately effective. It also produces side effects, which can be severe.

Researchers at Mayo Clinic, along with colleagues in the Netherlands and Germany, may have a viable, less risky alternative: messenger RNA. 

A study conducted on rats and published in Science Advances shows that messenger RNA can be used at low doses to regenerate bone – and without side effects. The quality and biomechanical properties of the resulting new bone are also superior to those achieved with BMP-2. Additionally, messenger RNA is a good choice for bone regeneration because it may not need repeat administrations.

Human bone develops in one of two ways: direct formation of bone cells from mesenchymal progenitor cells, or through endochondral ossification, in which cartilage forms first and then converts to bone. The BMP-2 therapy uses the former method, and the messenger RNA approach uses the latter. In general, the researchers say their work proves that this method “can heal large, critical-sized, segmental osseous defects of long bones in a superior fashion to its recombinant protein counterpart.”

Further studies are required in larger animals than rats before any translation can be considered for clinical trials.

Source: Mayo Clinic

A New Easy-to-Apply Antimicrobial Coating

Image by Quicknews

Researchers have developed an inexpensive, non-toxic coating for almost any fabric that decreases the infectivity of SARS-CoV-2 by up to 90%. It could even be developed to be applied to fabric by almost anyone.

“When you’re walking into a hospital, you want to know that pillow you’re putting your head onto is clean,” said lead author Taylor Wright, a doctoral student at the University of British Columbia. “This coating could take a little bit of the worry off frontline workers to have Personal Protection Equipment with antimicrobial properties.”

Researchers soaked fabric in a solution of an antimicrobial polymer which contains a molecule that releases reactive oxygen species when light shines on it. They then used UV light to turn this solution to a solid, fixing the coating to the fabric. “This coating has both passive and active antimicrobial properties, killing microbes immediately upon contact, which is then amped up when sunlight hits the cloth,” said senior author Professor Michael Wolf.

Both components are safe for human use, and the entire process takes about one hour at room temperature, said Wright. It also makes the fabric hydrophobic, without sacrificing fabric strength. The researchers detailed their study in American Chemical Society Applied Materials & Interfaces.

The coating can also be used on almost any fabric, with applications in hospital fabrics, masks, and activewear. While other such technologies can involve chemical waste, high energy use, or expensive equipment, the UBC method is relatively easy and inexpensive, said Wright. “All we need is a beaker and a light bulb. I’m fairly certain I could do the whole process on a stove.”

To test the coating’s antimicrobial properties, the researchers bathed treated fabric in bacterial soups of Escherichia coli and methicillin-resistant Staphylococcus aureus (MRSA). They found that 85% of viable E. coli bacteria remained after 30 minutes, which fell to 3% when the treated cloth was exposed to green light for the same amount of time. Similarly, 95% of viable MRSA bacteria remained, dropping to 35% under green light. No bacteria remained after four hours.

While sunlight and fluorescent lights have less green in their spectra, the team expects similar but less intense results for fabric exposed to those light sources, said Wright. “Particularly in the Pacific Northwest, it’s not always a sunny day. So, at all times you’re going to have that layer of passive protection and when you need that extra layer of protection, you can step into a lit room, or place the fabric in a room with a green light bulb – which can be found for about $35 online.”

The researchers also looked into whether the coating reduced the infectivity of SARS-CoV-2 by bathing treated fabric in a solution of the virus particles and then adding that solution to living cells to see if they could infect them. They found the passive properties were ineffective against the virus, but when treated fabric was exposed to green light for two hours, there was up to a 90% drop in the virus’ infectivity. “In other words, only one tenth of the amount of virus signal was detected on cells infected with the UV-fabric and light treated virus”, says co-author Professor François Jean.

The team found they needed an 18 cm² piece of fabric to kill microbes when the material contained 7% of the active ingredient by weight, but that increasing this to 23% made the fabric effective with four times less material, said Wright.

Researchers also found that keeping the fabric under green light for more than 24 hours failed to produce the sterilising forms of oxygen, highlighting an area for further study. This is a similar effect to the colour fading on clothing after being exposed to sunlight for too long.

“Biomanufacturing face masks based on this new UBC technology would represent an important addition to our arsenal in the fight against COVID, in particular for highly transmissible SARS-CoV-2 variants of concern such as Omicron”, said Prof Jean. The coating can also be used for activewear, with an ‘anti-stink’ coating applied to areas where people tend to sweat, killing off the bacteria that makes us smell. Indeed, hospital fabric and activewear companies are already interested in applying the technology, and the university has applied for a patent in the United States, said Prof Wolf.

Source: University of British Columbia

‘A-Maize-ing’ Nanoparticles Target Cancer Cells Directly

Computer-generated depiction of nanoparticles

Researchers have recently developed novel nanoparticles derived from maize that can target cancer cells directly, via an immune mechanism. The results of this study, published in Scientific Reports, are encouraging, and the technique has demonstrated efficacy in treating tumour-bearing laboratory mice with no adverse effects.

Nanoparticles, or particles whose size varies between 1 and 100nm, have shown tremendous potential in many areas of science and technology, including therapeutics. However, conventional, synthetic nanoparticles are complicated and expensive to produce and alternatives such as extracellular vesicles (EVs) have mass production challenges.

Another recently emerging option is that of plant-derived nanoparticles (NPs), which can be easily produced in high quantities at relatively low cost. Like EVs, these nanoparticle-based systems also contain bioactive molecules, including polyphenols (which are known antioxidants) and microRNA, and they can serve as vehicles for targeted drug delivery.

Recently, researchers from the Tokyo University of Science (TUS) developed anti-cancer bionanoparticles, using corn (maize) as the raw material.

Lead researcher Professor Makiya Nishikawa explained: “By controlling the physicochemical properties of nanoparticles, we can control their pharmacokinetics in the body; so, we wanted to explore the nanoparticulation of edible plants. Maize, or corn, is produced in large quantities worldwide in its native form as well as in its genetically modified forms. That is why we selected it for our study.”

The team centrifuged super-sweet corn juice, filtered it through a syringe filter with a 0.45μm pore size, and then ultracentrifuged the filtrate to obtain corn-derived NPs. These corn-derived NPs (cNPs) were approximately 80nm in diameter, with a small net negative charge of −17mV.

The research team then set up experiments to see whether these cNPs were taken up by various types of cells. In a series of promising results, the cNPs were taken up by multiple cell types, including the clinically relevant colon26 tumour cells (cancer cells derived from mice), RAW264.7 macrophage-like cells, and normal NIH3T3 cells. RAW264.7 cells are commonly used as in vitro screens for immunomodulators.

The results were astounding: of the three types of cells, cNPs only significantly inhibited the growth of colon26 cells, indicating their selectivity for carcinogenic cell lines. Moreover, cNPs were able to successfully induce the release of tumour necrosis factor-α (TNF-α) from RAW264.7 cells. TNFα is primarily secreted by macrophages, natural killer cells, and lymphocytes, which help mount an anticancer response. “The strong TNFα response was encouraging and indicated the role of cNPs in treating various types of cancer,” explains Dr. Daisuke Sasaki, first author of the study and an instructor and researcher at TUS.

A luciferase-based assay revealed that the potent combination of cNPs and RAW264.7 cells significantly suppressed the proliferation of colon26 cells. Finally, the research team studied the effect of cNPs on laboratory mice bearing subcutaneous tumours. Once again, the results were astonishing: daily injections of cNPs into colon26 tumours significantly suppressed tumour growth, without causing serious side effects, or weight loss.

“By optimising nanoparticle properties and by combining them with anticancer drugs, we hope to devise safe and efficacious drugs for various cancers,” observed an optimistic Prof Nishikawa.

Source: Tokyo University of Science