Category: Ophthalmology

Tests on Animals Demonstrate that New Eye Drops can Slow Vision Loss

Model of PEDF protein alongside the 17-mer and H105A peptides. Amino acid 105, which is changed from histidine in PEDF and the 17-mer peptide to alanine in the H105A peptide, is shown in green.

Researchers at the National Institutes of Health (NIH) have developed eye drops that extend vision in animal models of a group of inherited diseases that lead to progressive vision loss in humans, known as retinitis pigmentosa. The eye drops contain a small fragment derived from a protein made by the body and found in the eye, known as pigment epithelium-derived factor (PEDF). PEDF helps preserve cells in the eye’s retina. A report on the study is published in Communications Medicine.

“While not a cure, this study shows that PEDF-based eye drops can slow progression of a variety of degenerative retinal diseases in animals, including various types of retinitis pigmentosa and dry age-related macular degeneration (AMD),” said Patricia Becerra, PhD, chief of NIH’s Section on Protein Structure and Function at the National Eye Institute and senior author of the study. “Given these results, we’re excited to begin trials of these eye drops in people.”

All degenerative retinal diseases have cellular stress in common. While the source of the stress may vary—dozens of mutations and gene variants have been linked to retinitis pigmentosa, AMD, and other disorders—high levels of cellular stress cause retinal cells to gradually lose function and die. Progressive loss of photoreceptor cells leads to vision loss and eventually blindness.

Previous research from Becerra’s lab revealed that, in a mouse model, the natural protein PEDF can help retinal cells stave off the effects of cellular stress. However, the full PEDF protein is too large to pass through the outer eye tissues to reach the retina, and the complete protein has multiple functions in retinal tissue, making it impractical as a treatment. To optimize the molecule’s ability to preserve retinal cells and to help the molecule reach the back of the eye, Becerra developed a series of short peptides derived from a region of PEDF that supports cell viability. These small peptides can move through eye tissues to bind with PEDF receptor proteins on the surface of the retina.

In this new study, led by first author Alexandra Bernardo-Colón, Becerra’s team created two eye drop formulations, each containing a short peptide. The first peptide candidate, called “17-mer,” contains 17 amino acids found in the active region of PEDF. A second peptide, H105A, is similar but binds more strongly to the PEDF receptor. Peptides applied to mice as drops on the eye’s surface were found in high concentration in the retina within 60 minutes, slowly decreasing over the next 24 to 48 hours. Neither peptide caused toxicity or other side effects.

When administered once daily to young mice with retinitis pigmentosa-like disease, H105A slowed photoreceptor degeneration and vision loss. To test the drops, the investigators used specially bred mice that lose their photoreceptors shortly after birth. Once cell loss begins, the majority of photoreceptors die within a week. When given peptide eye drops throughout that one-week period, mice retained up to 75% of their photoreceptors and continued to have strong retinal responses to light, while those given a placebo had few remaining photoreceptors and little functional vision at the end of the week.

“For the first time, we show that eye drops containing these short peptides can pass into the eye and have a therapeutic effect on the retina,” said Bernardo-Colón. “Animals given the H105A peptide have dramatically healthier-looking retinas, with no negative side effects.”

A variety of gene-specific therapies are under development for many types of retinitis pigmentosa, which generally start in childhood and progress over many years. These PEDF-derived peptide eye drops could play a crucial role in preserving cells while waiting for these gene therapies to become clinically available.

To test whether photoreceptors preserved by the eye drop treatment are healthy enough for gene therapy to work, collaborators Valeria Marigo, PhD, and Andrea Bighinati, PhD, of the University of Modena, Italy, treated mice with gene therapy at the end of the week-long eye drop regimen. The gene therapy successfully preserved vision for at least an additional six months.

To see whether the eye drops could work in humans without testing them in people directly, the researchers worked with Natalia Vergara, PhD, University of Colorado Anschutz, Aurora, to test the peptides in a human retinal tissue model of retinal degeneration. Grown in a dish from human cells, the retina-like tissues were exposed to chemicals that induced high levels of cellular stress. Without the peptides, the cells of the tissue model died quickly; with the peptides, the retinal tissues remained viable. These human tissue data provide a key first step supporting human trials of the eye drops.

Source: NIH/National Eye Institute

The Pupil as a Window into the Sleeping Brain

The eye of the sleeping subject was kept open with a special fixation device to record the pupil movements for several hours.  (Image: Neural Control of Movement Lab / ETH Zurich)

For the first time, researchers have been able to observe how the pupils react during sleep over a period of several hours. A look under the eyelids showed them that more happens in the brain during sleep than was previously assumed.

While eyes are typically closed in sleep, there is a flurry of activity taking place beneath the eyelids: a team of researchers, led by principal investigators Caroline Lustenberger, Sarah Meissner and Nicole Wenderoth from the Neural Control of Movement Lab at ETH Zurich, has observed that the size of the pupil fluctuates constantly during sleep. As they report in Nature Communications, sometimes it increases in size, sometimes it decreases; sometimes these changes occur within seconds, other times over the course of several minutes.

“These dynamics reflect the state of arousal, or the level of brain activation in regions that are responsible for sleep-wake regulation,” says Lustenberger. “These observations contradict the previous assumption that, essentially, the level of arousal during sleep is low.”

Instead, these fluctuations in pupil size show that even during sleep, the brain is constantly switching between higher and lower levels of activation. These new findings also confirm in humans what other research groups have recently discovered in studies on rodents, which likewise exhibit slow fluctuations in the activation level (known in the field as arousal).

New method for an old mystery

The brain regions that control the activation level are situated deep within the brainstem, which has made it difficult to measure these processes directly in humans during sleep. Existing methods are technically demanding and have not yet been established in this context. The ETH researchers’ study therefore relies on pupil measurements. Pupil size is known to indicate the activation level when a person is awake, and it can therefore serve as a marker for activity in regions situated deeper within the brain.

The ETH researchers developed a new method for examining the changes in people’s pupils while asleep: using a special adhesive technique and a transparent plaster, they were able to keep the eyes of the test subjects open for several hours.

“Our main concern was that the test subjects would be unable to sleep with their eyes open. But in a dark room, most people forget that their eyes are still open and they are able to sleep,” explains the study’s lead author, Manuel Carro Domínguez, who developed the technique.

Analysis of the data showed that pupil dynamics is related not just to the different stages of sleep, but also to specific patterns of brain activity, such as sleep spindles and pronounced deep sleep waves – brain waves that are important for memory consolidation and sleep stability. The researchers also discovered that the brain reacts to sounds with varying degrees of intensity, depending on the level of activation, which is reflected in the size of the pupil.

A central regulator of the activation level is a small region in the brainstem, known as the locus coeruleus. In animals, scientists have been able to show that this is important for the regulation of sleep stages and waking. The ETH researchers were unable to prove in this study whether the locus coeruleus is indeed directly responsible for pupil changes. “We are simply observing pupil changes that are related to the level of brain activation and heart activity,” Lustenberger explains.

In a follow-up study, the researchers will attempt to influence the activity of the locus coeruleus using medication, so that they can investigate how this affects pupil dynamics. They hope to discover whether this region of the brain is in fact responsible for controlling the pupils during sleep, and how changes in the level of activation affect sleep and its functions.

Using pupillary dynamics to diagnose illnesses

Understanding pupil dynamics during sleep could also provide important insights for the diagnosis and treatment of sleep disorders and other illnesses. The researchers therefore want to investigate whether pupil changes during sleep can provide indications of dysfunctions of the arousal system. These include disorders such as insomnia, post-traumatic stress disorder and possibly Alzheimer’s. “These are just hypotheses that we want to investigate in the future,” says Lustenberger.

Another goal is to make the technology usable outside of sleep laboratories, such as in hospitals, where it could help to monitor waking in coma patients or to diagnose sleep disorders more accurately. The pupil as a window onto the brain could thus pave the way for new opportunities in sleep medicine and neuroscience.

Source: ETH Zurich

Novel Stem Cell Therapy Repairs Irreversible Corneal Damage in Clinical Trial

Photo by Victor Freitas on Pexels

An expanded clinical trial that tested a ground-breaking, experimental stem cell treatment for blinding cornea injuries found the treatment was feasible and safe in 14 patients who were treated and followed for 18 months, and there was a high proportion of complete or partial success. The results of this new phase 1/2 trial are published in Nature Communications.

The treatment, called cultivated autologous limbal epithelial cells (CALEC), was developed at Mass Eye and Ear, a member of the Mass General Brigham healthcare system. The innovative procedure consists of removing stem cells from a healthy eye with a biopsy, expanding them into a cellular tissue graft in a novel manufacturing process that takes two to three weeks, and then surgically transplanting the graft into the eye with a damaged cornea.

“Our first trial in four patients showed that CALEC was safe and the treatment was possible,” said principal investigator Ula Jurkunas, MD, associate director of the Cornea Service at Mass Eye and Ear and professor of Ophthalmology at Harvard Medical School. “Now we have this new data supporting that CALEC is more than 90% effective at restoring the cornea’s surface, which makes a meaningful difference in individuals with cornea damage that was considered untreatable.”

Researchers showed that CALEC completely restored the cornea in 50% of participants at their 3-month visit, and the rate of complete success increased to 79% and 77% at the 12- and 18-month visits, respectively.

With two participants meeting the definition of partial success at 12 and 18 months, the overall success of CALEC was 93% and 92% at 12 and 18 months.  Three participants received a second CALEC transplant, one of whom reached complete success by the study end visit. An additional analysis of CALEC’s impact on vision showed varying levels of improvement of visual acuity in all 14 CALEC patients.

CALEC displayed a high safety profile, with no serious events occurring in either the donor or recipient eyes. One adverse event, a bacterial infection, occurred in one participant eight months after the transplant, attributed to chronic contact lens use. Other adverse events were minor and resolved quickly following the procedures.

CALEC remains an experimental procedure and is currently not offered at Mass Eye and Ear or any U.S. hospital, and additional studies will be needed before the treatment is submitted for federal approval.

The cornea is the clear, outermost layer of the eye. Its outer border, the limbus, contains a large reserve of healthy stem cells called limbal epithelial cells, which maintain the eye’s smooth surface. A cornea injury, such as a chemical burn, infection or other trauma, can deplete the limbal epithelial cells, which never regenerate. The resulting limbal stem cell deficiency leaves the eye with a permanently damaged surface that cannot support a corneal transplant, the current standard of care for vision rehabilitation. People with these injuries often experience persistent pain and visual difficulties.

This need led Jurkunas, then a junior scientist, and Reza Dana, MD, director of the Cornea Service at Mass Eye and Ear, to explore a new approach for regenerating limbal epithelial cells. Nearly two decades later, following preclinical studies and collaborations with researchers at Dana-Farber Cancer Institute and Boston Children’s Hospital, it became possible to consistently manufacture CALEC grafts that met the stringent quality criteria needed for human transplantation.

Because CALEC is an autologous therapy, one limitation of the approach is that the patient must have only one affected eye, so that a biopsy of the unaffected, healthy eye can provide the starting material.

“Our future hope is to set up an allogeneic manufacturing process starting with limbal stem cells from a normal cadaveric donor eye,” said study co-author Jerome Ritz, MD, of Dana-Farber Cancer Institute. “This will hopefully expand the use of this approach and make it possible to treat patients who have damage to both eyes.”

Source: Mass Eye and Ear

All in the Eyes: High Resolution Retinal Maps Aid Disease Diagnoses

Photoreceptor cells in the retina. Credit: Scientific Animations

Researchers have conducted one of the largest eye studies in the world to reveal new insights into retinal thickness, highlighting its potential in the early detection of diseases like type 2 diabetes, dementia and multiple sclerosis.

The WEHI-led study used cutting-edge artificial intelligence technology to analyse over 50 000 eyes from the UK Biobank, producing maps of the retina in unprecedented detail to better understand how retinal differences are linked to various diseases.

The findings, published in Nature Communications, open up new possibilities for using routine eyecare imaging as a tool to screen for and manage diseases, much as mammograms are used to screen for breast cancer.

Unlocking a window into the brain

The retina is part of the central nervous system, which also comprises the brain and spinal cord. Many diseases are linked to degeneration or disruption of this critical system, including neurodegenerative conditions such as dementia and metabolic disorders like diabetes.

Globally, neurological conditions alone are one of the leading causes of disability and illness, with over 3 billion people, or 43% of the world’s population, living with a brain-related condition.

Lead researcher, WEHI’s Dr Vicki Jackson, said the findings broaden the horizons for using retinal imaging as a doorway into the central nervous system, to help manage disease.

“We’ve shown that retinal imaging can act as a window to the brain, by detecting associations with neurological disorders like multiple sclerosis and many other conditions,” said Dr Jackson, a statistician and gene expert.

“Our maps’ fine-scale measurements reveal critical new details about connections between retinal thinning and a range of common conditions.”

The study also identified new genetic factors that influence retinal thickness, which are likely to play a role in the growth and development of a person’s retina.

“This research underscores the potential for retinal thickness to act as a diagnostic biomarker to aid in detecting and tracking the progression of numerous diseases. We can now pinpoint specific locations of the retina which show key changes in some diseases.”

The international research team, led by WEHI, applied AI methods to population-scale retinal imaging data and compared them with information about each person’s genetics and health to reveal unprecedented links to disease.

The analysis produced 50 000 maps with measurements at over 29 000 locations across the retina, identifying associations between retinal thinning and 294 genes that play an important role in disease.

AI fast-tracking the diagnostic future

Study lead and bioinformatician, Professor Melanie Bahlo AM, said past studies had indicated correlations between retinal thickness and disease, but her team’s AI-powered discoveries shed deeper light on the complex spatial anatomy of the retina and its role in disease.

“Technologies like AI fuel discovery, and when fused with brilliant minds, there is an extraordinary ability to transform big population data into far-reaching insights,” Prof Bahlo, a lab head at WEHI, said.

“There has never been a time in history where this powerful combination — technology, big data and brilliant minds — has come together to advance human health.”

The research reinforces the growing field of oculomics (using the eye to diagnose health conditions) as an emerging, powerful and non-invasive approach for predicting and diagnosing diseases.

Source: Walter and Eliza Hall Institute

Study Shows Effectiveness of Method to Stem Myopia

Photo by Ksenia Chernaya

Capping ten years of work to stem the tide of myopia, David Berntsen, Professor of Optometry at the University of Houston, is reporting that his team’s method to slow myopia not only works – but lasts.

The original Bifocal Lenses In Nearsighted Kids (BLINK) Study showed that having children with myopia wear high-add power multifocal contact lenses slows its progression. Now, new results from the BLINK2 Study, which continued to follow these children, show that the benefit persists even after the lenses are no longer used.

“We found that one year after discontinuing treatment with high-add power soft multifocal contact lenses in older teenagers, myopia progression returns to normal with no loss of treatment benefit,” reports Berntsen in JAMA Ophthalmology.

The study was funded by the National Institutes of Health’s National Eye Institute with collaborators from the Ohio State University College of Optometry.

In Focus: A Major Issue

Leading the team at the University of Houston, Berntsen takes on a significant challenge. By 2050 almost 50% of the world (5 billion people) will be myopic. Myopia is associated with an increased risk of long-term eye health problems that affect vision and can even lead to blindness.

From the initial study, high-add multifocal contact lenses were found to be effective at slowing the rate of eye growth, decreasing how myopic children became. Because higher amounts of myopia are associated with vision-threatening eye diseases later in life, like retinal detachment and glaucoma, controlling its progression during childhood potentially offers an additional future benefit.

“There has been concern that the eye might grow faster than normal when myopia control contact lenses are discontinued. Our findings show that when older teenagers stop wearing these myopia control lenses, the eye returns to the age-expected rate of growth,” said Berntsen.

“These follow-on results from the BLINK2 Study show that the treatment benefit of myopia control contact lenses is durable after they are discontinued at an older age,” said BLINK2 study chair, Jeffrey J. Walline, associate dean for research at the Ohio State University College of Optometry.

Eye Science

Myopia occurs when a child’s developing eyes grow too long from front to back. Instead of focusing images directly on the retina, they are focused at a point in front of the retina.

Single vision prescription glasses and contact lenses can correct myopic vision, but they fail to treat the underlying problem, which is the eye continuing to grow longer than normal. By contrast, soft multifocal contact lenses correct myopic vision in children while simultaneously slowing myopia progression by slowing eye growth.

Designed like a bullseye, multifocal contact lenses focus light in two basic ways. The centre portion of the lens corrects nearsightedness so that distance vision is clear, and it focuses light directly on the retina. The outer portion of the lens adds focusing power to bring the peripheral light into focus in front of the retina. Animal studies show that bringing light to focus in front of the retina may slow growth. The higher the reading power, the further in front of the retina it focuses peripheral light.
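
As a rough illustration of that last point, the sketch below uses a simplified reduced-eye model (a single refracting surface of about 60 D total power and an internal refractive index of about 1.336, both generic textbook values rather than figures from the BLINK studies) to estimate how far in front of the retina each add power brings peripheral light to a focus. It is a back-of-the-envelope approximation, not the optical model used by the investigators.

```python
# Simplified reduced-eye sketch (illustrative assumptions, not the BLINK optical model):
# a single refracting surface of ~60 D total power and an internal refractive index
# of ~1.336 gives a posterior focal length f' = n'/F.

EYE_POWER_D = 60.0   # assumed total refractive power of the eye, in diopters
N_IMAGE = 1.336      # assumed refractive index of the eye's internal media

def focal_plane_mm(power_d: float) -> float:
    """Posterior focal length of the reduced eye, in millimetres."""
    return N_IMAGE / power_d * 1000.0

baseline = focal_plane_mm(EYE_POWER_D)
for add_power in (1.50, 2.50):  # medium-add and high-add lens powers from the study
    shift = baseline - focal_plane_mm(EYE_POWER_D + add_power)
    print(f"+{add_power:.2f} D add: peripheral focus ~{shift:.2f} mm in front of the retina")
```

Under these assumptions, the +2.50 D add shifts the peripheral focus forward by roughly 0.9 mm, compared with roughly 0.5 mm for the +1.50 D add, consistent with the idea that higher add powers produce more peripheral myopic defocus.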

BLINK Once…Then Twice

In the original BLINK study, 294 myopic children, ages 7 to 11 years, were randomly assigned to wear single vision contact lenses or multifocal lenses with either high-add power (+2.50 diopters) or medium-add power (+1.50 diopters). They wore the lenses during the day as often as they could comfortably do so for three years. All participants were seen at clinics at the Ohio State University, Columbus, or at the University of Houston.

After three years in the original BLINK study, children in the high-add multifocal contact lens group had shorter eyes compared to the medium-add power and single-vision groups, and they also had the slowest rate of myopia progression and eye growth.

Of the original BLINK participants, 248 continued in BLINK2, during which all participants wore high-add (+2.50 diopters) lenses for two years, followed by single-vision contact lenses for the third year of the study to see if the benefit remained after discontinuing treatment.

At the end of BLINK2, axial eye growth returned to age-expected rates. While there was a small increase in eye growth of 0.03 mm/year across all age groups after discontinuing multifocal lenses, it is important to note that the overall rate of eye growth was no different than the age-expected rate. There was no evidence of faster than normal eye growth.

Participants who had been in the original BLINK high-add multifocal treatment group continued to have shorter eyes and less myopia at the end of BLINK2. Children who were switched to high-add multifocal contact lenses for the first time during BLINK2 did not catch up to those who had worn high-add lenses since the start of the BLINK Study when they were 7 to 11 years of age.

By contrast, studies of other myopia treatments, such as atropine drops and orthokeratology lenses that are designed to temporarily reshape the eye’s outermost corneal layer, showed a rebound effect (faster than age-normal eye growth) after treatment was discontinued.

“Our findings suggest that it’s a reasonable strategy to fit children with multifocal contact lenses for myopia control at a younger age and continue treatment until the late teenage years when myopia progression has slowed,” said Berntsen.

Source: University of Houston

New Potential Treatment for Inherited Blinding Disease Retinitis Pigmentosa

Researchers used a computer screening approach to identify two compounds that could help prevent vision loss in people with a genetic eye disease

Photoreceptor cells in the retina. Credit: Scientific Animations

Two new compounds may be able to treat retinitis pigmentosa, a group of inherited eye diseases that cause blindness. The compounds, described in a study published January 14th in the open-access journal PLOS Biology by Beata Jastrzebska from Case Western Reserve University, US, and colleagues, were identified using a virtual screening approach.

In retinitis pigmentosa, the retinal protein rhodopsin is often misfolded due to genetic mutations, causing retinal cells to die off and leading to progressive blindness. Small molecules that correct rhodopsin folding are urgently needed to treat the estimated 100 000 people in the United States with the disease. Current experimental treatments include retinoid compounds, such as synthetic vitamin A derivatives, but these have significant drawbacks: they are sensitive to light and can be toxic.

In the new study, the researchers used virtual screening to search for new drug-like molecules that bind to and stabilise the structure of rhodopsin, improving its folding and its movement through the cell. Two non-retinoid compounds were identified that met these criteria and could cross the blood-brain and blood-retina barriers. The team tested the compounds in the lab and showed that they improved cell surface expression of rhodopsin in 36 of 123 genetic subtypes of retinitis pigmentosa, including the most common one. The compounds also protected against retinal degeneration in mice with retinitis pigmentosa.

“Importantly, treatment with either compound improved the overall retina health and function in these mice by prolonging the survival of their photoreceptors,” the authors say. However, they note that additional studies of the compounds or related compounds are needed before testing the treatments in humans.

The authors add, “Inherited mutations in the rhodopsin gene cause retinitis pigmentosa (RP), a progressive and currently untreatable blinding disease. This study identifies small molecule pharmacochaperones that suppress the pathogenic effects of various rhodopsin mutants in vitro and slow photoreceptor cell death in a mouse model of RP, offering a potential new therapeutic approach to prevent vision loss.”

Provided by PLOS

Is Paranoia Partly a Visual Problem?

Photo by Stormseeker on Unsplash

Could complex beliefs like paranoia have roots in something as basic as vision? A new Yale study finds evidence that they might. 

When completing a visual perception task, in which participants had to identify whether one moving dot was chasing another moving dot, those with greater tendencies toward paranoid thinking (believing others intend them harm) and teleological thinking (ascribing excessive meaning and purpose to events) performed worse than their counterparts, the study found. Those individuals more often – and confidently – claimed one dot was chasing the other when it wasn’t.

The findings, published in the journal Communications Psychology, suggest that, in the future, testing for illnesses like schizophrenia could be done with a simple eye test.

“We’re really interested in how the mind is organised,” said senior author Philip Corlett, an associate professor of psychiatry at Yale School of Medicine and member of the Wu Tsai Institute. “Chasing or other intentional behaviours are what you might think of as experiences perceived at a very high-level in the brain, that someone might have to reason through and deliberate. In this study, we can see them low down in the brain, in vision, which we think is exciting and interesting – and has implications for how those mechanisms might be relevant for schizophrenia.”

Paranoia and teleological thinking are similar in that they are both misattributions of intention, but paranoia is a negative perception while teleological thinking tends to be positive. Both patterns of thinking are linked to psychosis and schizophrenia.

Hallucinations are associated with psychosis as well and are often about other people, said Corlett, suggesting there may be a social component to these visual misperceptions.

“So we wondered whether there might be something related to social perception – or misperception, what we refer to as social hallucination – that we could measure and that relates to these symptoms of psychosis,” he said.

For the task, participants were shown dots moving on a screen. Sometimes one dot was chasing another; other times there was no chase. Across different trials of the task, participants had to say whether a chase was occurring or not.
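
To make the paradigm concrete, the minimal sketch below generates a chase or no-chase trial as two dots moving frame by frame; the motion parameters (step size, heading noise, frame count) are illustrative assumptions, not the settings of the published Yale task.

```python
import math
import random

# Minimal sketch of a chase vs no-chase dot stimulus. The step size, heading noise
# and frame count are arbitrary illustrative choices, not the parameters of the
# published task.

def generate_trial(chase: bool, n_frames: int = 200, step: float = 0.02,
                   heading_noise: float = 0.6, seed: int = 0):
    """Return per-frame (target, other) positions. If `chase` is True, the second
    dot steers toward the target (with noisy heading); otherwise it wanders randomly."""
    rng = random.Random(seed)
    target = (rng.random(), rng.random())
    other = (rng.random(), rng.random())
    frames = []
    for _ in range(n_frames):
        # The target dot always performs a random walk.
        a = rng.uniform(0.0, 2.0 * math.pi)
        target = (target[0] + step * math.cos(a), target[1] + step * math.sin(a))
        if chase:
            # The chaser heads toward the target, with some angular noise.
            heading = math.atan2(target[1] - other[1], target[0] - other[0])
            heading += rng.uniform(-heading_noise, heading_noise)
        else:
            heading = rng.uniform(0.0, 2.0 * math.pi)
        other = (other[0] + step * math.cos(heading), other[1] + step * math.sin(heading))
        frames.append((target, other))
    return frames

# Example: a chase trial; a display loop would draw both dots on each frame.
trial = generate_trial(chase=True)
print(f"{len(trial)} frames; final target/other positions: {trial[-1]}")
```

In a chase trial the second dot keeps steering toward the target despite the added noise, which is the cue participants must detect.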

Those with higher degrees of paranoia and teleological thinking (as measured through questionnaires) were more likely than others to say with confidence that a chase was happening when one wasn’t. Essentially, they perceived a social interaction that wasn’t occurring.

In additional experiments, the researchers asked participants to identify which dot was doing the chasing and which dot was being chased. In these results, paranoia and teleological thinking began to diverge.

“People with paranoia were particularly bad at detecting which dot was being chased,” said Santiago Castiello, lead author of the study and a postdoctoral researcher in Corlett’s lab. “And people with high teleology were particularly bad at detecting which dot was doing the chasing.”

That these two types of beliefs differed in this way highlights that they are distinct and may have implications for diagnosis or treatment, said the researchers. The connection to vision may also shift thinking around how the brain gives rise to psychotic symptoms.

“Very few people with congenital blindness develop schizophrenia,” said Castiello. “Finding these social hallucinations in vision makes me wonder if schizophrenia is something that develops through errors in how people sample the visual world.”

While there are no immediate therapeutic implications from these findings, deeper understanding of these beliefs could aid in pharmacological treatment development and risk assessment. 

“One thing we’re thinking about now is whether we can find eye tests that predict someone’s risk for psychosis,” said Corlett. “Maybe there is some very quick perceptual task that can identify when someone might need to talk to a clinician.”

Source: Yale University

Why Some Patients Respond Poorly to Wet Macular Degeneration Treatment

Retina showing reticular pseudodrusen. Although they can infrequently appear in individuals with no other apparent pathology, their highest rates of occurrence are in association with age-related macular degeneration (AMD), for which they hold clinical significance by being highly correlated with end-stage disease sub-types, choroidal neovascularisation and geographic atrophy. Credit: National Eye Institute

A new study from researchers at Wilmer Eye Institute, Johns Hopkins Medicine explains not only why some patients with wet age-related macular degeneration (or “wet” AMD) fail to have vision improvement with treatment, but also how an experimental drug could be used with existing wet AMD treatments to save vision.

Wet AMD, one of two kinds of AMD, is a progressive eye condition caused by an overgrowth of blood vessels in the retina. Such blood vessels – caused by an overexpression of a protein known as VEGF that leads to blood vessel growth – then leak fluid or bleed and damage the retina, causing vision loss.

Despite the severe vision loss often experienced by people with wet AMD, less than half of patients treated with monthly eye injections, known as anti-VEGF therapies, show any major vision improvements. Additionally, for those who do benefit with improved vision, most will lose those gains over time.

Now, in the full report published November 4 in the Proceedings of the National Academy of Sciences, the Wilmer-led team of researchers share how such anti-VEGF therapies may actually contribute to lack of vision improvements by triggering the overexpression of a second protein. Known as ANGPTL4, the protein is similar to VEGF, as it can also stimulate overproduction of abnormal blood vessels in the retina.

“We have previously reported that ANGPTL4 was increased in patients who did not respond well to anti-VEGF treatment,” says Akrit Sodhi, MD, PhD, corresponding author and associate professor of ophthalmology at the Johns Hopkins University School of Medicine and the Wilmer Eye Institute. “What we saw in this paper was a paradoxical increase of ANGPTL4 in patients that received anti-VEGF injections – the anti-VEGF therapy itself turned on expression of this protein.”

The team compared VEGF and ANGPTL4 levels in the eye fluid of 52 patients with wet AMD at various stages of anti-VEGF treatment. Prior to anti-VEGF injections, patients with wet AMD had high levels of both ANGPTL4 and VEGF. After treatment, VEGF levels predictably decreased, yet ANGPTL4 levels rose higher, indicating that ANGPTL4 remained active after the injections and that the treatment itself contributed to the increase. Such ANGPTL4 activity can lead to blood vessel overgrowth and a lack of vision improvement.

The team then investigated ways to counter the rise in ANGPTL4 that follows anti-VEGF treatment by testing the experimental drug 32-134D in mice with wet AMD. The drug decreases levels of a third protein, HIF-1, which is known to be involved in wet AMD and diabetic eye disease through its role in switching on VEGF production. The researchers reasoned that the HIF inhibitor 32-134D would have a similar effect on ANGPTL4 following anti-VEGF treatment, since ANGPTL4 production is also turned on by HIF-1.

In mice treated with 32-134D, the team observed a decrease in HIF-1 levels and VEGF, as well as decreased levels of ANGPTL4 and blood vessel overgrowth. Mice treated only with anti-VEGF therapies corroborated the team’s findings in human patients: levels of VEGF were lower, yet ANGPTL4 levels rose, preventing anti-VEGF therapies from fully working to prevent blood vessel growth (and vision loss). Researchers also determined that combining 32-134D with anti-VEGF treatments prevented the increase in HIF-1, VEGF and ANGPTL4. This treatment combination was more effective than either drug alone, showing promise for treating wet AMD.

“This work exposes a way to improve anti-VEGF therapy for all patients and potentially help a subset of patients with wet AMD who still lose vision over time despite treatment,” Sodhi says. “Our hope is that this [project] will further the three goals we have related to wet AMD: make current therapies as effective as possible, identify new therapies, and prevent people from ever getting wet AMD.”

Source: Johns Hopkins Medicine

AI Eye to Eye with Ophthalmologists in Diagnosing Corneal Infections

Photo by Victor Freitas on Pexels

A Birmingham-led study has found that AI-powered models match ophthalmologists in diagnosing infectious keratitis, offering promise for global eye care improvements.

Infectious keratitis (IK) is a leading cause of corneal blindness worldwide. The new study finds that deep learning models identified the infection with accuracy comparable to that of ophthalmologists.

In a meta-analysis study published in eClinicalMedicine, Dr Darren Ting from the University of Birmingham conducted a review with a global team of researchers analysing 35 studies that utilised Deep Learning (DL) models to diagnose infectious keratitis.

AI models in the study matched the diagnostic accuracy of ophthalmologists, exhibiting a sensitivity of 89.2% and specificity of 93.2%, compared to ophthalmologists’ 82.2% sensitivity and 89.6% specificity.
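
For readers less familiar with these metrics, the short sketch below shows how sensitivity and specificity are computed from a confusion matrix; the counts are hypothetical, chosen only so the resulting percentages match the AI figures quoted above, and are not data from the meta-analysis.

```python
# Illustrative only: how sensitivity and specificity are derived from a confusion
# matrix. The counts below are invented to reproduce figures close to those reported
# for the AI models; they are not the meta-analysis data.

def sensitivity_specificity(tp: int, fn: int, tn: int, fp: int) -> tuple[float, float]:
    """Sensitivity = TP / (TP + FN); specificity = TN / (TN + FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical test set: 500 infected corneas and 500 healthy ones.
sens, spec = sensitivity_specificity(tp=446, fn=54, tn=466, fp=34)
print(f"sensitivity = {sens:.1%}, specificity = {spec:.1%}")  # -> 89.2%, 93.2%
```

Sensitivity captures how many truly infected corneas the model flags, while specificity captures how many healthy corneas it correctly clears.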

The models in the study had analysed a combined total of more than 136 000 corneal images, and the authors say that the results further demonstrate the potential use of artificial intelligence in clinical settings.

Dr Darren Ting, Senior author of the study, Birmingham Health Partners (BHP) Fellow and Consultant Ophthalmologist, University of Birmingham said: “Our study shows that AI has the potential to provide fast, reliable diagnoses, which could revolutionise how we manage corneal infections globally. This is particularly promising for regions where access to specialist eye care is limited, and can help to reduce the burden of preventable blindness worldwide.”

The AI models also proved effective at differentiating between healthy eyes, infected corneas, and the various underlying causes of IK, such as bacterial or fungal infections.

While these results highlight the potential of DL in healthcare, the study’s authors emphasised the need for more diverse data and further external validation to increase the reliability of these models for clinical use.

Infectious keratitis, an inflammation of the cornea, affects millions, particularly in low- and middle-income countries where access to specialist eye care is limited. As AI technology continues to grow and play a pivotal role in medicine, it may soon become a key tool in preventing corneal blindness globally.

Source: University of Birmingham

The Search for an Effective Treatment for Glaucoma

Photo by Ksenia Chernaya

Pete Williams at Karolinska Institutet is one of the few researchers in Sweden concentrating on glaucoma. The goal is an effective treatment, something that stops the degenerative process in the nerve cells of the eye. He is the senior author of a new paper in Nature Communications on how deficiency of the enzyme NMNAT2 renders the nerve cells of the eye vulnerable to neurodegeneration and could be a key in the search for a treatment. 

Glaucoma is very common. Eighty million people worldwide have the eye disease. There is no cure, but there are treatments that lower the pressure in the eye and that can slow down the progression of the disease, which otherwise leads to irreversible blindness.

Not always treatable

“Most people who have heard of glaucoma believe that it can be treated with eye drops and surgery. Unfortunately, this is not entirely true. For many patients, the treatment lowers the eye pressure but doesn’t prevent further vision loss,” says Pete Williams.

Knowledge about glaucoma has taken time to develop because the disease progresses slowly. This means that in the past, it took many years before researchers could see if a particular treatment had any effect. However, in the last decade or so, the availability of instruments that measure changes in the eye much earlier than the patient experiences them has given new impetus to research into the eye disease.

The importance of NAD

In the 1980s, research into neurodegenerative diseases uncovered a link with NAD, a co-enzyme, i.e. a molecule that binds to an enzyme and makes it active. Pete Williams’ group was the first to show that NAD levels were low in animal models of glaucoma.

“NAD has many important functions in the body. A lack of it affects neuronal health and survival in many diseases, but we don’t yet know how to use this information to create a better treatment,” says Pete Williams.

When the body makes NAD, it uses the enzyme NMNAT1. In neurons, however, a different enzyme is required: NMNAT2, which is found only in neurons.

“In our recent paper in Nature Communications, we show that NMNAT2 is needed to protect neurons in the eye and that gene therapy can be used to increase its levels,” says Pete Williams.

The research team is now moving on to try to develop new substances that target NMNAT2 in nerve cells. 

Source: Karolinska Institutet