Category: Ophthalmology

The Rise in Dry Eye Disease Among Young Adults

Photo by Steinar Engeland on Unsplash

Researchers at Aston University have called for more advice to be given to young people about preventing dry eye disease, after a study carried out in conjunction with Oslo University Hospital and Sørlandet Hospital Trust in Norway found that 90% of participants had at least one sign of the condition in their eyes.

Dry eye disease occurs when the eyes do not make enough tears, or make poor-quality tears lacking sufficient lipid or mucus, which leads to poor tear film stability and rapid evaporation. Sufferers may experience gritty-feeling eyes, itching or stinging, redness, sensitivity to light and blurry vision. There are several risk factors for dry eye disease, including stress and wearing contact lenses, and it is more prevalent in females. In the 18-25 age group, a major risk factor is screen use.

The research, which followed 50 participants aged 18-25 over time, was led by Dr Rachel Casemore at Aston University School of Optometry and is the first of its kind. It was published in The Ocular Surface. The researchers looked for symptoms of dry eye disease in the participants, studied lifestyle factors, and followed up with participants one year on to find out if there had been any progression of the condition.

The initial study showed that 56% of participants had dry eye disease, while 90% had at least one sign of the condition. Around half of the participants had lost at least 25% of their meibomian glands, which sit in the eyelids and produce the outer lipid layer of the eye’s tear film. This lipid layer prevents tears from evaporating, keeping the tear film stable and the eye moist. One year on, the researchers found that there had been significant progression of dry eye disease in the study participants.

Additionally, the researchers found a correlation between how long the study group used screens and signs of dryness on the eye surface. The average screen use of participants was eight hours per day.

The researchers concluded that the evidence of dry eye disease symptoms and progression in the young adults in their study shows the need for early detection of potential signs, and the identification of those who may go on to develop dry eye disease. These individuals can then be advised on managing the condition before progression.

The progression and development of dry eye disease can be slowed by various methods. Dr Casemore says that the simplest ways are to take regular screen breaks, to carry out blink exercises to ensure the release of oils from the meibomian glands and to keep hydrated. A healthy, balanced diet, including sources of omega-3 fatty acids, such as oily fish, is also important, as are regular sleep patterns.

Dr Casemore suggests that those with irregular sleep patterns, such as those caused by sleep disorders or anxiety, should seek advice. People who wear contact lenses need to ensure they get regular check-ups to ensure optimum fitting, and that they adhere to their replacement schedule, wearing time schedule, cleaning regimes and safety advice, such as no sleeping, showering or swimming in contact lenses.

Dr Casemore said:

“It is concerning to note the increasing prevalence of dry eye disease signs and symptoms in young adults, which has been referred to as a ‘lifestyle epidemic’ by some researchers. Eye care practitioners are well placed to identify the clinical indicators of dry eye disease and counsel young adults around modifiable risk factors, such as screen use habits, sleeping habits, contact lens use, diet, blinking patterns, and management of stress levels.

“Our future research aims to continue investigation of the potential tear and meibomian gland oil biomarkers which were identified during the study and further explore the effect of diet on dry eye disease development.”

Source: Aston University

Genetic Susceptibility to Schizophrenia Could Show up in the Retina

Photoreceptor cells in the retina. Credit: Scientific Animations

Could the eyes, which are directly connected to the brain, hold clues to brain changes? An international team of researchers led by the University of Zurich and the University Hospital of Psychiatry Zurich has now tackled this very question. In their study, published in Nature Mental Health, the researchers examined whether changes in our nerve connections are linked to a genetic risk for schizophrenia, as impaired neural information processing is one of the main characteristics of the disorder.

Previous studies suggest that schizophrenia not only reduces the volume of grey matter in the brains of those affected, but also leads to loss of retinal tissue. Whether these changes are a cause of schizophrenia or a consequence of the disorder has remained unanswered, since retinal health could also be affected by the disorder itself, for example through antipsychotic medication, lifestyle factors or diabetes.

Extensive use of data from healthy individuals

“To investigate whether the risk of developing schizophrenia has an effect on the central nervous system, we examined tens of thousands of healthy individuals,” says Finn Rabe, first author of the study and postdoc at the University of Zurich. “We then calculated polygenic risk scores for each individual.”

The researchers were able to use extensive genetic and retinal data taken from the UK Biobank, a large biomedical database containing data from over half a million people. “You could say that the scale of the UK Biobank’s data has revolutionised biomedical research,” the researcher adds.
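As a rough illustration of the idea mentioned above (not the study's actual pipeline, and with invented variant IDs and weights), a polygenic risk score is essentially a weighted sum of an individual's risk-allele dosages across many genetic variants:

```python
# Hypothetical sketch of a polygenic risk score (PRS).
# Each variant contributes (allele dosage 0/1/2) * (per-variant effect weight);
# the variant IDs and weights below are invented for illustration only.

def polygenic_risk_score(dosages: dict, weights: dict) -> float:
    """Sum over variants of allele dosage times effect-size weight."""
    return sum(dosages[variant] * weights[variant] for variant in weights)

# Made-up effect weights (e.g. from a schizophrenia GWAS) and one
# person's genotype dosages at the same variants:
weights = {"rs0001": 0.02, "rs0002": -0.01, "rs0003": 0.03}
dosages = {"rs0001": 0, "rs0002": 1, "rs0003": 2}

print(round(polygenic_risk_score(dosages, weights), 4))
```

Real scores are computed over thousands to millions of variants, but the principle is the same: a higher score indicates greater inherited risk.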

Thin retina, elevated risk

The study shows that higher genetic susceptibility to schizophrenia is indeed associated with thinner retinas. The effects are small, though, and can only be reliably demonstrated in large-scale studies. One of the study’s findings is that, unlike changes in the brain, changes in the retina are easy to detect using non-invasive and inexpensive retinal measurements. Thanks to optical coherence tomography, which can be described as a kind of ultrasound for the eye, retinal thickness can be measured in minutes.
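The thickness measurement itself is conceptually simple: once the retina's inner and outer boundaries have been segmented in an OCT depth scan, thickness is the axial distance between them. A toy sketch, with an assumed axial sampling step (the boundary indices and step size are illustrative values, not from the study):

```python
# Illustrative sketch: deriving retinal thickness from one OCT A-scan.
# Assumes the inner limiting membrane (ILM) and retinal pigment
# epithelium (RPE) boundaries have already been segmented; the axial
# sampling step is an assumed example value.

def retinal_thickness_um(ilm_index: int, rpe_index: int,
                         axial_step_um: float = 3.9) -> float:
    """Thickness = samples between ILM and RPE, times the axial step (µm)."""
    if rpe_index <= ilm_index:
        raise ValueError("RPE boundary must lie deeper than the ILM")
    return (rpe_index - ilm_index) * axial_step_um

# e.g. boundaries 120 and 180 samples deep: 60 samples * 3.9 µm/sample
print(retinal_thickness_um(120, 180))
```

In practice a device averages many such scans across the retina, which is why a full thickness map takes only minutes to acquire.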

This offers a promising outlook for prevention. “Our study shows the potential of using optical coherence tomography in clinical practice. But large-scale longitudinal studies are needed to examine how useful it will be for prevention,” says Finn Rabe.

Perspectives for new therapies

Another key finding of the study concerns genetic variants associated with inflammatory processes in the brain. These may also contribute to structural changes in the retina. The study thus offers further support for the inflammation hypothesis of schizophrenia, i.e. the idea that inflammatory processes contribute to the development or progression of the disorder. “If this hypothesis is confirmed, inflammation could be interrupted by medication, potentially enabling us to improve treatment possibilities in the future,” says Rabe.

Source: University of Zurich

Goldeneye: Research on Restoring Eyesight with Gold Nanoparticles

Retina showing reticular pseudodrusen. Although they can infrequently appear in individuals with no other apparent pathology, their highest rates of occurrence are in association with age-related macular degeneration (AMD), for which they hold clinical significance by being highly correlated with end-stage disease sub-types, choroidal neovascularisation and geographic atrophy. Credit: National Eye Institute

A new study by Brown University researchers suggests that gold nanoparticles might one day be used to help restore vision in people with macular degeneration and other retinal disorders. 

In a study published in the journal ACS Nano and supported by the National Institutes of Health, the research team showed that nanoparticles injected into the retina can successfully stimulate the visual system and restore vision in mice with retinal disorders. The findings suggest a new type of visual prosthesis system in which nanoparticles, used in combination with a small laser device worn in a pair of glasses or goggles, might one day help people with retinal disorders to see again.

“This is a new type of retinal prosthesis that has the potential to restore vision lost to retinal degeneration without requiring any kind of complicated surgery or genetic modification,” said Jiarui Nie, research leader and now a postdoctoral researcher. “We believe this technique could potentially transform treatment paradigms for retinal degenerative conditions.” 

Nie performed the work while working in the lab of Jonghwan Lee, an associate professor in Brown’s School of Engineering and a faculty affiliate at Brown’s Carney Institute for Brain Science, who oversaw the work and served as the study’s senior author. 

Retinal disorders like macular degeneration and retinitis pigmentosa affect millions of people in the U.S. and around the world. These conditions damage light-sensitive cells in the retina called photoreceptors — the “rods” and “cones” that convert light into tiny electric pulses. Those pulses stimulate other types of cells further up the visual chain called bipolar and ganglion cells, which process the photoreceptor signals and send them along to the brain. 

This new approach uses nanoparticles injected directly into the retina to bypass damaged photoreceptors. When infrared light is focused on the nanoparticles, they generate a tiny amount of heat that activates bipolar and ganglion cells in much the same way that photoreceptor pulses do. Because disorders like macular degeneration affect mostly photoreceptors while leaving bipolar and ganglion cells intact, the strategy has the potential to restore lost vision. 

In this new study, the research team tested the nanoparticle approach in mouse retinas and in living mice with retinal disorders. After injecting a liquid nanoparticle solution, the researchers used patterned near-infrared laser light to project shapes onto the retinas. Using a calcium signal to detect cellular activity, the team confirmed that the nanoparticles were exciting bipolar and ganglion cells in patterns that matched the shapes projected by the laser.

The experiments showed that neither the nanoparticle solution nor the laser stimulation caused detectable adverse side effects, as indicated by metabolic markers for inflammation and toxicity. Using probes, the researchers confirmed that laser stimulation of the nanoparticles caused increased activity in the visual cortices of the mice — an indication that previously absent visual signals were being transmitted and processed by the brain. That, the researchers say, is a sign that vision had been at least partially restored, a good sign for potentially translating a similar technology to humans. 

For human use, the researchers envision a system that combines the nanoparticles with a laser system mounted in a pair of glasses or goggles. Cameras in the goggles would gather image data from the outside world and use it to drive the patterning of an infrared laser. The laser pulses would then stimulate the nanoparticles in people’s retinas, enabling them to see. 

The approach is similar to one that was approved by the Food and Drug Administration for human use a few years ago. The older approach combined a camera system with a small electrode array that was surgically implanted in the eye. The nanoparticle approach has several key advantages, according to Nie.

For starters, it’s far less invasive. As opposed to surgery, “an intravitreal injection is one of the simplest procedures in ophthalmology,” Nie said. 

There are functional advantages as well. The resolution of the previous approach was limited by the size of the electrode array – about 60 pixels in total. Because the nanoparticle solution covers the whole retina, the new approach could potentially cover someone’s full field of vision. And because the nanoparticles respond to near-infrared light as opposed to visible light, the system doesn’t necessarily interfere with any residual vision a person may retain.

More work needs to be done before the approach can be tried in a clinical setting, Nie said, but this early research suggests that it’s possible.

“We showed that the nanoparticles can stay in the retina for months with no major toxicity,” Nie said of the research. “And we showed that they can successfully stimulate the visual system. That’s very encouraging for future applications.”

Source: Brown University

Tests on Animals Demonstrate that New Eye Drops can Slow Vision Loss

Model of PEDF protein alongside the 17-mer and H105A peptides. Amino acid 105, which is changed from histidine in PEDF and the 17-mer peptide to alanine in the H105A peptide, is shown in green.

Researchers at the National Institutes of Health (NIH) have developed eye drops that extend vision in animal models of a group of inherited diseases that lead to progressive vision loss in humans, known as retinitis pigmentosa. The eye drops contain a small fragment derived from a protein made by the body and found in the eye, known as pigment epithelium-derived factor (PEDF). PEDF helps preserve cells in the eye’s retina. A report on the study is published in Communications Medicine.

“While not a cure, this study shows that PEDF-based eye drops can slow progression of a variety of degenerative retinal diseases in animals, including various types of retinitis pigmentosa and dry age-related macular degeneration (AMD),” said Patricia Becerra, PhD, chief of NIH’s Section on Protein Structure and Function at the National Eye Institute and senior author of the study. “Given these results, we’re excited to begin trials of these eye drops in people.”

All degenerative retinal diseases have cellular stress in common. While the source of the stress may vary—dozens of mutations and gene variants have been linked to retinitis pigmentosa, AMD, and other disorders—high levels of cellular stress cause retinal cells to gradually lose function and die. Progressive loss of photoreceptor cells leads to vision loss and eventually blindness.

Previous research from Becerra’s lab revealed that, in a mouse model, the natural protein PEDF can help retinal cells stave off the effects of cellular stress. However, the full PEDF protein is too large to pass through the outer eye tissues to reach the retina, and the complete protein has multiple functions in retinal tissue, making it impractical as a treatment. To optimize the molecule’s ability to preserve retinal cells and to help the molecule reach the back of the eye, Becerra developed a series of short peptides derived from a region of PEDF that supports cell viability. These small peptides can move through eye tissues to bind with PEDF receptor proteins on the surface of the retina.


In this new study, led by first author Alexandra Bernardo-Colón, Becerra’s team created two eye drop formulations, each containing a short peptide. The first peptide candidate, called “17-mer,” contains 17 amino acids found in the active region of PEDF. A second peptide, H105A, is similar but binds more strongly to the PEDF receptor. Peptides applied to mice as drops on the eye’s surface were found in high concentration in the retina within 60 minutes, slowly decreasing over the next 24 to 48 hours. Neither peptide caused toxicity or other side effects.

When administered once daily to young mice with retinitis pigmentosa-like disease, H105A slowed photoreceptor degeneration and vision loss. To test the drops, the investigators used specially bred mice that lose their photoreceptors shortly after birth. Once cell loss begins, the majority of photoreceptors die in a week. When given peptide eye drops through that one-week period, mice retained up to 75% of photoreceptors and continued to have strong retinal responses to light, while those given a placebo had few remaining photoreceptors and little functional vision at the end of the week.

“For the first time, we show that eye drops containing these short peptides can pass into the eye and have a therapeutic effect on the retina,” said Bernardo-Colón. “Animals given the H105A peptide have dramatically healthier-looking retinas, with no negative side effects.”

A variety of gene-specific therapies are under development for many types of retinitis pigmentosa, which generally start in childhood and progress over many years. These PEDF-derived peptide eye drops could play a crucial role in preserving cells while waiting for these gene therapies to become clinically available.

To test whether photoreceptors preserved through the eye drop treatment are healthy enough for gene therapy to work, collaborators Valeria Marigo, PhD, and Andrea Bighinati, PhD, of the University of Modena, Italy, treated mice with gene therapy at the end of the week-long eye drop regimen. The gene therapy successfully preserved vision for at least an additional six months.

To see whether the eye drops could work in humans – without actually testing in humans directly – the researchers worked with Natalia Vergara, PhD, University of Colorado Anschutz, Aurora, to test the peptides in a human retinal tissue model of retinal degeneration. Grown in a dish from human cells, the retina-like tissues were exposed to chemicals that induced high levels of cellular stress. Without the peptides, the cells of the tissue model died quickly, but with the peptides, the retinal tissues remained viable. These human tissue data provide a key first step supporting human trials of the eye drops.

Source: NIH/National Eye Institute

The Pupil as a Window into the Sleeping Brain

The eye of the sleeping subject was kept open with a special fixation device to record the pupil movements for several hours.  (Image: Neural Control of Movement Lab / ETH Zurich)

For the first time, researchers have been able to observe how the pupils react during sleep over a period of several hours. A look under the eyelids showed them that more happens in the brain during sleep than was previously assumed.

While eyes are typically closed in sleep, there is a flurry of activity taking place beneath the eyelids: a team of researchers led by principal investigators Caroline Lustenberger, Sarah Meissner and Nicole Wenderoth from the Neural Control of Movement Lab at ETH Zurich has observed that the size of the pupil fluctuates constantly during sleep. As they report in Nature Communications, sometimes it increases in size, sometimes it decreases; sometimes these changes occur within seconds, other times over the course of several minutes.

“These dynamics reflect the state of arousal, or the level of brain activation in regions that are responsible for sleep-wake regulation,” says Lustenberger. “These observations contradict the previous assumption that, essentially, the level of arousal during sleep is low.”

Instead, these fluctuations in pupil size show that even during sleep, the brain is constantly switching between a higher and lower level of activation. These new findings also confirm for humans what other research groups have recently discovered in studies on rodents, which also exhibit slow fluctuations in the activation level (known in the field as arousal).

New method for an old mystery

The regions of the brain which control the activation level are situated deep within the brainstem, making it previously difficult to directly measure these processes in humans during sleep. Existing methods are technically demanding and have not yet been established in this context. The ETH researchers’ study therefore relies on pupil measurements. Pupils are known to indicate the activation level when a person is awake. They can therefore be used as markers for the activity in regions situated deeper within the brain.

The ETH researchers developed a new method for examining the changes in people’s pupils while asleep: using a special adhesive technique and a transparent plaster, they were able to keep the eyes of the test subjects open for several hours.

“Our main concern was that the test subjects would be unable to sleep with their eyes open. But in a dark room, most people forget that their eyes are still open and they are able to sleep,” explains the study’s lead author, Manuel Carro Domínguez, who developed the technique.

Analysis of the data showed that pupil dynamics are related not just to the different stages of sleep, but also to specific patterns of brain activity, such as sleep spindles and pronounced deep sleep waves – brain waves that are important for memory consolidation and sleep stability. The researchers also discovered that the brain reacts to sounds with varying degrees of intensity, depending on the level of activation, which is reflected in the size of the pupil.

A central regulator of the activation level is a small region in the brainstem known as the locus coeruleus. In animals, scientists have been able to show that this region is important for the regulation of sleep stages and waking. The ETH researchers were unable to establish in this study whether the locus coeruleus is directly responsible for the pupil changes. “We are simply observing pupil changes that are related to the level of brain activation and heart activity,” Lustenberger explains.

In a follow-up study, the researchers will attempt to influence the activity of the locus coeruleus using medication, so that they can investigate how this affects pupil dynamics. They hope to discover whether this region of the brain is in fact responsible for controlling the pupils during sleep, and how changes in the level of activation affect sleep and its functions.

Using pupillary dynamics to diagnose illnesses

Understanding pupil dynamics during sleep could also provide important insights for the diagnosis and treatment of sleep disorders and other illnesses. The researchers therefore want to investigate whether pupil changes during sleep can provide indications of dysfunctions of the arousal system. These include disorders such as insomnia, post-traumatic stress disorder and possibly Alzheimer’s. “These are just hypotheses that we want to investigate in the future,” says Lustenberger.

Another goal is to make the technology usable outside of sleep laboratories, such as in hospitals where it could help to monitor waking in coma patients or to diagnose sleep disorders more accurately. The pupil as a window onto the brain could thus pave the way for new opportunities in sleep medicine and neuroscience.

Source: ETH Zurich

Novel Stem Cell Therapy Repairs Irreversible Corneal Damage in Clinical Trial

Photo by Victor Freitas on Pexels

An expanded clinical trial that tested a ground-breaking, experimental stem cell treatment for blinding cornea injuries found the treatment was feasible and safe in 14 patients who were treated and followed for 18 months, and there was a high proportion of complete or partial success. The results of this new phase 1/2 trial are published in Nature Communications.

The treatment, called cultivated autologous limbal epithelial cells (CALEC), was developed at Mass Eye and Ear, a member of the Mass General Brigham healthcare system. The innovative procedure consists of removing stem cells from a healthy eye with a biopsy, expanding them into a cellular tissue graft in a novel manufacturing process that takes two to three weeks, and then surgically transplanting the graft into the eye with a damaged cornea.

“Our first trial in four patients showed that CALEC was safe and the treatment was possible,” said principal investigator Ula Jurkunas, MD, associate director of the Cornea Service at Mass Eye and Ear and professor of Ophthalmology at Harvard Medical School. “Now we have this new data supporting that CALEC is more than 90% effective at restoring the cornea’s surface, which makes a meaningful difference in individuals with cornea damage that was considered untreatable.”

Researchers showed that CALEC completely restored the cornea in 50% of participants at their 3-month visit, and that the rate of complete success increased to 79% and 77% at their 12- and 18-month visits, respectively.

With two participants meeting the definition of partial success at 12 and 18 months, the overall success of CALEC was 93% and 92% at 12 and 18 months.  Three participants received a second CALEC transplant, one of whom reached complete success by the study end visit. An additional analysis of CALEC’s impact on vision showed varying levels of improvement of visual acuity in all 14 CALEC patients.

CALEC displayed a high safety profile, with no serious events occurring in either the donor or recipient eyes. One adverse event, a bacterial infection, occurred in one participant, eight months after the transplant due to chronic contact lens use. Other adverse events were minor and resolved quickly following the procedures.

CALEC remains an experimental procedure and is currently not offered at Mass Eye and Ear or any U.S. hospital, and additional studies will be needed before the treatment is submitted for federal approval.

The cornea is the clear, outermost layer of the eye. Its outer border, the limbus, contains a large reservoir of healthy stem cells called limbal epithelial cells, which maintain the eye’s smooth surface. When a person suffers a cornea injury, such as a chemical burn, infection or other trauma, the limbal epithelial cells can be depleted, and they can never regenerate. The resulting limbal stem cell deficiency leaves the eye with a permanently damaged surface that cannot support a corneal transplant, the current standard of care for vision rehabilitation. People with these injuries often experience persistent pain and visual difficulties.

This need led Jurkunas, then a junior scientist, and Dana, director of the Cornea Service at Mass Eye and Ear, to explore a new approach for regenerating limbal epithelial cells. Nearly two decades later, following preclinical studies and collaborations with researchers at Dana-Farber and Boston Children’s, it became possible to consistently manufacture CALEC grafts that met the stringent quality criteria needed for human transplantation.

As an autologous therapy, one limitation of this approach is that it is necessary for the patient to have only one involved eye so a biopsy can be performed to get starting material from the unaffected normal eye.

“Our future hope is to set up an allogeneic manufacturing process starting with limbal stem cells from a normal cadaveric donor eye,” said Ritz. “This will hopefully expand the use of this approach and make it possible to treat patients who have damage to both eyes.”

Source: Mass Eye and Ear

All in the Eyes: High Resolution Retinal Maps Aid Disease Diagnoses

Photoreceptor cells in the retina. Credit: Scientific Animations

Researchers have conducted one of the largest eye studies in the world to reveal new insights into retinal thickness, highlighting its potential in the early detection of diseases like type 2 diabetes, dementia and multiple sclerosis.

The WEHI-led study used cutting-edge artificial intelligence technology to analyse over 50 000 eyes from the UK Biobank, producing maps of the retina in unprecedented detail to better understand how retinal differences link to various diseases.

The findings, published in Nature Communications, open up new possibilities for using routine eyecare imaging as a tool to screen for and manage diseases, much like mammograms have for breast cancer.

Unlocking a window into the brain

The retina is part of the central nervous system, which also comprises the brain and spinal cord. Many diseases are linked to degeneration or disruption of this critical system, including neurodegenerative conditions such as dementia and metabolic disorders like diabetes.

Globally, neurological conditions alone are among the leading causes of disability and illness, with over 3 billion people, or 43% of the world’s population, living with a brain-related condition.

Lead researcher, WEHI’s Dr Vicki Jackson, said the findings broaden the horizons for using retinal imaging as a doorway into the central nervous system, to help manage disease.

“We’ve shown that retinal imaging can act as a window to the brain, by detecting associations with neurological disorders like multiple sclerosis and many other conditions,” said Dr Jackson, a statistician and gene expert.

“Our maps’ fine-scale measurements reveal critical new details about connections between retinal thinning and a range of common conditions.”

The study also identified new genetic factors that influence retinal thickness, which are likely to play a role in the growth and development of a person’s retina.

“This research underscores the potential for retinal thickness to act as a diagnostic biomarker to aid in detecting and tracking the progression of numerous diseases. We can now pinpoint specific locations of the retina which show key changes in some diseases.”

The international research team, led by WEHI, applied AI methods to big population data of retinal imaging and compared information about each person’s genetics and health to reveal unprecedented links to disease.

The analysis produced 50 000 maps with measurements at over 29 000 locations across the retina, identifying retinal thinning associated with 294 genes that play an important role in disease.

AI fast-tracking the diagnostic future

Study lead and bioinformatician, Professor Melanie Bahlo AM, said past studies had indicated correlations between retinal thickness and disease, but her team’s AI-powered discoveries shed deeper light on the complex spatial anatomy of the retina and its role in disease.

“Technologies like AI fuel discovery, and when fused with brilliant minds, there is an extraordinary ability to transform big population data into far-reaching insights,” Prof Bahlo, a lab head at WEHI, said.

“There has never been a time in history where this powerful combination — technology, big data and brilliant minds — has come together to advance human health.”

The research reinforces the growing field of oculomics (using the eye to diagnose health conditions) as an emerging, powerful and non-invasive approach for predicting and diagnosing diseases.

Source: Walter and Eliza Hall Institute

Study Shows Effectiveness of Method to Stem Myopia

Photo by Ksenia Chernaya

Capping ten years of work to stem the tide of myopia, David Berntsen, Professor of Optometry at the University of Houston, is reporting that his team’s method to slow myopia not only works – but lasts.

The original Bifocal Lenses In Nearsighted Kids (BLINK) Study showed that having children with myopia wear high-add power multifocal contact lenses slows its progression. Now, new results from the BLINK2 Study, which continued following these children, found that the benefits persist even after the lenses are no longer used.

“We found that one year after discontinuing treatment with high-add power soft multifocal contact lenses in older teenagers, myopia progression returns to normal with no loss of treatment benefit,” reports Berntsen in JAMA Ophthalmology.

The study was funded by the National Institutes of Health’s National Eye Institute with collaborators from the Ohio State University College of Optometry.

In Focus: A Major Issue

Leading the team at the University of Houston, Berntsen takes on a significant challenge. By 2050 almost 50% of the world (5 billion people) will be myopic. Myopia is associated with an increased risk of long-term eye health problems that affect vision and can even lead to blindness.

From the initial study, high-add multifocal contact lenses were found to be effective at slowing the rate of eye growth, decreasing how myopic children became. Because higher amounts of myopia are associated with vision-threatening eye diseases later in life, like retinal detachment and glaucoma, controlling its progression during childhood potentially offers an additional future benefit.

“There has been concern that the eye might grow faster than normal when myopia control contact lenses are discontinued. Our findings show that when older teenagers stop wearing these myopia control lenses, the eye returns to the age-expected rate of growth,” said Berntsen.

“These follow-on results from the BLINK2 Study show that the treatment benefit of myopia control contact lenses is durable when they are discontinued at an older age,” said BLINK2 study chair Jeffrey J. Walline, associate dean for research at the Ohio State University College of Optometry.

Eye Science

Myopia occurs when a child’s developing eyes grow too long from front to back. Instead of being focused directly on the retina, images come to a focus at a point in front of it.

Single vision prescription glasses and contact lenses can correct myopic vision, but they fail to treat the underlying problem, which is the eye continuing to grow longer than normal. By contrast, soft multifocal contact lenses correct myopic vision in children while simultaneously slowing myopia progression by slowing eye growth.

Designed like a bullseye, multifocal contact lenses focus light in two basic ways. The centre portion of the lens corrects nearsightedness so that distance vision is clear, and it focuses light directly on the retina. The outer portion of the lens adds focusing power to bring the peripheral light into focus in front of the retina. Animal studies show that bringing light to focus in front of the retina may slow growth. The higher the reading power, the further in front of the retina it focuses peripheral light.
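The optics described above can be illustrated with a rough vergence calculation. This is a simplified sketch, not the study’s own model: it assumes Emsley’s reduced-eye values (refractive index 4/3, an approximately 60 D distance-corrected eye) to show why a +2.50 D add zone brings peripheral light to a focus in front of the retina.

```python
# Illustrative sketch using assumed reduced-eye values (not from the study).
N_EYE = 4 / 3        # effective refractive index of the reduced eye
BASE_POWER = 60.0    # dioptres: approximate power of a distance-corrected eye

def focus_depth_mm(add_power_d: float) -> float:
    """Distance behind the cornea at which distant light is focused (mm)."""
    return 1000 * N_EYE / (BASE_POWER + add_power_d)

central = focus_depth_mm(0.0)    # ~22.2 mm: central zone, focus on the retina
high_add = focus_depth_mm(2.50)  # ~21.3 mm: +2.50 D zone, focus in front of it
print(f"myopic defocus shift: {central - high_add:.2f} mm")  # ~0.89 mm
```

Under these assumed values, the high-add zone pulls the peripheral focus almost a millimetre in front of the retina – the myopic defocus signal that animal studies suggest may slow eye growth.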

BLINK Once…Then Twice

In the original BLINK study, 294 myopic children, ages 7 to 11 years, were randomly assigned to wear single vision contact lenses or multifocal lenses with either high-add power (+2.50 diopters) or medium-add power (+1.50 diopters). They wore the lenses during the day as often as they could comfortably do so for three years. All participants were seen at clinics at the Ohio State University, Columbus, or at the University of Houston.

After three years in the original BLINK study, children in the high-add multifocal contact lens group had shorter eyes compared to the medium-add power and single-vision groups, and they also had the slowest rate of myopia progression and eye growth.

Of the original BLINK participants, 248 continued in BLINK2, during which all participants wore high-add (+2.50 diopters) lenses for two years, followed by single-vision contact lenses for the third year of the study to see if the benefit remained after discontinuing treatment.

At the end of BLINK2, axial eye growth returned to age-expected rates. Although there was a small increase in eye growth of 0.03 mm/year across all age groups after the multifocal lenses were discontinued, the overall rate of eye growth was no different from the age-expected rate; there was no evidence of faster-than-normal eye growth.

Participants who had been in the original BLINK high-add multifocal treatment group continued to have shorter eyes and less myopia at the end of BLINK2. Children who were switched to high-add multifocal contact lenses for the first time during BLINK2 did not catch up to those who had worn high-add lenses since the start of the BLINK Study when they were 7 to 11 years of age.

By contrast, studies of other myopia treatments, such as atropine drops and orthokeratology lenses that are designed to temporarily reshape the eye’s outermost corneal layer, showed a rebound effect (faster than age-normal eye growth) after treatment was discontinued.

“Our findings suggest that it’s a reasonable strategy to fit children with multifocal contact lenses for myopia control at a younger age and continue treatment until the late teenage years when myopia progression has slowed,” said Berntsen.

Source: University of Houston

New Potential Treatment for Inherited Blinding Disease Retinitis Pigmentosa

Researchers used a computer screening approach to identify two compounds that could help prevent vision loss in people with a genetic eye disease

Photoreceptor cells in the retina. Credit: Scientific Animations

Two new compounds may be able to treat retinitis pigmentosa, a group of inherited eye diseases that cause blindness. The compounds, described in a study published January 14th in the open-access journal PLOS Biology by Beata Jastrzebska from Case Western Reserve University, US, and colleagues, were identified using a virtual screening approach.

In retinitis pigmentosa, the retinal protein rhodopsin is often misfolded owing to genetic mutations, causing retinal cells to die off and leading to progressive blindness. Small molecules that correct rhodopsin folding are urgently needed to treat the estimated 100 000 people in the United States with the disease. Current experimental treatments include retinoid compounds, such as synthetic vitamin A derivatives, but these are sensitive to light and can be toxic.

In the new study, the researchers used virtual screening to search for new drug-like molecules that bind to and stabilise the structure of rhodopsin, improving its folding and its movement through the cell. Two non-retinoid compounds were identified that met these criteria and were able to cross the blood-brain and blood-retina barriers. The team tested the compounds in the lab and showed that they improved cell surface expression of rhodopsin in 36 of 123 genetic subtypes of retinitis pigmentosa, including the most common one. The compounds also protected against retinal degeneration in mice with retinitis pigmentosa.

“Importantly, treatment with either compound improved the overall retina health and function in these mice by prolonging the survival of their photoreceptors,” the authors say. However, they note that additional studies of the compounds or related compounds are needed before testing the treatments in humans.

The authors add, “Inherited mutations in the rhodopsin gene cause retinitis pigmentosa (RP), a progressive and currently untreatable blinding disease. This study identifies small molecule pharmacochaperones that suppress the pathogenic effects of various rhodopsin mutants in vitro and slow photoreceptor cell death in a mouse model of RP, offering a potential new therapeutic approach to prevent vision loss.”

Provided by PLOS

Is Paranoia Partly a Visual Problem?

Photo by Stormseeker on Unsplash

Could complex beliefs like paranoia have roots in something as basic as vision? A new Yale study finds evidence that they might. 

When completing a visual perception task, in which participants had to identify whether one moving dot was chasing another moving dot, those with greater tendencies toward paranoid thinking (believing others intend them harm) and teleological thinking (ascribing excessive meaning and purpose to events) performed worse than their counterparts, the study found. Those individuals more often – and confidently – claimed one dot was chasing the other when it wasn’t.

The findings, published in the journal Communications Psychology, suggest that, in the future, testing for illnesses like schizophrenia could be done with a simple eye test.

“We’re really interested in how the mind is organised,” said senior author Philip Corlett, an associate professor of psychiatry at Yale School of Medicine and member of the Wu Tsai Institute. “Chasing or other intentional behaviours are what you might think of as experiences perceived at a very high-level in the brain, that someone might have to reason through and deliberate. In this study, we can see them low down in the brain, in vision, which we think is exciting and interesting – and has implications for how those mechanisms might be relevant for schizophrenia.”

Paranoia and teleological thinking are similar in that they are both misattributions of intention, but paranoia is a negative perception while teleological thinking tends to be positive. Both patterns of thinking are linked to psychosis and schizophrenia.

Hallucinations are associated with psychosis as well and are often about other people, said Corlett, suggesting there may be a social component to these visual misperceptions.

“So we wondered whether there might be something related to social perception – or misperception, what we refer to as social hallucination – that we could measure and that relate to these symptoms of psychosis,” he said.

For the task, participants were shown dots moving on a screen. Sometimes one dot was chasing another; other times there was no chase. Across different trials of the task, participants had to say whether a chase was occurring or not.
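A stimulus of this general kind is straightforward to sketch in code. The following is purely illustrative – the study’s actual stimulus parameters are not published here, and the function name and values are invented for the example – but it shows the logic: one dot always wanders at random, while the second dot either steers directly toward it (a chase) or drifts aimlessly (no chase).

```python
import math
import random

def make_trial(chase: bool, steps: int = 100, speed: float = 1.0):
    """Illustrative sketch (not the study's actual stimulus): generate two
    dot trajectories; in a chase trial, dot B steers directly toward dot A."""
    a = [random.uniform(0, 100), random.uniform(0, 100)]  # the chased dot
    b = [random.uniform(0, 100), random.uniform(0, 100)]  # the (maybe) chaser
    traj_a, traj_b = [tuple(a)], [tuple(b)]
    heading_b = random.uniform(0, 2 * math.pi)
    for _ in range(steps):
        # dot A always wanders at random
        ang = random.uniform(0, 2 * math.pi)
        a[0] += speed * math.cos(ang)
        a[1] += speed * math.sin(ang)
        if chase:
            # chaser heads straight for dot A's current position
            heading_b = math.atan2(a[1] - b[1], a[0] - b[0])
        else:
            # no chase: the heading simply drifts
            heading_b += random.uniform(-0.5, 0.5)
        b[0] += speed * math.cos(heading_b)
        b[1] += speed * math.sin(heading_b)
        traj_a.append(tuple(a))
        traj_b.append(tuple(b))
    return traj_a, traj_b
```

A participant (or a classifier) would then have to judge, from the motion alone, whether dot B is pursuing dot A – the social perception the study probes.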

Those with higher degrees of paranoia and teleological thinking (as measured through questionnaires) were more likely than others to say with confidence that a chase was happening when one wasn’t. Essentially, they perceived a social interaction that wasn’t occurring.

In additional experiments, the researchers asked participants to identify which dot was doing the chasing and which dot was being chased. In these results, paranoia and teleological thinking began to diverge.

“People with paranoia were particularly bad at detecting which dot was being chased,” said Santiago Castiello, lead author of the study and a postdoctoral researcher in Corlett’s lab. “And people with high teleology were particularly bad at detecting which dot was doing the chasing.”

That these two types of beliefs differed in this way highlights that they are distinct and may have implications for diagnosis or treatment, said the researchers. The connection to vision may also shift thinking around how the brain gives rise to psychotic symptoms.

“Very few people with congenital blindness develop schizophrenia,” said Castiello. “Finding these social hallucinations in vision makes me wonder if schizophrenia is something that develops through errors in how people sample the visual world.”

While there are no immediate therapeutic implications from these findings, deeper understanding of these beliefs could aid in pharmacological treatment development and risk assessment. 

“One thing we’re thinking about now is whether we can find eye tests that predict someone’s risk for psychosis,” said Corlett. “Maybe there is some very quick perceptual task that can identify when someone might need to talk to a clinician.”

Source: Yale University