Month: June 2025

Healing Spinal Cord Injuries with the Help of Electricity

Researchers have developed an ultra-thin implant that can be placed directly on the spinal cord. The implant delivers a carefully controlled electrical current across the injured area. In a recent study, the researchers observed how the electrical field treatment improved recovery in rats with spinal cord injuries: the animals regained movement and sensation. Please note that the image shows a newer model of the implant used in the study. Photo and illustration: University of Auckland

Researchers at Chalmers University of Technology in Sweden and the University of Auckland in New Zealand have developed a groundbreaking bioelectric implant that restores movement in rats after injuries to the spinal cord.

This breakthrough, published in Nature Communications, offers new hope for an effective treatment for humans suffering from loss of sensation and function due to spinal cord injury.

Electricity stimulated nerve fibres to reconnect

Before birth, and to a lesser extent afterwards, naturally occurring electric fields play a vital role in early nervous system development, encouraging and guiding the growth of nerve fibres along the spinal cord. Scientists are now harnessing this same electrical guidance system in the lab.

“We developed an ultra-thin implant designed to sit directly on the spinal cord, precisely positioned over the injury site in rats,” says Bruce Harland, senior research fellow, University of Auckland, and one of the lead researchers of the study.

The device delivers a carefully controlled electrical current across the injury site.

“The aim is to stimulate healing so people can recover functions lost through spinal cord injury,” says Maria Asplund, Professor of bioelectronics at Chalmers University of Technology. She is, together with Professor Darren Svirskis of the University of Auckland, leading the research.

In the study, researchers observed how electrical field treatment improved the recovery of locomotion and sensation in rats with spinal cord injury. The findings offer renewed hope for individuals experiencing loss of function and sensation due to spinal cord injuries.

“Long-term, the goal is to transform this technology into a medical device that could benefit people living with life-changing spinal-cord injuries,” says Maria Asplund.

The study presents the first use of a thin implant that delivers stimulation in direct contact with the spinal cord, a significant advance in the precision of spinal cord stimulation.

“This study offers an exciting proof of concept showing that electric field treatment can support recovery after spinal cord injury,” says doctoral student Lukas Matter, Chalmers University of Technology, the other lead researcher alongside Harland.

Improved mobility after four weeks

Unlike humans, rats have a greater capacity for spontaneous recovery after spinal cord injury, which allowed researchers to compare natural healing with healing supported by electrical stimulation.

After four weeks, animals that received daily electric field treatment showed improved movement compared with those that did not. Throughout the 12-week study, the treated animals also responded more quickly to gentle touch.

“This indicates that the treatment supported recovery of both movement and sensation,” Harland says.

“Just as importantly, our analysis confirmed that the treatment did not cause inflammation or other damage to the spinal cord, demonstrating that it was not only effective but also safe,” Svirskis says.

The next step is to explore how different doses, including the strength, frequency, and duration of the treatment, affect recovery, to discover the most effective recipe for spinal-cord repair.

Source: Chalmers University of Technology

Glucose Tolerance Levels Linked to Increased Life Expectancy

Photo by Photomix Company on Pexels

It is well known that preventing the onset of diabetes reduces the risk of death, and that managing blood glucose levels is key to preventing diabetes. However, it remains unclear whether there are specific ranges within “normal” blood glucose levels that are associated with even lower mortality risks.

A small farming community in the Tohoku Region of northern Japan has possibly provided researchers with further insights.

For over 40 years, the Ohasama Study, named after a town in Iwate Prefecture, has tracked the long-term health of Ohasama’s local population. Since 1986, the study has collected health and medical data through regular checkups and tests.

As part of the study, participants have undergone a glucose tolerance test every four years. This test, which is commonly used to diagnose diabetes, measures blood glucose levels before and 120 minutes after drinking a glucose-containing beverage.

Now, a research team has analysed the glucose tolerance test data from the Ohasama Study, publishing their findings in the journal PNAS Nexus. Junta Imai and Hideki Katagiri from Tohoku University led the study.

“We first examined the relationship between mortality and various health test results, including but not limited to glucose tolerance tests, for 993 individuals,” explains Imai. “Even after adjusting for known risk factors such as age, obesity, and smoking, the one-hour post-glucose load blood glucose level showed a strong correlation with mortality.”

Participants were then divided into two groups based on the median one-hour post-glucose load blood glucose level of 162mg/dL. The survival analysis showed significantly better outcomes in the lower-glucose group.

Since some participants had already developed diabetes, the researchers narrowed their focus to 595 individuals with normal glucose tolerance. They analysed which glucose threshold had the strongest correlation with mortality and found that 170mg/dL was the most predictive.

Using this threshold, Imai and his colleagues conducted a survival analysis, comparing those with post-glucose load blood glucose levels below and above 170mg/dL. After 20 years, nearly 80% of the under-170mg/dL group were still alive, while almost 50% of the over-170mg/dL group had died – a statistically significant result.
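The threshold comparison described above is a standard two-group survival analysis. As a rough illustration of the idea only (this is not the study's code, and the follow-up numbers below are hypothetical, not the Ohasama data), a minimal Kaplan-Meier estimator can show how two groups' survival curves are compared:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival curve for right-censored data.

    times  : observed follow-up time (years) per participant
    events : 1 if a death was observed, 0 if the participant was censored
    Returns a list of (time, survival probability) points at each death.
    (Simplified: ties between deaths and censorings are not specially ordered.)
    """
    at_risk = len(times)
    surv = 1.0
    curve = []
    for t, died in sorted(zip(times, events)):
        if died:
            surv *= (at_risk - 1) / at_risk  # step down at each death
            curve.append((t, surv))
        at_risk -= 1  # death or censoring removes one from the risk set
    return curve

# Hypothetical 20-year follow-up: 2 of 10 deaths in the low-glucose group,
# 5 of 10 in the high-glucose group (survivors censored at 20 years).
low_curve = kaplan_meier([20] * 8 + [5, 12], [0] * 8 + [1, 1])
high_curve = kaplan_meier([20] * 5 + [3, 6, 9, 14, 18], [0] * 5 + [1] * 5)

print(f"low-glucose group, survival at last event:  {low_curve[-1][1]:.2f}")
print(f"high-glucose group, survival at last event: {high_curve[-1][1]:.2f}")
```

A log-rank test would then tell us whether the gap between the two curves is statistically significant, which is the kind of result the study reports.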

Further analysis of cause-of-death data revealed that individuals with one-hour post-glucose load blood glucose levels under 170mg/dL had significantly fewer deaths due to heart disease caused by atherosclerosis (p < 0.0001) and malignant tumours (p < 0.0014) compared to those with higher levels.

“These findings demonstrate that even within the range considered ‘normal,’ there is a subset of blood glucose levels associated with a lower risk of death,” adds Imai. “Besides taking measures to prevent diabetes, greater efforts towards managing blood glucose spikes shortly after eating could help prevent heart disease and cancer, ultimately leading to longer, healthier lives.”

Source: Tohoku University

Handheld TB Test for Low-cost Diagnosis in HIV Hotspots

A new handheld tuberculosis testing device by Tulane University is the size of a credit card, requires no electricity and significantly improves detection of the disease in those with HIV. (Vincent Postle/Tulane University)

Current tuberculosis infection tests struggle to detect the disease in those with HIV. A common co-infection, HIV can hide TB from traditional tests by eliminating the immune cells relied upon to sound the alarm.

While more than 90% of the 2 billion TB cases worldwide are latent – symptom-free and not contagious – the weakening of the immune system in those with HIV can allow latent TB to turn active, increasing the potential for new infections to spread and often resulting in fatal outcomes. Tuberculosis is the leading cause of death among those with HIV worldwide.

Now, Tulane University researchers have developed a new handheld TB test that significantly improves detection in people with HIV, according to a new study in Nature Biomedical Engineering. Powered by a beetle-inspired chemical reaction, the device requires no electricity and addresses a critical gap in TB infection detection that has long hobbled efforts to eliminate the world’s deadliest infectious disease. 

“The goal was to develop a TB test that could be taken anywhere and provide quicker, more accurate results for anybody.”

Tony Hu, PhD

Dubbed the ASTRA (Antigen-Specific T-cell Response Assay), the credit card-sized device requires only a drop of blood to provide same-day diagnoses without the need for a laboratory or trained staff. When tested against the traditional IGRA blood test (Interferon-Gamma Release Assay), the ASTRA detected TB in HIV-infected individuals with 87% specificity compared to IGRA’s 60%, and also outperformed the IGRA at detecting TB in people without HIV co-infection.

“The goal was to develop a TB test that could be taken anywhere and provide quicker, more accurate results for anybody,” said senior author Tony Hu, PhD, Chair in Biotechnology Innovation at Tulane University and director of the Tulane Center for Cellular & Molecular Diagnostics. “Current tests such as the IGRA are cost-prohibitive or require access to facilities that resource-limited communities don’t have. If we are going to eliminate TB, we have to diagnose and treat as many infection cases as possible.”

Added Bo Ning, lead author and assistant professor of biochemistry at Tulane University School of Medicine: “If your community has an immunocompromised population, someone may have latent TB. This can help block the spread of TB and ensure that no one slips through the cracks.”

To create a test that would not be stymied by HIV, the researchers identified two new biomarkers that could detect TB without relying on the immune cells susceptible to the virus. 

After a drop of blood is added, the device must incubate for four hours while a preloaded reagent stimulates a response from the immune cells. The reagent acts as a “wanted poster”, asking if they’ve seen tuberculosis bacteria before.

To avoid the use of electricity, the researchers looked to an unlikely source for inspiration: the bombardier beetle. When threatened, these large insects combine two chemicals, and the resulting reaction produces a forceful spray. Similarly, two chemicals in the ASTRA are combined to propel the sample across a chip for final analysis and diagnosis. 

The new device delivers results in about 4 hours, compared to the IGRA, which takes 24 hours, and a common TB skin test, which can take between two and three days for a diagnosis. 

The ASTRA’s performance was validated using samples collected from a cohort in Eswatini, a country with high TB incidence and the highest reported HIV prevalence (27.3%) worldwide.

Increasing testing accuracy, access and speed is even more vital as TB resistance to drugs grows more robust, Hu said. 

“The sooner you have a diagnosis, the sooner you can begin the process of determining proper treatment,” Hu said. “TB is the No. 1 pathogen HIV patients worry about globally. If treatment is available, we should be working to kill these bacteria, latent or not.”

Source: Tulane University

RSV Vaccine Reduces the Risk of Dementia, New Research Shows

Photo by Mika Baumeister on Unsplash

A new study by the University of Oxford, published in the journal npj Vaccines, shows that a vaccine against respiratory syncytial virus (RSV) is associated with a 29% reduction in dementia risk in the following 18 months. The findings suggest a novel explanation for how vaccines produce this effect.

Recent studies have shown convincingly that vaccines against shingles (Herpes zoster) reduce the risk of dementia. The shingles vaccine now in widespread use (Shingrix) has more of an effect than the previous one (Zostavax). A key difference between these vaccines is that Shingrix contains an ‘adjuvant’, an ingredient designed to enhance the vaccine’s effect. It is therefore possible that the adjuvant contributes to Shingrix’s greater effect on reducing dementia risk.

The new study, supported by the National Institute for Health and Care Research (NIHR) Oxford Health Biomedical Research Centre (OH BRC), supports this possibility. Researchers analysed the health records of over 430 000 people in the USA in the TriNetX network. They found that the Arexvy vaccine – which protects against respiratory syncytial virus (RSV), a common virus that causes cold-like symptoms – was also linked to a significantly lower risk of developing dementia. Arexvy, now offered to adults over 60, contains the same adjuvant as Shingrix. Both vaccines were similarly effective in reducing dementia risk compared to the flu vaccine (which does not contain the adjuvant); in the 18 months following receipt of Arexvy there was a 29% reduction in diagnoses of dementia. These findings held true across a range of additional analyses and were similar in men and women.

It is not clear how the adjuvant, called AS01, might help lower the risk of dementia. However, laboratory studies show that AS01 stimulates cells of the immune system that could help protect the brain from some of the harmful processes underlying dementia. These benefits of the adjuvant in reducing dementia risk could be in addition to the protection that comes from preventing infections like shingles and RSV themselves.

It is not yet known whether these vaccines prevent dementia or, more likely, delay its onset. Either way, the effect is significant, especially given that no other treatments are known that delay or prevent the condition.

The likely beneficial effect on dementia risk is in addition to the vaccines’ proven ability to prevent shingles and RSV, both of which are unpleasant and sometimes serious illnesses.

Lead author, Associate Professor Maxime Taquet, NIHR Academic Clinical Lecturer, Department of Psychiatry, University of Oxford, said: “Our findings show that vaccines against two separate viruses, shingles and RSV, both lead to reductions in dementia. This gives another reason to have the vaccines, in addition to their effectiveness at preventing these serious illnesses.”

Senior author, Professor Paul Harrison, Department of Psychiatry, University of Oxford and Co-Lead for the Molecular Targets theme in OH BRC, said: ‘The findings are striking. We need studies to confirm whether the adjuvant present in some vaccines contributes to the reduced dementia risk, and to understand how it does so.’

Source: Oxford University

New Clues Towards the Treatment of Myeloproliferative Neoplasms

Scanning electron micrograph of red blood cells, T cells (orange) and platelets (green). Source: Wikimedia CC0

Northwestern Medicine scientists have uncovered key details about a group of rare but serious blood disorders, which may help inform potential treatments, according to a study published in the Journal of Clinical Investigation.  

Myeloproliferative neoplasms (MPNs) are rare blood cancers characterised by the abnormal growth of blood cells. They have long been linked to a key signalling pathway called JAK2/STAT, but the specific details of how they develop have remained unclear. 

“These diseases are often driven by abnormal activation of a protein called JAK2,” said Peng Ji, MD, PhD, ‘15 GME, Professor of Pathology and senior author of the study. “In earlier research, we discovered that another protein, PLEK2, acts downstream of JAK2 and plays a critical role in mediating JAK2’s effects, helping to drive the progression of MPNs.” 

In the current study, Ji and his collaborators aimed to better understand the proteins that work alongside PLEK2, also known as the PLEK2 “signalosome.” 

By analysing protein expression in cultured human blood stem cells, the investigators identified a new contributor, PPIL2, that appears to help cancer cells grow by disabling the tumour suppressor protein p53. 

Under normal conditions, p53 works as a tumour suppressor protein that prevents excessive cell growth. PPIL2 effectively marks p53 for degradation, weakening its ability to control cell growth and allowing the disease to advance, according to the findings. 

Investigators found that blocking PPIL2 using cyclosporin A, an immunosuppressant drug commonly used for organ transplant patients, led to an increase in p53 levels, restoring its ability to regulate cell growth. In experiments using MPN models — including mice with a mutated JAK2 gene and lab-grown human bone marrow — cyclosporin A significantly reduced the abnormal proliferation of blood cells, according to the findings. 

“Even better results were seen when cyclosporin A was combined with another type of drug that also boosts p53,” said Ji. “This shows that targeting PPIL2 might be a powerful new way to treat MPNs using drugs that are already available.” 

While more research is needed to fully understand how cyclosporin A works in MPN patients, this study highlights a promising new target for treatment, Ji said.  

Now, Ji and his laboratory are planning to work on developing drugs that more specifically block PPIL2, since cyclosporin A affects many proteins and can have unwanted effects.  

“Clinical studies will be needed to test whether this approach works in people, possibly starting by looking at how MPN patients respond to cyclosporin A if they’ve already been treated with it,” Ji said.  

Source: Northwestern University

Early Anticoagulants Found to be Safe and Effective for AF Stroke Patients

Ischaemic and haemorrhagic stroke. Credit: Scientific Animations CC4.0

Patients with atrial fibrillation who have experienced a stroke would benefit greatly from earlier treatment than is currently recommended in UK guidelines, finds a new study led by UCL researchers.

The results of the CATALYST study, published in The Lancet, included data from four randomised trials with a total of 5441 patients across the UK, Switzerland, Sweden and the United States, who had all experienced a recent stroke (between 2017 and 2024) due to a blocked artery and atrial fibrillation (irregular heartbeat).

Patients had either started medication early (within four days of their stroke) or later (after five days or more).

The researchers found that starting direct oral anticoagulants (DOACs, which thin the blood to prevent it from clotting as quickly) within four days of having a stroke was safe, with no increase in bleeding into the brain. Additionally, early initiation of treatment significantly reduced the risk of another stroke due to bleeding or artery blockage by 30% compared to those who started treatment later.

People with atrial fibrillation who have had a stroke have an increased risk of having another stroke, but this risk can be reduced by taking anticoagulants.

Anticoagulants come with the rare but dangerous side effect of bleeding into the brain, and there is a lack of evidence about when is best to start taking them after a stroke. Current UK guidelines are varied, suggesting that those who have had a moderate or severe stroke should wait at least five days before starting blood-thinning treatments.

To tackle this question, the researchers investigated the impact of early compared to delayed anticoagulant treatment.

Chief Investigator, Professor David Werring (UCL Queen Square Institute of Neurology) said: “Our new study supports the early initiation of DOACs in clinical practice, offering better protection against further strokes for a wide range of patients.”

The researchers now hope that their findings will influence clinical guidelines and improve outcomes for stroke patients worldwide.

First author and main statistician, Dr Hakim-Moulay Dehbi (UCL Comprehensive Clinical Trials Unit), said: “By systematically combining the data from four clinical trials, we have identified with increased confidence, compared to the individual trials, that early DOAC initiation is effective.”

The CATALYST study builds on findings from the British Heart Foundation funded OPTIMAS study – where the UCL-led research team analysed 3621 patients with atrial fibrillation who had had a stroke between 2019 and 2024, across 100 UK hospitals.

Half of the participants began anticoagulant treatment within four days of their stroke (early), and the other half started treatment seven to 14 days after having a stroke (delayed). Patients were followed up after 90 days to assess several outcomes including whether they went on to have another stroke and whether they experienced bleeding in the brain.

Both the early and late groups experienced a similar number of recurrent strokes. Early treatment was found to be effective and did not increase the risk of a bleed into the brain.

Professor Nick Freemantle, Senior Investigator and Director of the UCL Comprehensive Clinical Trials Unit (CCTU) that co-ordinated the OPTIMAS trial, said: “The benefits of early initiation of blood-thinning treatment are clear: patients receive the definitive and effective long-term stroke prevention therapy promptly, rather than waiting. This approach ensures that crucial treatments are not delayed or missed, particularly for patients who are discharged from the hospital.”

Study limitations

The timing for starting blood-thinning medication was based on previous trial designs (such as OPTIMAS), which may not cover all possible scenarios. Additionally, not all participants were randomised to the same timing groups, so some data was excluded. Lastly, the study didn’t include many patients with very severe strokes, so the findings might not apply to those cases.

Source: University College London

Inflammatory Cells Remain in the Blood After Treatment of Severe Asthma with Biologics

Photo by Cnordic Nordic

Biological drugs have improved the lives of many people with severe asthma. However, a new study from Karolinska Institutet shows that some immune cells with high inflammatory potential are not completely eradicated after treatment.

Biological drugs have become an important tool in the treatment of severe asthma. 

“They help most patients to keep their symptoms under control, but exactly how these drugs affect the immune system has so far remained unknown,” says Valentyna Yasinska, consultant in pulmonary medicine at Karolinska University Hospital and doctoral student at Karolinska Institutet’s Department of Medicine in Huddinge.

Increased in blood

In a new study published in the scientific journal Allergy, researchers at Karolinska Institutet have explored what happens to the immune cells of patients being treated with biologics. By analysing blood samples from 40 patients before and during treatment, they found that instead of disappearing during treatment, certain types of immune cell – which play a key part in asthma inflammation – actually increased.

“This suggests that biologics might not attack the root of the problem, no matter how much they help asthma patients during treatment,” says Jenny Mjösberg, professor of tissue immunology at Karolinska Institutet’s Department of Medicine in Huddinge. “Continued treatment might be necessary to keep the disease under control.”

Surprising finding

The study is based on data from patients with severe asthma sourced from the BIOCROSS study. The researchers used advanced methods such as flow cytometry and single-cell sequencing to determine the properties and function of the immune cells.

“We were surprised to find that blood levels of inflammatory cells increased rather than decreased,” says Lorenz Wirth, doctoral student at the same department at Karolinska Institutet. “This could explain why inflammation of the airways often returns when the treatment is tapered or discontinued. It is important that we understand the long-term immunological effects of these drugs.”

Relatively new drugs

Little is still known about the long-term effects of biologics like mepolizumab and dupilumab since they are relatively new, having been prescribed to asthmatics for less than ten years. 

The next stage of the study will be to analyse samples from patients with a long treatment history and to study lung tissue to see how the immune cells are affected in the airways.

Source: Karolinska Institutet

Bacterial Stress Drives the Development of Antibiotic Resistance

Photo by CDC on Unsplash

Antibiotics are supposed to wipe out bacteria, yet the drugs can sometimes hand microbes an unexpected advantage.

A new Nature Communications study from Rutgers Health shows that ciprofloxacin, a staple treatment for urinary tract infections, throws Escherichia coli into an energy crisis that saves many cells from death and speeds the evolution of full‑blown resistance.

“Antibiotics can actually change bacterial metabolism,” said first author Barry Li, a student at Rutgers pursuing a dual doctoral degree for physician–scientists. “We wanted to see what those changes do to the bugs’ chances of survival.”

Li and senior author Jason Yang focused on adenosine triphosphate (ATP), the molecular fuel of cells. When ATP levels crash, cells experience “bioenergetic stress.” To mimic that stress, the team engineered E. coli with genetic drains that constantly burned ATP or its cousin nicotinamide adenine dinucleotide (NADH). Then, they pitted both the engineered strains and normal bacteria against ciprofloxacin.

The results surprised the researchers. The drug and the genetic drains each slashed ATP, but rather than slowing down, the bacteria revved up. Respiration soared, and the cells spewed extra reactive‑oxygen molecules that can damage DNA. That frenzy produced two troubling outcomes.

First, more of the bacterial cells survived.

In time‑kill tests, ten times as many stressed cells survived a lethal ciprofloxacin dose compared with unstressed controls. These hardy stragglers, called persister cells, lie low until the drug is gone and then rebound to launch a new infection.

People have long blamed sluggish metabolism for persister cell formation.

“People expected a slower metabolism to cause less killing,” Li said. “We saw the opposite. The cells ramp up metabolism to refill their energy tanks, and that turns on stress responses that slow the killing.”

Follow‑up experiments traced the protection to the stringent response, a bacterial alarm system that reprograms the cell under stress.

Second, stressed cells mutated faster to evolve antibiotic resistance.

While persisters keep infections smoldering, genetic resistance can render a drug useless outright. The Rutgers group cycled E. coli through escalating ciprofloxacin doses and found that stressed cells reached the resistance threshold four rounds sooner than normal cells. DNA sequencing and classic mutation tests pointed to oxidative damage and error‑prone repair as the culprits.

“The changes in metabolism are making antibiotics work less well and helping bacteria evolve resistance,” said Yang, an assistant professor at the medical school and Chancellor Scholar of microbiology, biochemistry & molecular genetics.

Preliminary measurements show that, in addition to ciprofloxacin, gentamicin and ampicillin also drain ATP. The stress effect may span very different pathogens, including Mycobacterium tuberculosis, which is highly sensitive to ATP shocks.

If so, the discovery casts new light on a global threat. Antibiotic resistance already contributes to 1.27 million deaths a year. Strategies that ignore the metabolic fallout of treatment may be missing a key lever.

The findings suggest several changes for antibiotic development and use.

First, screen candidate antibiotics for unintended energy‑drain side effects. Second, pair existing drugs with anti‑evolution boosters that block the stress pathways or mop up the extra oxygen radicals. Third, reconsider the instinct to blast infections with the highest possible dose. Earlier studies and the new data both hint that extreme concentrations can trigger the very stress that protects bacteria.

“Bacteria turn our attack into a training camp,” Yang said. “If we can cut the power to that camp, we can keep our antibiotics working longer.”

Li and Yang are planning on testing compounds that soothe bioenergetic stress in the hope of turning the microbial energy crisis back into an Achilles’ heel rather than a shield.

Source: Rutgers University

Why Caffeine Might Hold the Key to Preventing Sudden Infant Death Syndrome

Photo by William Fortunato on Pexels

After decades of stalled national progress in reducing the rate of Sudden Unexpected Infant Death (SUID), a category of infant mortality that includes sudden infant death syndrome (SIDS), researchers at Rutgers Health have proposed an unexpected solution: Caffeine might protect babies by preventing dangerous drops in oxygen that may trigger deaths.

The hypothesis, published in the Journal of Perinatology, comes as the number of SUID cases in the US has plateaued at about 3500 deaths a year, roughly one for every 1000 live births, for 25 years. Despite an initial decline in the 1990s with the introduction of widespread education campaigns promoting “back to sleep” and other safe infant sleep recommendations by the American Academy of Pediatrics, SIDS on its own remains the leading cause of death in infants between 1 and 12 months old.

“We’ve been concerned about why the rates haven’t changed,” said Thomas Hegyi, a neonatologist at Rutgers Robert Wood Johnson Medical School who led the research. “So, we wanted to explore new ways of approaching the challenge.”

That approach led the researchers to a striking realisation: virtually all known risk factors for SIDS and other sleep-related infant deaths, from stomach sleeping to maternal smoking to bed-sharing to preterm birth, have one thing in common. They are all associated with intermittent hypoxia, brief episodes where oxygen levels drop below 80%.

“I wondered, what can counter intermittent hypoxia?” Hegyi said. “Caffeine.”

The connection isn’t entirely theoretical. Neonatologists already use caffeine to treat apnoea in premature infants, where it works as a respiratory stimulant. The drug has an excellent safety profile in babies, with minimal side effects even at high doses.

What makes caffeine particularly intriguing as a proposed preventive measure is how differently infants process it. While adults metabolise caffeine in about four hours, the half-life in newborns can be as long as 100 hours. Caffeine remains in an infant’s system for weeks, not hours.

This unique metabolism might explain a long-standing puzzle: why SIDS peaks between two and four months of age. As infants mature, they begin metabolising caffeine more quickly. The researchers suggest caffeine consumed during pregnancy or passed through breast milk might provide early protection that wanes as metabolism speeds up.
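The half-life figures above make the contrast concrete. As a back-of-the-envelope sketch (illustrative only, assuming simple first-order elimination, which is a simplification of real infant pharmacokinetics):

```python
def fraction_remaining(hours, half_life_hours):
    """First-order elimination: fraction of a dose left after `hours`."""
    return 0.5 ** (hours / half_life_hours)

# Adult: ~4 h half-life -> six half-lives in a day, essentially cleared
adult_after_24h = fraction_remaining(24, 4)
# Newborn: up to ~100 h half-life -> a week later, much still remains
newborn_after_week = fraction_remaining(7 * 24, 100)

print(f"adult, 24 hours after a dose:   {adult_after_24h:.1%}")
print(f"newborn, 7 days after a dose:   {newborn_after_week:.1%}")
```

Under these assumptions, an adult clears a dose to under 2% within a day, while a newborn still carries roughly a third of it a full week later, which is why the authors describe caffeine as remaining in an infant's system "for weeks, not hours".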

The theory also could explain why breastfeeding appears to protect against SIDS.

“We hypothesize that the protection afforded by breast milk is, in part, due to caffeine,” wrote the researchers, noting caffeine readily passes from mothers to infants through breast milk.

Barbara Ostfeld, a professor at Rutgers Robert Wood Johnson Medical School, the programme director of the SIDS Center of New Jersey and co-author of the paper, said if the theory proves true, any efforts to give infants caffeine would complement, not replace, existing risk reduction strategies.

“The idea isn’t that caffeine will replace risk-reduction behaviours,” Ostfeld said. “A baby dying from accidental suffocation, one component of SUID, is not likely to have benefited from caffeine but would have from such safe sleep practices as the elimination of pillows and other loose bedding from the infant’s sleep environment.”

The researchers plan to test their hypothesis by comparing caffeine levels in infants who died of SIDS with those who died from other causes, such as trauma or disease.

The research represents a fundamental shift in approaching SIDS prevention. While current strategies focus on eliminating environmental risks, this would be the first potential pharmaceutical intervention.

“For over 30 years, we’ve been educating New Jersey’s parents about adopting safe infant sleep practices. These efforts have contributed to our state rates being the second lowest in the US. Still, for various reasons, these proven recommendations are not universally adopted,” Ostfeld said. “This new hypothesis offers a way not just to address important risk factors but potentially intervene.”

Crucially, the researchers said this is hypothesis-generating research meant to inspire further study, not a recommendation for parents to give their babies caffeine. Any intervention would require extensive testing for safety and efficacy.

Still, in a field where progress has stagnated for decades, the possibility of a new approach offers hope.

As Hegyi put it, the goal is “to stimulate new thinking about a problem that has remained unchanged for 25 years.”

Source: Rutgers University

Boosting Apolipoprotein-M May Block Age-related Macular Degeneration

Retina showing reticular pseudodrusen. Although they can infrequently appear in individuals with no other apparent pathology, their highest rates of occurrence are in association with age-related macular degeneration (AMD), for which they hold clinical significance by being highly correlated with end-stage disease sub-types, choroidal neovascularisation and geographic atrophy. Credit: National Eye Institute

A new study from Washington University School of Medicine in St. Louis identifies a possible way to slow or block progression of age-related macular degeneration, a leading cause of blindness in people over age 50. The WashU Medicine researchers and their international collaborators implicated problems with cholesterol metabolism in this type of vision loss, perhaps helping to explain the links between macular degeneration and cardiovascular disease, which both worsen with age.

The new findings, using human plasma samples and mouse models of macular degeneration, suggest that increasing the amount of a molecule called apolipoprotein M (ApoM) in the blood fixes problems in cholesterol processing that lead to cellular damage in the eyes and other organs. Various methods of dialing up ApoM could serve as new treatment strategies for age-related macular degeneration and perhaps some forms of heart failure triggered by similar dysfunctional cholesterol processing.

The study appears June 24 in the journal Nature Communications.

“Our study points to a possible way to address a major unmet clinical need,” said senior author Rajendra S. Apte, MD, PhD, professor of ophthalmology and visual sciences at WashU Medicine. “Current therapies that reduce the chance of further vision loss are limited to only the most advanced stages of macular degeneration and do not reverse the disease. Our findings suggest that developing treatments that increase ApoM levels could treat or even prevent the disease and therefore preserve people’s vision as they age.”

In macular degeneration, doctors can see cholesterol-rich deposits under the retina during an eye exam, according to Apte. In early stages, vision might still be normal, but the deposits increase inflammation and other damaging processes that lead to the gradual loss of central vision. In the most common type, “dry” macular degeneration, the cells in the central part of the retina can be damaged, causing a type of neurodegeneration called geographic atrophy, which is similar to what happens in the brain in conditions such as Alzheimer’s disease. Dry macular degeneration can turn into “wet” macular degeneration, in which abnormal blood vessel growth damages vision.

Geographic atrophy and wet macular degeneration are advanced forms of the disease that are accompanied by vision loss. Although some approved therapies for advanced disease are available, the disease process itself is not reversible at that stage.

A common culprit in eye disease and heart failure

In recent years, evidence has emerged that ApoM can serve as a protective molecule with known anti-inflammatory effects and roles in maintaining healthy cholesterol metabolism. With that in mind, Apte and co-senior author Ali Javaheri, MD, PhD, an assistant professor of medicine, were interested in assessing whether reduced ApoM levels, which fall with age, could be involved in the dysfunctional cholesterol metabolism that is at the root of multiple diseases of aging, including macular degeneration and heart disease. They showed that patients with macular degeneration have reduced levels of ApoM circulating in the blood compared with healthy controls. And past work by Javaheri, a WashU Medicine cardiologist, showed that patients with various forms of heart failure also had reduced levels of ApoM in the blood.

This study revealed that ApoM is a key component in the “good cholesterol” pathways that mop up excess “bad” LDL cholesterol and excrete it via the liver.

Apte and Javaheri’s research suggests that when ApoM is low, cells in the retina and heart muscle can’t correctly metabolise cholesterol deposits and struggle to clear these accumulating lipids. When these lipids build up, it leads to inflammation and cellular damage.

To see if they could reverse the harmful effects of low ApoM, the researchers increased ApoM levels in mouse models of macular degeneration, using genetic modification or plasma transfer from other mice. The mice showed evidence of improved retinal health, improved function of light-sensing cells in the retina and reduced accumulation of cholesterol deposits. The researchers further found evidence that ApoM triggers a signalling pathway that breaks down the cholesterol in cellular compartments called lysosomes, which are known for playing important roles in disposing of cellular waste.

The researchers also found that ApoM must be bound to a molecule called sphingosine-1-phosphate (S1P) to get the beneficial effects of ApoM treatment in the mice.

The findings also could have implications for future interventions that raise ApoM in patients with heart failure.

“One of the exciting things about this collaboration is realising the links between retinal pigment epithelial cells and heart muscle cells, which are both vulnerable to low ApoM,” Javaheri said. “It is possible that the interaction between ApoM and S1P is regulating cholesterol metabolism in both cell types. We look forward to exploring strategies to increase ApoM in ways that could help the eye and the heart maintain healthy cholesterol metabolism over time and stave off two major diseases of aging.”

Source: WashU Medicine