Month: July 2025

People with ‘Young Brains’ Outlive ‘Old-Brained’ Peers, Research Finds

Image created with Gencraft AI

A blood-test analysis developed at Stanford Medicine can determine the “biological ages” of 11 separate organ systems in an individual’s body and predict the health consequences of accelerated ageing in each.

Besides our chronological age, research has shown that we also have what’s called a “biological age,” a cryptic but more accurate measure of our physiological condition and likelihood of developing aging-associated disorders from heart trouble to Alzheimer’s disease.

Unlike wrinkles and greying hair, the age of a person’s internal organs is difficult to determine from the outside. And those organs age at different speeds, according to a new study by Stanford Medicine investigators.

“We’ve developed a blood-based indicator of the age of your organs,” said Tony Wyss-Coray, PhD, professor of neurology and neurological sciences and director of the Knight Initiative for Brain Resilience at the Wu Tsai Neurosciences Institute. “With this indicator, we can assess the age of an organ today and predict the odds of your getting a disease associated with that organ 10 years later.”

They can even predict who is most likely to die from medical conditions associated with one or more of the 11 separate organ systems the researchers looked at: brain, muscle, heart, lung, arteries, liver, kidneys, pancreas, immune system, intestine and fat.

The biological age of one organ, the brain, plays an outsized role in determining how long you have left to live, Wyss-Coray said.

“The brain is the gatekeeper of longevity,” he said. “If you’ve got an old brain, you have an increased likelihood of mortality. If you’ve got a young brain, you’re probably going to live longer.”

Wyss-Coray is the senior author of the study, published online July 9 in Nature Medicine. The lead author is Hamilton Oh, PhD, a former graduate student in Wyss-Coray’s group.

Eleven organ systems, 3000 proteins, 45 000 people

The scientists analysed data from 44 498 randomly selected participants, ages 40 to 70, drawn from the UK Biobank, an ongoing effort that has collected multiple blood samples and updated medical reports from some 600 000 individuals over several years. The participants were monitored for up to 17 years for changes in their health status.

Wyss-Coray’s team used an advanced, commercially available laboratory technology to measure the levels of nearly 3000 proteins in each participant’s blood. Some 15% of these proteins could be traced to a single organ of origin, and many of the others to several organs.

The researchers fed participants’ blood-borne protein levels into a computer and determined the age-adjusted average level of each organ-specific protein. From this, the scientists generated an algorithm that measured how much the composite protein “signature” for each organ being assessed differed from the average for people of that age.

Based on the differences between an individual’s organ-assigned protein levels and the age-adjusted averages, the algorithm assigned a biological age to each of the 11 distinct organs or organ systems assessed. These multiprotein signatures served as proxies for each organ’s relative biological condition: a deviation of more than 1.5 standard deviations from the average, in either direction, put a person’s organ in the “extremely aged” or “extremely youthful” category.

One-third of the individuals in the study had at least one such extremely aged or extremely youthful organ, and one in four had several.
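In code, the thresholding step might look like the following; a minimal sketch (not Stanford’s actual pipeline), assuming each organ’s multiprotein signature has already been reduced to a single composite score per person:

```python
import statistics

def classify_organs(person_scores, cohort_scores, threshold=1.5):
    """Flag organs as 'extremely aged' or 'extremely youthful' when a
    person's composite protein signature deviates more than `threshold`
    standard deviations from the age-matched cohort average."""
    labels = {}
    for organ, score in person_scores.items():
        cohort = cohort_scores[organ]          # scores for same-age peers
        mean = statistics.mean(cohort)
        sd = statistics.stdev(cohort)
        z = (score - mean) / sd                # signed "age gap" in SDs
        if z > threshold:
            labels[organ] = "extremely aged"
        elif z < -threshold:
            labels[organ] = "extremely youthful"
        else:
            labels[organ] = "typical"
    return labels

# Hypothetical composite scores for one person and their age-matched cohort
cohort = {"brain": [0.1, -0.2, 0.0, 0.3, -0.1, 0.2, -0.3, 0.0]}
person = {"brain": 0.9}
print(classify_organs(person, cohort))  # → {'brain': 'extremely aged'}
```

The scores and cohort here are invented for illustration; in the study, each signature is built from the levels of hundreds of organ-assigned proteins.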

For the brain, “extremely aged” translated to being among the 6% to 7% of study participants’ brains whose protein signatures fell at one end of the biological-age distribution. “Extremely youthful” brains fell into the 6% to 7% at the opposite end.

Health outcomes foretold

The algorithm also predicted people’s future health, organ by organ, based on their current organs’ biological age. Wyss-Coray and his colleagues checked for associations between extremely aged organs and any of 15 different disorders including Alzheimer’s and Parkinson’s diseases, chronic liver or kidney disease, Type 2 diabetes, two different heart conditions and two different lung diseases, rheumatoid arthritis and osteoarthritis, and more.

Risks for several of those diseases were affected by numerous different organs’ biological age. But the strongest associations were between an individual’s biologically aged organ and the chance that this individual would develop a disease associated with that organ. For example, having an extremely aged heart predicted higher risk of atrial fibrillation or heart failure, having aged lungs predicted heightened chronic obstructive pulmonary disease (COPD) risk, and having an old brain predicted higher risk for Alzheimer’s disease.

The association between having an extremely aged brain and developing Alzheimer’s disease was particularly strong: the risk was 3.1 times that of a person with a normally ageing brain. Conversely, having an extremely youthful brain was especially protective: the risk of Alzheimer’s was barely one-fourth that of a person with a normally ageing brain.

In addition, Wyss-Coray said, brain age was the best single predictor of overall mortality. Having an extremely aged brain increased subjects’ risk of dying by 182% over a roughly 15-year period, while individuals with extremely youthful brains had an overall 40% reduction in their risk of dying over the same duration.

Predicting the disease, then preventing it

“This approach could lead to human experiments testing new longevity interventions for their effects on the biological ages of individual organs in individual people,” Wyss-Coray said.

Medical researchers may, for example, be able to use extreme brain age as a proxy for impending Alzheimer’s disease and intervene before the onset of outward symptoms, when there’s still time to arrest it, he said.

Careful collection of data on lifestyle, diet and prescribed or supplemental substances in clinical trials, combined with organ-age assessments, could throw light on how those factors contribute to the ageing of various organs, Wyss-Coray added. It could also show whether existing, approved drugs can restore organ youth before people develop a disease for which an organ’s advanced biological age puts them at high risk.

If commercialised, the test could be available in the next two to three years, Wyss-Coray said. “The cost will come down as we focus on fewer key organs, such as the brain, heart and immune system, to get more resolution and stronger links to specific diseases.”

Source: Stanford University

Study Reveals the Hidden Benefits of Weight Loss on Fat Tissue

Photo by Andres Ayrton on Pexels

Scientists have produced the first detailed characterisation of the changes that weight loss causes in human fat tissue by analysing hundreds of thousands of cells. They found a range of positive effects, including clearing out of damaged, ageing cells and increased metabolism of harmful fats.

The researchers say the findings help to better understand how weight loss leads to health improvements at a molecular level. In the future, this could help inform the development of therapies for diseases such as type 2 diabetes.

The study

The study, published in Nature, compared fat-tissue samples from healthy-weight individuals with samples from people with severe obesity (a BMI over 35) who were undergoing bariatric weight-loss surgery.

The weight loss group had fat samples taken during surgery and more than five months after surgery, at which point they had lost an average of 25kg.

Lipid recycling

The researchers, who were from the Medical Research Council (MRC) Laboratory of Medical Sciences and Imperial College London, analysed gene expression in more than 170,000 cells that made up the fat tissue samples, from 70 people.

They unexpectedly found that weight loss triggers the breakdown and recycling of fats called lipids.

This recycling process could be responsible for burning energy and reversing the harmful build-up of lipids in other organs like the liver and pancreas.

The researchers say that further study will be needed to establish if lipid recycling is linked to the positive effects of weight loss on health, such as remission of type 2 diabetes.

Senescent cells

They also found that the weight loss cleared out senescent cells, which are ageing and damaged cells that accumulate in all tissues.

The senescent cells cause harm because they no longer function properly and release signals that lead to tissue inflammation and scarring.

Immune system

In contrast, the researchers found that weight loss did not reverse the effects of obesity on certain aspects of the immune system.

They found that inflammatory immune cells, which infiltrated the fat of people with obesity, did not fully recover even after weight loss.

This type of inflammatory cell memory could be harmful in the long term if people regain weight.

Detailed map of what drives health benefits

Dr William Scott, from the MRC Laboratory of Medical Sciences and from Imperial College London, who led the study, said:

We’ve known for a long time that weight loss is one of the best ways to treat the complications of obesity, such as diabetes, but we haven’t fully understood why. This study provides a detailed map of what may actually be driving some of these health benefits at a tissue and cellular level.

Fat tissues have many underappreciated health impacts, including on blood sugar levels, body temperature, hormones that control appetite, and even reproductive health.

We hope that new information from studies like ours will start to pave the way for developing better treatments for diabetes and other health problems caused by excess body fat.

Source: UK Research and Innovation

Large-scale Review Finds that Antidepressant Withdrawal Symptoms Are Rare

The largest review of ‘gold standard’ antidepressant withdrawal studies to date has identified the type and incidence of symptoms experienced.

Photo by Towfiqu barbhuiya on Unsplash

The largest review of ‘gold standard’ antidepressant withdrawal studies to date has identified the type and incidence of symptoms experienced by people discontinuing antidepressants, finding most people do not experience severe withdrawal.

In a systematic review and meta-analysis of previous randomised controlled trials of antidepressant withdrawal, a team of researchers led by Imperial College London and King’s College London concluded that, while participants who stopped antidepressants did experience an average of one more symptom than those who continued their medication or took a placebo, this fell below the threshold for a clinically significant withdrawal syndrome. The results are out now in JAMA Psychiatry.

The most common symptoms were dizziness, nausea, vertigo and nervousness. Importantly, depression was not a symptom of withdrawal from antidepressants, and was more likely to reflect illness recurrence.

Researchers at Imperial College London, King’s College London, UCL and UK collaborators say their study provides much needed, clearer guidance for clinicians, patients and policymakers.

Dr Sameer Jauhar, lead author, at Imperial College London, said: “Our work should reassure the public because we replicated other findings, from high-quality studies, and have highlighted the clinical symptoms to look out for. Despite previous concern about stopping antidepressants, our work finds that most people do not experience severe withdrawal, in terms of additional symptoms. Importantly, depression relapse was not linked to antidepressant withdrawal in these studies, suggesting that if this does occur, people will need to see their health professional to rule out a recurrence of their depressive illness.”

Clinical academics from around the UK worked collaboratively to conduct the largest and most rigorous analysis of randomised controlled trials in antidepressant withdrawal, examining data from 50 trials across multiple conditions. The data involved a total of 17,828 participants, with an average age of 44 years, of whom 70% were female. Two meta-analyses were conducted: one of the trials that used a standardised measure known as the Discontinuation-Emergent Signs and Symptoms (DESS) scale, and one of the trials that used various other scales.

Across antidepressants, irrespective of the type taken, the number of extra symptoms generally equated to one more symptom on the 43-item scale. In placebo-controlled randomised controlled trials, the most common symptoms across antidepressants were dizziness (7.5% vs 1.8% on placebo), nausea (4.1% vs 1.5%), vertigo (2.7% vs 0.4%) and nervousness (3% vs 0.8%).

Experiencing just one symptom falls below the cutoff of four or more symptoms used to define a clinically important discontinuation syndrome.
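As an illustration of how such a cutoff is applied (a hypothetical sketch, not the trials’ analysis code), counting discontinuation-emergent symptoms against the four-symptom threshold might look like this:

```python
DESS_ITEMS = 43  # the DESS scale covers 43 candidate symptoms

def discontinuation_emergent(baseline, follow_up):
    """Symptoms present after stopping that were absent at baseline."""
    return sorted(set(follow_up) - set(baseline))

def clinically_important(baseline, follow_up, cutoff=4):
    """Flag a clinically important discontinuation syndrome at
    `cutoff` or more new symptoms (four, per the convention above)."""
    return len(discontinuation_emergent(baseline, follow_up)) >= cutoff

# Hypothetical participant: one new symptom after discontinuation
before = {"nausea"}
after = {"nausea", "dizziness"}
print(discontinuation_emergent(before, after))  # → ['dizziness']
print(clinically_important(before, after))      # → False
```

A single new symptom, the average excess found in the review, does not meet the cutoff; four or more would.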

The nature and rates of different symptoms varied between antidepressants, and some symptoms were also seen with placebo. This helped to clarify which symptoms were likely to reflect returning illness, such as relapse into depression.

The data involved different types of antidepressants, including the serotonin-norepinephrine reuptake inhibitors (SNRIs) venlafaxine and duloxetine; the selective serotonin reuptake inhibitors (SSRIs) escitalopram, sertraline and paroxetine; agomelatine, a melatonin receptor agonist and selective serotonin receptor antagonist; and vortioxetine, which inhibits serotonin reuptake and has partial agonist and antagonist effects on various serotonin receptors.

The most symptoms were seen on discontinuation of venlafaxine, with approximately 20% of people experiencing dizziness, compared with 1.8% of those taking placebo. With vortioxetine, fewer than one extra symptom was seen on the standardised discontinuation scale. No extra symptoms were seen with agomelatine.

Adding non-placebo-controlled studies increased these rates slightly: dizziness (11.8%), nightmares (8.1%), nervousness (7.6%) and nausea (5.8%).

Relapse of depression was not seen in those withdrawing from antidepressants, even in people with existing depression.

The review included studies with different discontinuation regimes, but in the majority of studies (44), people either discontinued abruptly or tapered over 1 week.

Michail Kalfas, of the Institute of Psychiatry, Psychology & Neuroscience at King’s College London, said: “While uncommon, our study highlights that there could be a sub-group of people who develop more severe withdrawal symptoms than the wider population of antidepressant users. Our focus must now turn to look at the pharmacological basis for this reaction, and ask whether it relates to the way they metabolise these drugs.”

In terms of study limitations, 38 of the trials followed people up for up to two weeks post-discontinuation (the time period one would expect most discontinuation symptoms to occur), so researchers say this limits long-term conclusions. However, they note that findings from the 2021 UCL-led ANTLER trial involving long-term antidepressant users – which was included in this review – suggested severe withdrawal is infrequent, even after prolonged use.

The study follows recent concerns about the effects of stopping antidepressants, as well as various guidance changes on their prescribing. This current meta-analysis helps resolve the debate by showing that withdrawal is a real and drug-specific phenomenon, though not an inevitable outcome.

Source: Imperial College London

Researchers Use Fitbits to Predict Children’s Surgery Complications

Photo by Natanael Melchor on Unsplash

Although postoperative complications, such as infections, can pose significant health risks to children after undergoing surgical procedures, timely detection following hospital discharge can prove challenging.

A new study from Northwestern University, along with other institutions, is the first to use consumer wearables to quickly and precisely predict postoperative complications in children and shows potential for facilitating faster treatment and care. The study appears in Science Advances.

“Today, consumer wearables are ubiquitous, with many of us relying on them to count our steps, measure our sleep and more,” said senior author Arun Jayaraman, professor at Northwestern University Feinberg School of Medicine and a scientist at Shirley Ryan AbilityLab. “Our study is the first to take this widely available technology and train the algorithm using new metrics that are more sensitive in detecting complications. Our results suggest great promise for better patient outcomes and have broad implications for paediatric health monitoring across various care settings.”

How the study worked

As part of the study, commercially available Fitbit devices were given to 103 children for 21 days immediately after appendectomy, the most common surgery in children, which results in complications up to 38% of the time. Rather than just using the metrics automatically captured by the Fitbit to identify signs of complications (eg, low activity, high heart rate, etc.), Shirley Ryan AbilityLab scientists trained the algorithm using new metrics related to the circadian rhythms of a child’s activity and heart rate patterns. 

In the process, they found such metrics were more sensitive to picking up complications than the traditional metrics. In fact, in analysing the data, scientists were able to retrospectively predict postoperative complications up to three days before formal diagnosis with 91% sensitivity and 74% specificity. 
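Sensitivity and specificity summarise how the algorithm’s alerts compare with confirmed diagnoses: sensitivity is the share of real complications flagged, specificity the share of uncomplicated recoveries left unflagged. A minimal sketch with made-up data (not the study’s model or dataset):

```python
def sensitivity_specificity(predictions, outcomes):
    """Compute sensitivity (true-positive rate) and specificity
    (true-negative rate) from paired binary predictions and outcomes."""
    tp = sum(p and o for p, o in zip(predictions, outcomes))          # flagged, complicated
    tn = sum(not p and not o for p, o in zip(predictions, outcomes))  # not flagged, fine
    fn = sum(not p and o for p, o in zip(predictions, outcomes))      # missed complication
    fp = sum(p and not o for p, o in zip(predictions, outcomes))      # false alarm
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical alert flags vs confirmed complications for eight children
pred = [True, True, False, True, False, False, False, True]
truth = [True, True, False, False, False, False, True, False]
sens, spec = sensitivity_specificity(pred, truth)
print(f"sensitivity={sens:.2f}, specificity={spec:.2f}")  # → sensitivity=0.67, specificity=0.60
```

The study’s reported figures (91% sensitivity, 74% specificity) come from the same kind of comparison, computed over its 103 participants.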

“Historically, we have been reliant upon subjective reporting from children – who often have greater difficulty articulating their symptoms – and their caregivers following hospital discharge. As a result, complications are not always caught right away,” said study author Dr Fizan Abdullah, who at the time of the study was an attending physician of paediatric surgery at Ann & Robert H. Lurie Children’s Hospital of Chicago and a professor at Feinberg. “By using widely available wearables, coupled with this novel algorithm, we have an opportunity to change the paradigm of postoperative monitoring and care – and improve outcomes for kids in the process.”

What’s next?

This research is part of a four-year National Institutes of Health-funded project. As a next step, scientists plan to transition this approach into a real-time (vs retrospective) system that analyses data automatically and sends alerts to children’s clinical teams. 

“This study reinforces wearables’ potential to complement clinical care for better patient recoveries,” said Hassan M.K. Ghomrawi, vice chair of research and innovation in the department of orthopaedic surgery at University of Alabama at Birmingham. “Our team is eager to enter the next phase of research exploration.”

Source: Northwestern University

From Injury to Agony: The Brain Pathway that Turns Pain into Suffering

Salk scientists uncover a key neural circuit in mice that gives pain its emotional punch, opening new doors for treating fibromyalgia, migraine, and PTSD


CGRP-expressing neurons (green) in the parvocellular subparafascicular nucleus (SPFp) of the thalamus.
Credit: Salk Institute

More than just a physical sensation, pain also carries emotional weight. That distress, anguish, and anxiety can turn a fleeting injury into long-term suffering.

Salk Institute researchers have now identified a brain circuit that gives physical pain its emotional tone, revealing a new potential target for treating chronic and affective pain conditions such as fibromyalgia, migraine, and post-traumatic stress disorder (PTSD).

Published in PNAS, the study identifies a group of neurons in a central brain area called the thalamus that appears to mediate the emotional or affective side of pain in mice. This new pathway challenges the textbook understanding of how pain is processed in the brain and body.

“For decades, the prevailing view was that the brain processes sensory and emotional aspects of pain through separate pathways,” says senior author Sung Han, associate professor and holder of the Pioneer Fund Developmental Chair at Salk. “But there’s been debate about whether the sensory pain pathway might also contribute to the emotional side of pain. Our study provides strong evidence that a branch of the sensory pain pathway directly mediates the affective experience of pain.”

The physical sensation of pain allows immediate detection, assessment of its intensity, and identification of its source. The affective part of pain is what makes it so unpleasant – the emotional discomfort motivates avoidance.

This is a critical distinction. Most people start to perceive pain at the same stimulus intensities, meaning the sensory side of pain is processed similarly. But the ability to tolerate pain varies greatly. The degree of suffering or feeling threatened by pain is determined by affective processing, and if that becomes too sensitive or lasts too long, it can result in a pain disorder. This makes it important to understand which parts of the brain control these different dimensions of pain.

Sensory pain was thought to be mediated by the spinothalamic tract, a pathway that sends pain signals from the spinal cord to the thalamus, which then relays them to sensory processing areas across the brain.

Affective pain was generally thought to be mediated by a second pathway called the spinoparabrachial tract, which sends pain information from the spinal cord into the brainstem.

However, previous studies using older research methods have suggested the circuitry of pain may be more complex. This long-standing debate inspired Han and his team to revisit the question with modern research tools.

Using advanced techniques to manipulate the activity of specific brain cells, the researchers discovered a new spinothalamic pathway in mice. In this circuit, pain signals are sent from the spinal cord into a different part of the thalamus, which has connections to the amygdala, the brain’s emotional processing center. This particular group of neurons in the thalamus can be identified by their expression of CGRP (calcitonin gene-related peptide), a neuropeptide originally discovered in Professor Ronald Evans’ lab at Salk.

When the researchers “turned off” (genetically silenced) these CGRP neurons, the mice still reacted to mild pain stimuli, such as heat or pressure, indicating their sensory processing was intact. However, they didn’t seem to associate lasting negative feelings with these situations, failing to show any learned fear or avoidance behaviors in future trials. On the other hand, when these same neurons were “turned on” (optogenetically activated), the mice showed clear signs of distress and learned to avoid that area, even when no pain stimuli had been used.

“Pain processing is not just about nerves detecting pain; it’s about the brain deciding how much that pain matters,” says first author Sukjae Kang, a senior research associate in Han’s lab. “Understanding the biology behind these two distinct processes will help us find treatments for the kinds of pain that don’t respond to traditional drugs.”

Many chronic pain conditions—such as fibromyalgia and migraine—involve long, intense, unpleasant experiences of pain, often without a clear physical source or injury. Some patients also report extreme sensitivity to ordinary stimuli like light, sound, or touch, which others would not perceive as painful.

Han says overactivation of the CGRP spinothalamic pathway may contribute to these conditions by making the brain misinterpret or overreact to sensory inputs. In fact, transcriptomic analysis of the CGRP neurons showed that they express many of the genes associated with migraine and other pain disorders.

Notably, several CGRP blockers are already being used to treat migraines. This study may help explain why these medications work and could inspire new nonaddictive treatments for affective pain disorders.

Han also sees potential relevance for psychiatric conditions that involve heightened threat perception, such as PTSD. Growing evidence from his lab suggests that the CGRP affective pain pathway acts as part of the brain’s broader alarm system, detecting and responding to not only pain but a wide range of unpleasant sensations. Quieting this pathway with CGRP blockers could offer a new approach to easing fear, avoidance, and hypervigilance in trauma-related disorders.

Importantly, the relationship between the CGRP pathway and the psychological pain associated with social experiences like grief, loneliness, and heartbreak remains unclear and requires further study.

“Our discovery of the CGRP affective pain pathway gives us a molecular and circuit-level explanation for the difference between detecting physical pain and suffering from it,” says Han. “We’re excited to continue exploring this pathway and enabling future therapies that can reduce this suffering.”

Source: Salk Institute

Major Study Identifies Four Biologically Distinct Subtypes of Autism

Photo by Peter Burdon on Unsplash

Researchers at Princeton University and the Simons Foundation have identified four clinically and biologically distinct subtypes of autism, marking a transformative step in understanding the condition’s genetic underpinnings and potential for personalised care.

Analysing data from over 5000 children in SPARK, an autism cohort study funded by the Simons Foundation, the researchers used a computational model to group individuals based on their combinations of traits. The team used a “person-centred” approach that considered a broad range of over 230 traits in each individual, from social interactions to repetitive behaviours to developmental milestones, rather than searching for genetic links to single traits. 

This approach enabled the discovery of clinically relevant autism subtypes, which the researchers linked to distinct genetic profiles and developmental trajectories, offering new insights into the biology underlying autism. Their results were published July 9 in Nature Genetics.
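The “person-centred” idea, clustering children by whole trait profiles rather than by single traits, can be illustrated with a toy example. This sketch uses plain k-means over a few made-up trait scores as a simplified stand-in for the study’s model-based clustering of 230+ traits:

```python
import random

def kmeans(points, k, iters=50, seed=0):
    """Plain k-means: group trait vectors by nearest centroid."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            # assign each child to the closest centroid (squared distance)
            idx = min(range(k),
                      key=lambda i: sum((a - b) ** 2
                                        for a, b in zip(p, centroids[i])))
            clusters[idx].append(p)
        # move each centroid to the mean of its assigned profiles
        centroids = [
            tuple(sum(dim) / len(c) for dim in zip(*c)) if c else centroids[i]
            for i, c in enumerate(clusters)
        ]
    return centroids, clusters

# Hypothetical trait vectors: (social challenges, repetitive behaviours,
# developmental delay), each scored 0-1; stand-ins for the study's traits
children = [
    (0.9, 0.8, 0.1), (0.85, 0.9, 0.2),   # high core traits, no delay
    (0.5, 0.4, 0.9), (0.6, 0.5, 0.95),   # marked developmental delay
]
centroids, clusters = kmeans(children, k=2)
print([len(c) for c in clusters])  # → [2, 2]
```

Grouping on the whole vector separates the no-delay pair from the delayed pair even though no single trait is used as the dividing line, which is the intuition behind the subtype discovery.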

“Understanding the genetics of autism is essential for revealing the biological mechanisms that contribute to the condition, enabling earlier and more accurate diagnosis, and guiding personalised care,” said senior study author Olga Troyanskaya, director of Princeton Precision Health, professor of computer science and the Lewis-Sigler Institute for Integrative Genomics at Princeton, and deputy director for genomics at the Center for Computational Biology of the Simons Foundation’s Flatiron Institute.

The study defines four subtypes of autism: Social and Behavioural Challenges, Mixed ASD with Developmental Delay, Moderate Challenges, and Broadly Affected. Each subtype exhibits distinct developmental, medical, behavioural and psychiatric traits, and importantly, different patterns of genetic variation. 

  • Individuals in the Social and Behavioural Challenges group show core autism traits, including social challenges and repetitive behaviours, but generally reach developmental milestones at a pace similar to children without autism.  They also often experience co-occurring conditions like ADHD, anxiety, depression or obsessive-compulsive disorder alongside autism. One of the larger groups, this constitutes around 37% of the participants in the study.
  • The Mixed ASD with Developmental Delay group tends to reach developmental milestones, such as walking and talking, later than children without autism, but usually does not show signs of anxiety, depression or disruptive behaviours. “Mixed” refers to differences within this group with respect to repetitive behaviours and social challenges. This group represents approximately 19% of the participants.
  • Individuals with Moderate Challenges show core autism-related behaviours, but less strongly than those in the other groups, and usually reach developmental milestones on a similar track to those without autism. They generally do not experience co-occurring psychiatric conditions. Roughly 34% of participants fall into this category.
  • The Broadly Affected group faces more extreme and wide-ranging challenges, including developmental delays, social and communication difficulties, repetitive behaviours and co-occurring psychiatric conditions like anxiety, depression and mood dysregulation. This is the smallest group, accounting for around 10% of the participants.

“These findings are powerful because the classes represent different clinical presentations and outcomes, and critically we were able to connect them to distinct underlying biology,” said Aviya Litman, a PhD student at Princeton. 

Distinct genetics behind the subtypes

For decades, autism researchers and clinicians have been seeking robust definitions of autism subtypes to aid in diagnosis and care. Autism is known to be highly heritable, with many implicated genes. 

“While genetic testing is already part of the standard of care for people diagnosed with autism, thus far, this testing reveals variants that explain the autism of only about 20% of patients,” said co-author Jennifer Foss-Feig, a clinical psychologist at the Icahn School of Medicine at Mount Sinai and vice president and senior scientific officer at the Simons Foundation Autism Research Initiative (SFARI). This study takes an approach that differs from classic gene discovery efforts by identifying robust autism subtypes that are linked to distinct types of genetic mutations and affected biological pathways.

For example, children in the Broadly Affected group showed the highest proportion of damaging de novo mutations, while only the Mixed ASD with Developmental Delay group was more likely to carry rare inherited genetic variants. While children in both of these subtypes share some important traits like developmental delays and intellectual disability, these genetic differences suggest distinct mechanisms behind superficially similar clinical presentations. 

“These findings point to specific hypotheses linking various pathways to different presentations of autism,” said Litman, referring to differences in biology between children with different autism subtypes.

Moreover, the researchers identified divergent biological processes affected in each subtype. “What we’re seeing is not just one biological story of autism, but multiple distinct narratives,” said Natalie Sauerwald, associate research scientist at the Flatiron Institute and co-lead author. “This helps explain why past genetic studies often fell short – it was like trying to solve a jigsaw puzzle without realising we were actually looking at multiple different puzzles mixed together. We couldn’t see the full picture, the genetic patterns, until we first separated individuals into subtypes.”

Autism biology unfolds on different timelines

The team also found that autism subtypes differ in the timing of genetic disruptions’ effects on brain development. Genes switch on and off at specific times, guiding different stages of development. While much of the genetic impact of autism was thought to occur before birth, in the Social and Behavioural Challenges subtype – which typically has substantial social and psychiatric challenges, no developmental delays, and a later diagnosis – mutations were found in genes that become active later in childhood. This suggests that, for these children, the biological mechanisms of autism may emerge after birth, aligning with their later clinical presentation.

“By integrating genetic and clinical data at scale, we can now begin to map the trajectory of autism from biological mechanisms to clinical presentation,” said co-author Chandra Theesfeld, senior academic research manager at the Lewis-Sigler Institute and Princeton Precision Health.

A paradigm shift for autism research

This study builds on more than a decade of autism genomics research led by Troyanskaya and collaborators. It is enabled by the close integration of interdisciplinary expertise in genomics, clinical psychology, molecular biology, computer science and modelling, and computational biology.

“The Princeton Precision Health initiative uses artificial intelligence and computational modelling to integrate across biological and clinical data,” said Jennifer Rexford, Princeton University provost and Gordon Y.S. Wu Professor in Engineering. “This initiative could not exist without the University’s charitable endowment. Our investments allow experts to collaborate across a range of disciplines to conduct transformative research that improves human health, including the potential for major advances in the diagnosis and treatment of autism made possible in this exciting project.” 

“It’s a whole new paradigm, to provide these groups as a starting point for investigating the genetics of autism,” said Theesfeld. Instead of searching for a biological explanation that encompasses all individuals with autism, researchers can now investigate the distinct genetic and biological processes driving each subtype.

This shift could reshape both autism research and clinical care – helping clinicians anticipate different trajectories in diagnosis, development and treatment. “The ability to define biologically meaningful autism subtypes is foundational to realising the vision of precision medicine for neurodevelopmental conditions,” said Sauerwald.

While the current work defines four subtypes, “this doesn’t mean there are only four classes,” said Litman. “It means we now have a data-driven framework that shows there are at least four – and that they are meaningful in both the clinic and the genome.”

Looking ahead

Beyond its contributions to understanding autism subtypes and their underlying biology, the study offers a powerful framework for characterising other complex, heterogeneous conditions and finding clinically relevant disease subtypes. As Theesfeld put it: “This opens the door to countless new scientific and clinical discoveries.”

Source: Princeton University

Adaptive Spine Board Could Revolutionise ER Transport

ASB overlay is divided into five distinct sections—head and neck, upper trunk, buttocks and pelvis, thighs, and feet and heels

In combat zones and emergency rescues, rapid evacuation and treatment can mean the difference between life and death. But prolonged immobilisation during transport poses another life-threatening risk: pressure injuries.

A newly developed adaptive spine board (ASB) overlay aims to change that, offering an innovative solution to prevent pressure injuries and dramatically improve patient outcomes. Developed by researchers at The University of Texas at Arlington and UT Southwestern Medical School, the adaptive spine board sits atop a standard stretcher or spine board, using air-cell technology to redistribute pressure more effectively than traditional evacuation surfaces. The team’s newly published study in the Journal of Rehabilitation and Assistive Technologies Engineering shows the ASB outperforms other immobilisation options.

“The ability to dynamically adjust pressure so that no vulnerable body regions experience excessive weight is a breakthrough for medical evacuation,” said Muthu B.J. Wijesundara, principal research scientist at the University of Texas at Arlington Research Institute. “This innovation could set a new standard in casualty transport protocols.”

Also called bedsores or pressure ulcers, pressure injuries result from prolonged pressure on the skin and underlying soft tissue, leading to cell death, tissue breakdown and open wounds. They are a constant risk for trauma patients during long-range transport, which sometimes lasts more than 16 hours. Research shows that more than 50% of casualties transported during the Iraq War developed pressure injuries before reaching a hospital.

While some existing technologies, such as vacuum spine boards, can help redistribute pressure, their effectiveness is limited. Many conventional supports fail to keep pressure below the thresholds recommended to prevent injury. Military stretchers and pads have been shown to create high-pressure points on vulnerable areas of the body, including the back of the head, base of the spine, buttocks and heels.

“Beyond military use, the ASB overlay could prove valuable in civilian medical transport, particularly for spinal injury patients who are at high risk for pressure ulcers,” Dr Wijesundara said. “The research also highlights potential applications in other environments where prolonged immobilisation is necessary, such as disaster relief and space exploration.”

The ASB overlay features a multi-segmented air-cell design that targets pressure-prone areas more effectively than previous solutions. It is divided into five distinct sections—head and neck, upper trunk, buttocks and pelvis, thighs, and feet and heels—each equipped with sensor-driven pressure modulation for responsive, localised support.

“One key innovation is the system’s ability to autonomously adjust the air-cell pressure to maintain optimal distribution for each patient,” Wijesundara said. “We developed an algorithm that compensates for environmental variables, such as temperature and barometric pressure changes, ensuring consistent performance across varying conditions. Testing showed that the ASB overlay outperformed typical equipment used in casualty transport.”
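
The team's actual control algorithm is not published in this article, but the behaviour described—per-cell sensor feedback plus compensation for temperature and barometric drift—can be sketched. Everything below is illustrative, not the ASB's real implementation: the function names are invented, and the 32 mmHg threshold is borrowed from the commonly cited capillary closing pressure rather than from the study.

```python
# Illustrative sketch of a sensor-driven air-cell controller; the
# threshold, deadband and function names are assumptions, not the
# ASB team's published algorithm.

CAPILLARY_CLOSING_MMHG = 32.0  # commonly cited interface-pressure limit

def compensated_setpoint(setpoint_gauge, baro_ref, baro_now, temp_ref_k, temp_now_k):
    """Correct a gauge-pressure setpoint (mmHg) for ambient drift.

    A sealed cell's absolute pressure scales with absolute temperature
    (ideal gas law at fixed volume), and a gauge sensor reads absolute
    minus barometric pressure, so both must be compensated to hold the
    interface pressure on the patient constant.
    """
    absolute_ref = setpoint_gauge + baro_ref
    absolute_now = absolute_ref * (temp_now_k / temp_ref_k)
    return absolute_now - baro_now

def adjust_cell(interface_mmhg, setpoint_mmhg, deadband=2.0):
    """Return the action for one air cell given its pressure sensor."""
    if interface_mmhg > setpoint_mmhg + deadband:
        return "vent"      # excessive load on this body region
    if interface_mmhg < setpoint_mmhg - deadband:
        return "inflate"   # cell too soft to support the region
    return "hold"
```

The compensation step matters during aeromedical evacuation: as cabin pressure falls, a sealed cell's gauge pressure rises even though nothing has changed at the patient's skin, so an uncorrected controller would vent cells that are actually fine.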

For critically injured patients, pressure injuries can significantly complicate treatment and recovery, leading to longer hospital stays, higher infection risks and additional surgeries. They’re also costly. The Agency for Healthcare Research and Quality (AHRQ) estimates that pressure injuries in the US can cost up to $151 700 per case and add $11.6 billion in health care expenses annually. Alarmingly, the AHRQ also reports that approximately 60 000 patients die each year because of pressure injuries. The ASB overlay’s advanced pressure modulation could help mitigate these risks—especially for patients who cannot be repositioned during extended transport.

The research team is now planning additional studies to improve the device’s usability in real-world conditions. As the military increasingly relies on prolonged aeromedical evacuation, such advancements are critical for enhancing patient care in conflict zones.

Everything We Thought About Running Injury Development Was Wrong, Study Shows

Photo by Barbara Olsen on Pexels

A new study from Aarhus University turns our understanding of how running injuries occur upside down. The research project, published in The BMJ, is the largest of its kind ever conducted and involves over 5000 participants. It shows that running-related overuse injuries do not develop gradually over time, as previously assumed, but rather suddenly – often during a single training session.

“Our study marks a paradigm shift in understanding the causes of running-related overuse injuries. We previously believed that injuries develop gradually over time, but it turns out that many injuries occur because runners make training errors in a single training session,” explains Associate Professor Rasmus Ø. Nielsen from the Department of Public Health at Aarhus University, who is the lead author of the study.

The study followed 5205 runners from 87 countries over 18 months and shows that injury risk increases exponentially when runners increase their distance in a single training session compared to their longest run in the past 30 days. The longer the run becomes, the higher the injury risk.
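
The paper's exact risk model is not reproduced here, but the quantity at the heart of the finding—how far a single session exceeds the longest run of the previous 30 days—is simple to compute. A minimal illustration (the function name and structure are assumptions, not the study's code):

```python
from datetime import date, timedelta

def session_spike_ratio(history, today, distance_km, window_days=30):
    """Ratio of today's run to the longest run in the prior window.

    `history` maps date -> distance in km. A ratio above 1.0 means the
    session exceeds every run of the past 30 days -- the kind of
    single-session jump the study links to sharply rising injury risk.
    """
    window_start = today - timedelta(days=window_days)
    prior = [km for day, km in history.items() if window_start <= day < today]
    if not prior:
        return float("inf")  # no baseline run to compare against
    return distance_km / max(prior)
```

For example, a runner whose longest recent run is 12 km who sets out on a 15 km session has a ratio of 1.25, i.e. a 25% single-session jump over their established baseline.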

Incorrect guidance for millions of runners

According to Rasmus Ø. Nielsen, the results cast critical light on how the tech industry has implemented so-called “evidence.” Millions of sports watches worldwide are equipped with software that guides runners about their training – both for training optimisation and injury prevention.

However, the algorithm used for injury prevention is built on very thin scientific grounds, according to Rasmus Ø. Nielsen.

“This concretely means that millions of runners receive incorrect guidance from their sports watches every day. They think they are following a scientific method to avoid injuries, but in reality they are using an algorithm that cannot predict injury risk at all,” he says.

Non-existing evidence behind guidance

The current algorithm, called the “Acute:Chronic Workload Ratio” (ACWR), was introduced in 2016 and is now built into equipment from companies that produce sports watches; organisations and clinicians, such as physiotherapists, also use it.

The ACWR algorithm calculates the ratio between acute load (last week’s training) and chronic load (average of the past 3 weeks). The algorithm recommends a maximum 20% increase in training load to minimise injury risk.
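
As described, the ACWR reduces to a few lines of arithmetic. A sketch under the article's wording—acute load as the last 7 days, chronic load as the mean weekly load of the preceding three weeks—noting that real implementations vary (four-week chronic windows are also common):

```python
def acwr(daily_loads):
    """Acute:Chronic Workload Ratio from a list of daily training
    loads, most recent day last.

    Acute load: total of the last 7 days. Chronic load: mean weekly
    load over the preceding three weeks, per the article's wording
    (other implementations use a four-week chronic window).
    """
    if len(daily_loads) < 28:
        raise ValueError("need at least 28 days of history")
    acute = sum(daily_loads[-7:])
    chronic = sum(daily_loads[-28:-7]) / 3.0  # mean weekly load
    return acute / chronic

def within_acwr_guideline(ratio, max_increase=0.20):
    """The guidance built into many watches: keep the weekly
    increase at or below 20%, i.e. ACWR <= 1.2."""
    return ratio <= 1.0 + max_increase
```

A runner averaging 70 load units per week who suddenly logs 105 in the latest week has an ACWR of 1.5, well past the 20% guideline—yet, per the Aarhus findings, this weekly aggregate can entirely miss a dangerous jump concentrated in one session.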

According to Rasmus Ø. Nielsen, the algorithm was originally developed for team sports and was based on a study with just 28 participants. Given that small sample, combined with data manipulation, the evidence base for using the algorithm to prevent running injuries is, he argues, “non-existent.”

Real-time guidance

The research team has therefore worked for the past eight years to develop a new algorithm that will be much better at preventing injuries for runners.

Rasmus Ø. Nielsen emphasises that he and the other researchers behind the study have no commercial interests in launching a new algorithm as a potential replacement for a method he himself criticises.

The algorithm will be made freely available to runners, companies, clinicians and organisations who can use it actively to guide training and injury prevention.

Rasmus Ø. Nielsen hopes that the new insights will be implemented in existing technology.

“I imagine, for example, that sports watches with our algorithm will be able to guide runners in real time during a run and sound an alarm if they run a distance where injury risk is high – like a traffic light that shows green if injury risk is low, yellow if it is rising, and red when it becomes high,” explains Rasmus Ø. Nielsen.

Source: Aarhus University

Student Designs a Prostate Checking Device to Replace the Digital Exam

PRO check, designed by Loughborough University student Devon Tyso.

A Loughborough University student has developed a new medical device that could transform how prostate health is assessed and monitored.

Devon Tyso, a Product Design and Technology student, has designed ‘PRO check’, an innovative tool intended to replace the traditional digital rectal examination (DRE), in which a doctor manually assesses the prostate with a finger.

According to Devon, the current approach is heavily reliant on a clinician’s subjective judgement and experience, and many see the method as ‘intrusive’.

“As one in seven men will get prostate cancer, it’s vital to detect abnormalities early and track changes over time,” said Devon, “The current examination method involves a lot of guesswork.

“PRO check provides objective, measurable data and allows prostate health to be visualised – enabling more accurate diagnosis, and improved long-term monitoring.

“Having a device conduct the exam may also feel less invasive, which may encourage more men to get checked, potentially catching issues earlier.”

How the device works

Designed for use by GPs during routine prostate assessments, PRO check allows doctors to evaluate the size and texture of the prostate — two key indicators of potential health issues — in a more objective and consistent way than the traditional digital rectal examination.

The device is a handheld probe, and it is covered with a condom before being inserted into the body. Once in position, the condom inflates to different pressures, pressing against the surface of the prostate, causing it to compress. A laser grid is projected onto the inner surface of the condom so the shape of the underlying prostate can be captured.

Stereoscopic cameras capture images of the laser grid, tracking where the gridlines intersect and how these intersections shift as pressure changes. This information is then fed into mathematical equations to create 3D images — or ‘topographical representations’ — that reveal the prostate’s shape and surface structure under different pressures.
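
The device's actual optics and equations are not given in the article, but the underlying structured-light principle is standard: by similar triangles, a surface bump displaces each projected gridline intersection in the camera image by an amount proportional to its height. A simplified sketch with purely illustrative calibration constants (real systems calibrate per pixel):

```python
def height_from_shift(pixel_shift, mm_per_pixel, baseline_mm, standoff_mm):
    """Recover surface height from a gridline-intersection shift.

    With the projector and camera separated by `baseline_mm` and the
    surface roughly `standoff_mm` away, similar triangles give
    height ~= shift * standoff / baseline. These constants stand in
    for a proper per-pixel calibration.
    """
    return pixel_shift * mm_per_pixel * standoff_mm / baseline_mm

def surface_map(shift_rows, mm_per_pixel=0.05, baseline_mm=10.0, standoff_mm=40.0):
    """Turn a grid of intersection shifts (in pixels) into a
    topographical height map in millimetres."""
    return [[height_from_shift(s, mm_per_pixel, baseline_mm, standoff_mm)
             for s in row] for row in shift_rows]
```

Repeating the reconstruction at each inflation pressure yields the stack of 3D surfaces the article describes, from which local stiffness can then be compared across the gland.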

Studying the prostate’s surface details could help clinicians identify areas requiring further investigation: healthy prostate tissue is typically soft and compressible, so regions that appear stiff or resist pressure could indicate potential abnormalities.

The device can also produce data on prostate volume – one of the measurements used to calculate prostate-specific antigen (PSA) density, which helps assess prostate cancer risk. Devon says currently volume estimates are often based on a clinician’s best judgement.

In addition, data from PRO check can be used to generate a compressibility-versus-pressure graph – a novel data type not currently available in clinical practice. This graph shows how the prostate compresses at different pressure levels, which Devon hopes could offer new insights into prostate health and complement existing diagnostic tools.
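
Such a graph can be derived from little more than paired pressure and volume readings. A hypothetical sketch—the device's real processing pipeline is not described in the article:

```python
def compressibility_curve(volume_at_pressure):
    """Build compressibility-versus-pressure data from hypothetical
    (pressure, volume) readings taken as the condom inflates.

    Compressibility is expressed here as the fractional volume
    reduction relative to the lowest-pressure baseline; stiff --
    potentially abnormal -- tissue compresses less at a given
    pressure than healthy, soft tissue.
    """
    readings = sorted(volume_at_pressure.items())
    _, baseline = readings[0]
    return [(p, (baseline - v) / baseline) for p, v in readings]
```

Plotting the returned pairs gives the compressibility-versus-pressure graph: a curve that flattens early, at low compression, would flag unusually stiff tissue.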

PRO check is designed to integrate with artificial intelligence, enabling automatic extraction of video data, real-time calculations, and the generation of 3D images for live display on a laptop or tablet during the examination.

The idea is that all examination data from PRO check would be stored on the patient’s records, helping to build a personalised prostate health profile that can be tracked and monitored over time.

Inspiration

Devon’s inspiration for PRO check came from a mix of personal experience – after his grandfather’s prostate cancer diagnosis – and unexpected technical research.

“It really hit home how common prostate issues are after my family member was found to have an enlarged prostate,” said the 22-year-old from Cardiff, “I realised nearly everyone I spoke to about it knew someone affected by it.

“When I started looking into prostate examinations, I kept thinking ‘how can a doctor remember what your prostate felt like four months ago?’ and how horrible it must be just to be told whether you’re fine or not without seeing any data or anything visual.”

While researching non-invasive ways to assess tissue structure inside the body, Devon came across a technique used by NASA to map the surface of asteroids — projecting laser grids onto them, capturing images with satellite-mounted cameras, and analysing the gridline intersections to reveal the contours of the surface.

“I saw that NASA were mapping surface heights on a massive scale, and I thought – if they can do that in space, why can’t we use similar principles to examine something here on Earth?” said Devon, “I’ve basically used the exact same technique and scaled it down for PRO check.”

Prototypes

Devon designed PRO check as part of his final year project – which was exhibited at the School of Design and Creative Arts’ 2025 Degree Show – and has prototyped several of its key components.

He has built and tested two working prototypes. The first demonstrates how a laser grid and camera can be set up to map the surface of the prostate.

Devon designed a custom rig that enabled him to capture images of a laser grid projected onto different silicone prostate models — representing a healthy gland, a small tumour, a large tumour, and an enlarged prostate — from an optimal angle using a smartphone camera.

PRO check prototype one demonstrated how laser gridlines and a camera can be used to image the surface of the prostate.

The second prototype features electronics that inflate a small balloon at controlled pressures, regulated by a pressure-sensing chip. Devon consulted three healthcare professionals to measure the pressure typically applied during prostate exams and replicated those levels in his design.

Devon tested the prototype using the silicone prostate models, encasing them in a sponge disc to simulate surrounding tissue.

Devon manually extracted data on the gridline intersections from the camera footage and applied mathematical equations to generate 3D images of the prostate surfaces and surrounding tissue under different pressures.

Next steps

Devon hopes to collaborate with medical professionals and product developers to turn PRO check into a fully realised medical device.

When speaking about his ultimate goal, Devon said: “I’d love to see this used in GP surgeries across the UK one day.

“With early detection being so critical, anything that helps men get checked sooner and more comfortably – and provides reliable data and visualisations – has huge potential. I really believe this could make a difference.”

Further information on PRO check can be found on the Degree Show website.

Source: Loughborough University

Hearing Devices Increase Social Connectedness and Reduce Mortality

Photo by Brett Sayles

Hearing loss doesn’t just affect how people hear the world — it can also change how they connect with it. New research from the University of Southern California, published in JAMA Otolaryngology – Head & Neck Surgery, is the first to link hearing aids and cochlear implants, surgically implanted devices that help those with profound hearing loss perceive sound, to improved social lives among adults with hearing loss. 

“We found that adults with hearing loss who used hearing aids or cochlear implants were more socially engaged and felt less isolated compared to those who didn’t use them,” said lead researcher Janet Choi, MD, MPH, an otolaryngologist with Keck Medicine at USC. “This suggests that hearing devices may help prevent the social disconnection and broader health consequences that can follow untreated hearing loss.” 

Hearing loss affects an estimated 40 million American adults, yet many go untreated. When left unaddressed, hearing loss can make communication difficult, leading people to withdraw from conversations and social activities, according to Choi.  

Previous research has shown that over time, social withdrawal can reduce mental stimulation and increase the risk of loneliness, anxiety, depression, cognitive decline and dementia. It has also linked chronic social isolation to biological and neurological changes, including increased brain inflammation and alterations in brain structure.  

“Understanding the link between hearing loss, hearing device use and social isolation is crucial,” said Choi. “Until this study, it has been unclear whether hearing devices could help reverse the isolation.”   

Choi and her fellow researchers conducted a comprehensive, systematic review and meta-analysis of 65 previously published studies, encompassing over five thousand participants, on how hearing aids and cochlear implants affect three key measures: social quality of life, perceived social handicap, which refers to the limitations and frustrations hearing loss can create in social situations, and loneliness.  
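
The article does not describe the authors' statistical model, but the core of any such pooling is inverse-variance weighting: each study's effect estimate counts in proportion to its precision. A minimal fixed-effect sketch for illustration only (systematic reviews of this kind typically use random-effects models to allow for between-study heterogeneity):

```python
import math

def inverse_variance_pool(effects, std_errors):
    """Fixed-effect meta-analytic pooling by inverse-variance weighting.

    Each study's effect size (e.g. a standardized mean difference for
    social quality of life) is weighted by 1/SE^2, so large, precise
    studies count for more than small, noisy ones. Returns the pooled
    effect and its 95% confidence interval.
    """
    weights = [1.0 / se ** 2 for se in std_errors]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)
```

Because the pooled standard error shrinks as studies accumulate, combining 65 studies can detect effects on social quality of life that no single study could establish on its own.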

The researchers found that adults using hearing devices feel more socially connected and less limited in social situations. They are better able to engage in group conversations and feel more at ease in noisy or challenging listening environments. Participants also reported feeling less socially handicapped by their hearing loss, with fewer barriers and frustrations during interactions and an improved ability to stay engaged without feeling excluded. This increased confidence can help users connect more easily with family, friends and colleagues, leading to stronger feelings of belonging and reduced social anxiety. The study also suggested hearing devices may reduce loneliness, although further research is needed in this area, according to Choi. 

Those with cochlear implants reported the most improvement in their social quality of life. This is likely because cochlear implants offer greater hearing restoration than hearing aids, especially for individuals with more severe hearing loss. As a result, they may experience more noticeable improvements in social engagement once their hearing is restored. 

While it was outside the scope of the study to measure how better social lives relate to improved cognitive outcomes, Choi believes there may be a connection, as previous research has found managing hearing loss may be key to reducing the risk of cognitive decline and dementia. “While our study didn’t directly measure cognitive outcomes, the improvements we saw in communication and social engagement suggest that by restoring clearer communication, hearing devices may help preserve cognitive health by keeping the brain more actively involved and people more connected,” Choi said. 

This research follows a January 2024 study by Choi showing that adults with hearing loss who use hearing aids have an almost 25% lower risk of mortality, suggesting that treating hearing loss can improve lifespan as well as social quality of life.  

“These new findings add to a growing body of research showing that hearing health is deeply connected to overall well-being,” said Choi. “We hope this encourages more people to seek treatment and helps clinicians start conversations with patients about how hearing devices can improve their quality of life.”

Source: University of Southern California – Health Sciences