Month: March 2026

AI Tools for Cancer Rely on Shaky Shortcuts

Small cell lung cancer cells (green and blue) that metastasised to the brain in a laboratory mouse recruit brain cells called astrocytes (red) for their protection. Credit: Fangfei Qu

Artificial intelligence tools are increasingly being developed to predict cancer biology directly from microscope images, promising faster diagnoses and cheaper testing. But new research from the University of Warwick, published in Nature Biomedical Engineering, suggests that many of these systems may be using visual shortcuts rather than true biology – raising concerns that some AI pathology tools are currently too unreliable for real-world patient care.

“It’s a bit like judging a restaurant’s quality by the queue of people waiting to get in: it’s a useful shortcut, but it’s not a direct measure of what’s happening in the kitchen,” says Dr Fayyaz Minhas, Associate Professor and principal investigator of the Predictive Systems in Biomedicine (PRISM) Lab in the Department of Computer Science, University of Warwick, and lead author of the study.

“Many AI pathology models are doing the same thing, relying on correlations between biomarkers or on obvious tissue features, rather than isolating biomarker-specific signals. And when conditions change, these shortcuts often fall apart.”

To reach this conclusion, the researchers analysed more than 8000 patient samples across four major cancer types – breast, colorectal, lung and endometrial – and compared the performance of leading machine learning approaches. While the models often achieved high headline accuracy, the team found this frequently came from statistical “shortcuts.”

For example, instead of detecting mutations in the cancer-associated BRAF gene, a model might learn that BRAF mutations often occur alongside another clinical feature such as microsatellite instability (MSI). The system then learns to use this combination of cues to predict BRAF status rather than learning the causal BRAF signal itself – meaning its predictions are accurate only when the two biomarkers co-occur, and become unreliable when they do not.

Kim Branson, SVP Global Head of Artificial Intelligence and Machine Learning, GSK and co-author says, “We’ve found that predicting a BRAF mutation by looking at correlated features like MSI is often like predicting rain by looking at umbrellas – it works, but it doesn’t mean you understand meteorology.

“Crucially, if a model cannot demonstrate information gain above a simple pathologist-assigned grade, we haven’t advanced the field; we’ve just automated a shortcut. The roadmap for the next generation of pathology AI isn’t necessarily bigger models; it’s stricter evaluation protocols that force algorithms to stop cheating and learn the hard biology.”

When performance of AI models was assessed within stratified patient subgroups, such as only high-grade breast cancers or only MSI-positive tumours, accuracy fell substantially, revealing that the models were dependent on shortcut signals that disappear once confounding factors are controlled.
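
The shortcut problem, and the subgroup test that exposes it, are easy to demonstrate on synthetic data. The sketch below is a minimal illustration rather than the Warwick team's pipeline: a stand-in "BRAF" label co-occurs with a stand-in "MSI" confounder 80% of the time, and the features mostly encode the confounder. Pooled accuracy looks respectable, but accuracy within each MSI subgroup collapses toward chance.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
n = 20_000

# Confounder (stand-in for MSI) and a target (stand-in for BRAF status)
# that co-occurs with it 80% of the time.
msi = rng.integers(0, 2, n)
braf = np.where(rng.random(n) < 0.8, msi, 1 - msi)

# "Image" features: a strong, easily visible confounder signal plus only a
# very weak direct target signal.
X = np.column_stack([
    msi * 2.0 + rng.normal(0, 1, n),   # dominated by the confounder
    braf * 0.2 + rng.normal(0, 1, n),  # faint true signal
])

X_tr, X_te, y_tr, y_te, msi_tr, msi_te = train_test_split(
    X, braf, msi, test_size=0.5, random_state=0)

clf = LogisticRegression().fit(X_tr, y_tr)
pred = clf.predict(X_te)

# Pooled accuracy is inflated by the correlation; stratified accuracy is not.
print(f"pooled accuracy: {accuracy_score(y_te, pred):.2f}")
for g in (0, 1):
    m = msi_te == g
    print(f"accuracy within MSI={g}: {accuracy_score(y_te[m], pred[m]):.2f}")
```

The stratified numbers fall far below the pooled one because, within a subgroup where the confounder is constant, only the weak direct signal is left to discriminate on.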

For certain prediction tasks, the performance advantage of deep learning over human-derived clinical information was modest. AI systems achieved accuracy scores of just over 80% when predicting biomarkers, compared with around 75% using tumour grade alone – a measure already assessed by pathologists.

Machine learning methods can still prove valuable for research, for candidate screening in drug development, and for clinical triage, screening or supplementary decision support. However, the researchers argue that future AI tools must move beyond correlation-based learning and adopt approaches that explicitly model biological relationships and causal structure.

They also call for stronger evaluation standards, including subgroup testing and comparison against simple clinical baselines, before looking at deployment in routine care.

Dr Minhas concludes, “This research is not a condemnation of AI in pathology. It is a wake-up call. Current models may perform well in controlled settings but rely on statistical shortcuts rather than genuine biological understanding. Until more robust evaluation standards are in place, these tools should not be seen as replacements for molecular testing, and it is essential that clinicians and researchers understand their limitations and use them with appropriate caution.”

Source: University of Warwick

How Food Shortages Reprogram the Immune Response to Infection

Human neutrophils visualised under a confocal microscope with cell membrane (red) and nucleus (blue). When faced with an infection during food scarcity, stress hormones trigger an immune response dependent on neutrophils, abundant cells that act as immediate, short-lived defenders. Credit: Thai Tran, National Institute of Arthritis and Musculoskeletal and Skin Diseases

When food is scarce, stress hormones direct the immune system to operate in “low power” mode to preserve immune function while conserving energy, according to researchers at Weill Cornell Medicine. This reconfiguration is crucial to combating infections amid food insecurity.

“Both famine and infectious disease have been with us throughout our evolutionary history and often occurred at the same time. Yet little is known about how nutrition affects the immune system,” said senior author Dr Nicholas Collins, an assistant professor of immunology, and a member of the Jill Roberts Institute for Research in Inflammatory Bowel Disease and the Friedman Center for Nutrition at Weill Cornell.

The answer could be important in helping those who are food insecure and face the risk of infectious diseases every day. “Mounting an immune response against infections requires a lot of energy. We have discovered a coordinated system that upholds immune function by shifting the composition and metabolism of immune cells,” Dr Collins said.

The study, published in Immunity, found that mice on a calorie-restricted diet fought off infection as well as mice that were fully fed, but did so while using very little glucose. This was possible thanks to glucocorticoids, stress hormones known for their role in regulating blood glucose. The researchers determined that glucocorticoids acted like master conductors, reorganising immune cells and their energy usage to provide a survival advantage.

The research was co-led by Luisa Menezes-Silva, a visiting graduate student from the University of São Paulo, Brazil; Dr Mingeum Jeong, a postdoctoral associate; and Dr Seong-Ji Han, a research associate, all in the Collins lab at Weill Cornell.

Shifting Priorities

To understand the complex interactions involved in an immune response during scarcity, Dr Collins and his team put mice on a 50% calorie-restricted diet and then exposed the animals to bacteria that infect the gut. Control mice that were fed a standard diet experienced a metabolic crash upon infection – their blood glucose levels and body weight plummeted.

The researchers had expected this would happen to all the animals because mounting an immune response can consume up to 30% of the entire body’s fuel reserves. But in the calorie-restricted mice, the immune system appeared to be functioning perfectly well without using much glucose.

To unravel this enigma, the researchers inventoried the immune cells of the infected animals and discovered that T cells, which normally target invading microbes, were depleted in the calorie-restricted mice. Instead, short-lived neutrophils, which serve as the body’s first responders to infection, were ramped up to twice the normal amount and had measurably enhanced pathogen-killing abilities. The cells seemed to be operating in energy-saving mode, consuming much less glucose than neutrophils from well-fed animals.

“So, this hormone rewires the immune system to eliminate the infection while keeping blood sugar from dropping, which rescues the calorie-restricted animals from malnutrition,” said Dr Collins.

Stress Hormones Lead the Charge

The researchers are breaking new ground by outlining how a sudden fall in food intake triggers glucocorticoid levels to rise, resulting in two major shifts. First, the body repositions certain immune cells – especially naïve T cells – into the bone marrow, which becomes a kind of “safe house” where the cells are kept until they are needed. Second, during an infection, glucocorticoids tilt the immune response away from energy-intensive T cells toward neutrophils, abundant cells that act as immediate, short-lived defenders.

Beyond clearing a current infection, glucocorticoids prepare the immune system for repeat encounters with infectious agents. While the hormones direct killer T cells to stand down and neutrophils to step up, they also ensure memory T cells are preserved for future confrontations.

“Glucocorticoids reduce the immune cells that use up the most energy, while saving those that are critical for protection against future infections,” Dr Collins said. “So, these hormones are involved in every step of the infection-fighting process.”

“Since glucocorticoids are induced not only by nutrient restriction but also by any form of stress, our findings might have broader applicability,” said Dr Collins.

In the meantime, he and his team plan to explore what causes the system to fail when the degree and duration of calorie restriction are more severe. “We looked at reduced food intake over three weeks,” he said. “But when you cross the threshold into malnutrition, the whole system breaks down.” Understanding this collapse could inform better strategies to prevent infectious disease and infection-driven malnutrition in vulnerable populations.

Source: Weill Cornell Medicine

Brain Stimulation can Nudge People to Behave Less Selfishly

Alternating current stimulation in the frontal and parietal lobes of the brain promoted altruistic choices

Photo by ROCKETMANN TEAM

Stimulating two brain areas, nudging them to fire together in the same rhythm, increased people’s tendency to behave altruistically, according to a study published February 10th in the open-access journal PLOS Biology by Jie Hu from East China Normal University in China and colleagues from the University of Zurich in Switzerland.

As parents raise their kids, they often work to teach them to be kind and to share, to think about other people and their needs – to be altruistic. This unselfish attitude is critical if a society is going to function. And yet, while some people grow up to devote themselves to others, others grow up selfish.

To understand what brain areas and connections might underlie individual differences in altruism, the researchers asked 44 participants to complete 540 decisions in a Dictator Game, each time choosing how to split an amount of money between themselves and a partner. On each trial, the participant could end up with more or less money than their partner, and the amounts varied. As the participants played the game, the researchers stimulated their brains with transcranial alternating current stimulation over the frontal and parietal lobes of the brain. The stimulation was set up to make the brain cells in those areas fire together in repetitive patterns, entraining them to either gamma or alpha oscillation rhythms.

The authors found that during the alternating current stimulation designed to enhance the synchrony of gamma oscillations in the frontal and parietal lobes, the participants were slightly more likely to make an altruistic choice and offer more money to someone else – even when they stood to make less money than their partner. Using a computational model, the researchers showed that the stimulation nudged the participants’ unselfish preferences, making them consider their partner more when they weighed each monetary offer. The authors note that they did not directly record brain activity during the trials, and so future studies should combine brain stimulation with electroencephalography to show the direct effect of the stimulation on neural activity. But the results suggest that altruistic choices could have a basis in the synchronized activity of the frontal and parietal lobes of the brain.
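
The paper’s exact model is not described here, but decision models for the Dictator Game commonly score each option as a weighted sum of one’s own and the partner’s payoffs, with the difference passed through a softmax choice rule; an increase in the partner weight then shows up as more generous choices. Below is a minimal sketch of that general idea – the utility form, parameter values and payoffs are illustrative assumptions, not the study’s fitted model.

```python
import numpy as np

def choice_prob(self_gain, other_gain, alpha, temp=1.0):
    """Probability of choosing the generous split over the selfish one.

    Utility of an option = (1 - alpha) * own payoff + alpha * partner payoff.
    alpha is the altruism weight: higher alpha means the partner's payoff
    matters more when the two options are compared.
    """
    u_generous = (1 - alpha) * self_gain["generous"] + alpha * other_gain["generous"]
    u_selfish = (1 - alpha) * self_gain["selfish"] + alpha * other_gain["selfish"]
    # Softmax (logistic) choice rule over the two options.
    return 1.0 / (1.0 + np.exp(-(u_generous - u_selfish) / temp))

# One hypothetical trial: the generous split costs the decider 2 units
# but gives the partner 6.
self_gain = {"generous": 4, "selfish": 6}
other_gain = {"generous": 6, "selfish": 0}

for alpha in (0.1, 0.3, 0.5):  # hypothetical altruism weights
    p = choice_prob(self_gain, other_gain, alpha)
    print(f"alpha={alpha:.1f}: P(generous) = {p:.2f}")
```

In a model of this kind, a stimulation effect like the one reported would appear as a shift in the fitted alpha, making generous offers more likely even when they carry a personal cost.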

Coauthor Christian Ruff states, “We identified a pattern of communication between brain regions that is tied to altruistic choices. This improves our basic understanding of how the brain supports social decisions, and it sets the stage for future research on cooperation – especially in situations where success depends on people working together.”

Coauthor Jie Hu notes, “What’s new here is evidence of cause and effect: when we altered communication in a specific brain network using targeted, non-invasive stimulation, people’s sharing decisions changed in a consistent way – shifting how they balanced their own interests against others’.”

Coauthor Marius Moisa concludes, “We were struck by how boosting coordination between two brain areas led to more altruistic choices. When we increased synchrony between frontal and parietal regions, participants were more likely to help others, even when it came at a personal cost.”

Provided by PLOS

Scientists Engineer ‘Living Eye Drop’ to Support Corneal Healing

Photo by Victor Freitas on Pexels

University of Pittsburgh School of Medicine researchers have developed an early-stage, experimental “living eye drop” that uses a naturally occurring eye bacterium to support corneal wound healing.

The proof-of-concept study, published in Cell Reports, demonstrates that the harmless eye-dwelling microbe Corynebacterium mastitidis can be genetically modified to secrete an anti-inflammatory therapeutic that promotes healing following corneal injury in a mouse model.

“This is the first demonstration that a microbe that lives on the ocular surface could be engineered to deliver a therapeutic that improves eye health,” said senior author Anthony St. Leger, associate professor of ophthalmology and of immunology and a faculty member of the UPMC Vision Institute. “It opens the door to the idea of ‘living medicine’ for the eye – something you apply once, and it stays, protects and helps the tissue heal.”

Because tears continually wash medications away, treating ocular surface disease often requires multiple daily applications of eye drops. This can limit the effectiveness of therapies for conditions such as corneal abrasions or dry eye disease.

To explore an alternative delivery method, the Pitt team engineered C. mastitidis, a benign bacterium that naturally resides under the eyelid, to continuously secrete the cytokine interleukin-10 (IL-10). In mice, corneas that were gently scratched and treated with the engineered bacteria healed faster than those treated with unmodified bacteria or saline. When the IL-10 receptor was blocked, this benefit disappeared – confirming the therapeutic effect was IL-10-dependent.

The researchers also created a version of the microbe that releases human IL-10, which improved wound closure in lab-grown cells that make up the outermost layer of the human cornea and reduced inflammatory signaling in human immune cells. These studies offer an initial indication that the approach could eventually be adapted for use in people, though substantial development remains.

“What makes this exciting is that the system is modular,” St. Leger explained. “We built it so you can swap in different genes – different cytokines, growth factors or other proteins – to tailor the therapy to specific eye diseases.”

Though promising, the technology is still in early development. The researchers note that many steps must be completed before any clinical translation is possible, including developing built-in “off switches” to safely and reliably remove or deactivate the engineered bacteria once they are no longer needed.

Source: University of Pittsburgh

Addressing Nursing Challenges in South Africa Through Practical Training and Ongoing Development

Photo by Thirdman

By Donald McMillan, MD at Allmed

The South African healthcare system is currently facing a period of intense pressure. Between staffing shortages and a rise in medico-legal claims, the gap between basic nursing education and the actual demands of patient care is a major concern. To improve patient safety and support our healthcare workers, we must focus on practical, hands-on experience and constant skill building.

Why nursing challenges matter in South Africa

Nursing errors are rarely the fault of one person. In South Africa, they are usually the result of a system under strain. Nurses are dealing with overcrowded wards, long shifts, and a very high number of patients with complex conditions like HIV and TB. When staff are exhausted and overworked, the risk of making a mistake increases.

These errors have a massive impact. For patients and their families, they lead to a loss of trust; for hospitals, to expensive legal battles. South Africa is currently dealing with billions of rands in medico-legal claims – money that could otherwise be spent on better equipment and hiring more people. If we want a stronger healthcare system, we must reduce the risks that lead to these errors in the first place.

Hands-on training makes the difference

Nursing education has traditionally leaned heavily on theoretical learning, but knowing the theory of a procedure is very different from doing it in a busy hospital. Practical, skills-based training is what helps a nurse transition safely from the classroom to the ward.

One of the most effective tools for this is simulation-based training. This involves using specialised training rooms that look like real hospital wards, complete with advanced mannequins that can mimic medical emergencies. Here, nurses can practise critical skills like inserting drips, reading ECGs, or managing emergency care in a safe environment. This allows them to build confidence and “muscle memory” before they ever treat a real patient. This type of training is essential for preparing nurses for the high-pressure reality of South African clinics.

Continuous professional development builds confidence

Medicine is always changing. New treatment guidelines, technologies, and medicines are introduced all the time, changing the way care is delivered. Continuous Professional Development (CPD) helps nurses keep pace with these changes, ensuring their skills remain relevant, their knowledge up to date, and their patients receive the best possible care throughout every stage of their careers.

However, CPD is about more than just following rules; it is about building professional confidence. When nurses have the chance to learn new things and specialise in areas like intensive care or pharmacology, they feel more capable and valued. In a country where many nurses choose to work overseas, providing these opportunities for growth at home is a great way to keep our best talent in South Africa.

A systemic approach for better care

Enhancing the quality of nursing care in South Africa requires a coordinated, multi-stakeholder approach. Training institutions, hospital administrators, and regulatory bodies must collaborate to create an ecosystem that supports the nurse at every career stage. This systemic approach should focus on three specific areas:

  • Integrated mentorship: Establishing formal programmes where expert clinicians provide real-time bedside teaching to new graduates.
  • Accredited upskilling: Providing accessible pathways for nurses to specialise in critical areas such as ICU, neonatal care, and oncology.
  • Technological alignment: Utilising digital tools to track competency levels and identify specific areas where additional training is required.

By making practical training and ongoing learning a priority, we do more than just prevent mistakes. We empower our nurses to be the skilled professionals they want to be. When nurses are competent and confident, they provide better care, which helps rebuild public trust and makes the South African healthcare system stronger for everyone.

Kaitlin and Lihle’s Fight Against a Rare Blood Disease

Photo by National Cancer Institute on Unsplash

At 25, Kaitlin should be living independently. At 18, Lihle should be finishing school. Instead, both are fighting for their lives against aplastic anaemia (AA), a rare blood disease that leaves patients vulnerable to infections, uncontrolled bleeding, and severe anaemia. A stem cell transplant gives approximately 80% of patients a real chance at recovery, but for around 70% of those patients, that match will not come from within their family. It will come from a generous stranger.

“AA strikes hardest between 15 and 25 – the years nobody expects to spend fighting for their life,” says Palesa Mokomele, Head of Community Engagement and Communication at DKMS Africa. “We want South Africans to understand that registering as a stem cell donor is a simple act that could give someone like Kaitlin or Lihle their life back. Every person who registers increases their chances of finding a match.”

A long road to the right diagnosis: Kaitlin’s story

For years, nobody could tell Kaitlin from KwaZulu-Natal what was wrong. She experienced prolonged and excessive bleeding and severe fatigue, which were repeatedly misattributed to gynaecological issues. She kept going back to the hospital and kept being sent home. It was only in August 2025, when her condition deteriorated dramatically and the bleeding would not stop despite ongoing treatment, that she was finally referred to a haematologist. A bone marrow biopsy told them what years of tests had missed: Kaitlin had AA.

Before this, she was working full-time and living independently. Today, she cannot work. She cannot manage basic daily tasks. She requires weekly blood transfusions simply to stay alive. Medication trials have yielded no response, and her doctors have been clear: a stem cell transplant is her only path to recovery.

Through it all, Kaitlin has held on. “I draw strength from my faith and from the people I love most – my nephews and siblings, who show up for me even on the hardest hospital days. I just want my life back, and a matching donor could make that possible.”

Sudden illness, endless resilience: Lihle’s story

Lihle was 14 years old when his life changed overnight. It started with severe nosebleeds in November 2021. Then one night, the bleeding became uncontrollable. He lost consciousness. After two months in hospital, the diagnosis came: Severe Aplastic Anaemia (SAA). That same year, his father passed away.

The eldest of four children, Lihle grew up fast. Hailing from Butterworth in the Eastern Cape and raised in Carletonville, Gauteng, he has always felt the weight of being the firstborn – the one his younger siblings look up to. Their mother cares for them all – while also carrying the emotional weight of losing her husband and watching her son fight for his life.

Lihle shares that he is determined to finish his education, set an example, and one day return to the football pitch. Like Kaitlin, all he needs is a matching donor to make that possible.

How you can help

“No family should have to face what Kaitlin’s and Lihle’s are going through – knowing that a cure exists, but that the donor hasn’t been found yet. For patients from Black, Coloured and Indian/Asian backgrounds, that search is even harder, because the registry does not yet reflect the diversity of our population. We are calling on all South Africans to register. It costs nothing. It takes minutes. And it could mean everything,” concludes Mokomele.

Signing up could be the most important thing you ever do. If you are aged 17–55 and in good health, please register today at: https://www.dkms-africa.org/save-lives

Robotic Medical Crash Cart Eases Workload for Healthcare Teams

Photo by Rodnae Productions on Pexels

Healthcare workers have an intense workload and often experience mental distress during resuscitation and other critical care procedures. Although researchers have studied whether robots can support human teams in other high-stakes, high-risk settings such as disaster response and military operations, the role of robots in emergency medicine has remained largely unexplored.

Enter Angelique Taylor, the Andrew H. and Ann R. Tisch Assistant Professor at Cornell Tech and the Cornell Ann S. Bowers College of Computing and Information Science. She is also an assistant professor in emergency medicine at Weill Cornell Medicine and director of the Artificial Intelligence and Robotics Lab (AIRLab) at Cornell Tech.

In a pair of articles presented at the Institute of Electrical and Electronics Engineers (IEEE) International Conference on Robot and Human Interactive Communication (RO-MAN) in August 2025, Taylor and her collaborators at Weill Cornell Medicine, associate professor Kevin Ching and assistant professor Jonathan St. George, described research on their new robotic crash cart (RCC) — a robotic version of the mobile drawer unit that holds supplies and equipment needed for a range of medical procedures.

“Healthcare workers may not know or may forget where all the various supplies are located in the cart drawers, and often they’re kind of shuffling through the cart,” Taylor said. This can cause delays during emergency procedures that require iterative tasks with precise timing, exacerbating medical errors and putting patients at risk, she noted.

To create the RCC, Taylor and her team outfitted a standard cart with LED light strips, a speaker, and a touchscreen tablet integrated with the Robot Operating System. This middleware connects computer programs to robot hardware, enabling them to work together to provide users with verbal and nonverbal cues.

During an emergency procedure, a user can request the location of a supply on the tablet. The lights around the drawer holding that supply then blink, or a spoken instruction plays through the speaker. Users can also receive prompts that remind them about necessary medications and recommend supplies.
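
The papers’ code is not reproduced here, but a cue pipeline like the one described maps naturally onto ROS’s publish/subscribe topics. The sketch below is a minimal illustration of that pattern using rospy; the topic names, message types and drawer mapping are invented for the example rather than taken from the AIRLab system.

```python
#!/usr/bin/env python
import rospy
from std_msgs.msg import String

# Hypothetical mapping from supply name to the drawer holding it.
DRAWER_OF = {"epinephrine": 1, "syringe": 2, "defib pads": 3}

def on_request(msg):
    drawer = DRAWER_OF.get(msg.data.lower())
    if drawer is None:
        speech_pub.publish(String(f"Sorry, I could not find {msg.data}."))
        return
    # Nonverbal cue: tell the LED controller which drawer strip to blink.
    led_pub.publish(String(f"blink drawer {drawer}"))
    # Verbal cue: have the speaker announce the location.
    speech_pub.publish(String(f"{msg.data} is in drawer {drawer}."))

rospy.init_node("crash_cart_cues")
led_pub = rospy.Publisher("/cart/led_command", String, queue_size=10)
speech_pub = rospy.Publisher("/cart/speech", String, queue_size=10)
rospy.Subscriber("/cart/supply_request", String, on_request)
rospy.spin()  # keep the node alive, servicing requests as they arrive
```

Separating the LED and speech channels into distinct topics mirrors the study design, which compared carts that used each cue type for different functions.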

In their article, “Help or Hindrance: Understanding the Impact of Robot Communication in Action Teams,” Taylor’s team conducted pilot studies of the RCC. One pilot involved 84 participants, aged 21 to 79, about half of whom had a clinical background. Working in groups of 3 to 4, they conducted a series of simulated resuscitation procedures with a manikin patient using three different carts: an RCC with blinking lights for object search and spoken task reminders, an RCC with blinking lights for task reminders and spoken language for object search, or a standard cart.

The team found that participants preferred the RCCs, which provided verbal and nonverbal cues, over the standard cart with no cues — rating them lower in terms of workload and higher in usefulness and ease of use.

“These results were exciting and achieved statistical significance, suggesting that the use of a robot is beneficial,” said Taylor. The article, by Taylor, Ph.D. student Tauhid Tanjim, and colleagues at Weill Cornell, was a Kazuo-Tanie Paper Award finalist, an honor given to the top three papers in their category at the conference.

In the second article, “Human-Robot Teaming Field Deployments: A Comparison Between Verbal and Non-verbal Communication,” the research team began testing the RCC under more realistic conditions. Participants were healthcare workers from across the United States, and actors played frantic family members during the simulations.

As in the pilot studies, Taylor, along with colleagues at Cornell and Michigan State University, found that the RCC reduced participant workload, with the effect depending on whether the robot provided verbal or nonverbal cues. However, they evaluated robots with only one type of cue, not both, and identified room for improvement, particularly in the robot’s visual cues. They are now studying healthcare workers’ impressions of an RCC with multimodal communication.

Taylor hopes that other research teams will start exploring how robots can support healthcare teams in critical care settings. To that end, Taylor and her colleague presented an article at the February 2025 Association for Computing Machinery/IEEE International Conference that offers a toolkit for researchers to build their own RCC.

By Carina Storrs, freelance writer for Cornell Tech.

Source: Cornell Tech

Study Challenges Notion that Ageing Means Decline – Many Older Adults Improve over Time

Photo by Ravi Patel on Unsplash

Ageing in later life is often portrayed as a steady slide toward physical and cognitive decline. But a new study by scientists at Yale University suggests an alternative narrative – that older individuals can and do improve over time, and that their mindset toward ageing plays a major part in whether they do.

Analysing more than a decade of data from a large, nationally representative study of older Americans, lead author Becca R. Levy, a professor of social and behavioural sciences at the Yale School of Public Health (YSPH), found that nearly half of adults aged 65 and older showed measurable improvement in cognitive function, physical function, or both, over time.

The improvements were not limited to a small group of exceptional individuals and, notably, were linked to a powerful but often overlooked factor: how people think about ageing itself.

“Many people equate ageing with an inevitable and continuous loss of physical and cognitive abilities,” said Levy, an international expert on psychosocial determinants of ageing health. “What we found is that improvement in later life is not rare, it’s common, and it should be included in our understanding of the ageing process.”

The findings are published in the journal Geriatrics.

For the study, the researchers followed more than 11,000 participants in the Health and Retirement Study, a federally supported longitudinal survey of older Americans. The research team tracked changes in cognition using a global performance assessment, and physical function using walking speed — often described by geriatricians as a “vital sign” because of its strong links to disability, hospitalisation, and mortality.

Over a follow-up period of up to 12 years, 45% of participants improved in at least one of the two domains, according to the study. About 32% improved cognitively, 28% improved physically, and many experienced gains that exceeded thresholds considered clinically meaningful. When participants whose cognitive scores remained stable over that period (rather than declining) were included, more than half defied the stereotype of inevitable deterioration in cognition.

“What’s striking is that these gains disappear when you only look at averages,” said Levy, author of the book “Breaking the Age Code: How Your Beliefs About Aging Determine How Long & How Well You Live.” “If you average everyone together, you see decline. But when you look at individual trajectories, you uncover a very different story. A meaningful percentage of the older participants that we studied got better.”
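
That averaging effect is easy to see in a toy simulation: a cohort whose mean trajectory declines can still contain a large minority of individual improvers. The numbers below are invented for illustration and are not the study’s data.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 10_000

# Per-person yearly change in a function score: decline on average,
# but with wide individual variation around that mean.
slopes = rng.normal(loc=-0.05, scale=0.12, size=n)

print(f"mean slope: {slopes.mean():+.3f} per year (decline on average)")
print(f"share improving at all: {(slopes > 0).mean():.0%}")

# Share exceeding an arbitrary 'clinically meaningful' gain threshold.
print(f"share with meaningful gains: {(slopes > 0.05).mean():.0%}")
```

With these invented parameters the average trajectory declines, yet roughly a third of individuals improve — the pattern that only individual-level analysis can reveal.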

The authors also examined potential reasons for why some people improve and some do not. They hypothesized that an important factor could be participants’ baseline age beliefs — or, specifically, whether they had assimilated more positive or more negative views about ageing by the start of the study. In support of this hypothesis, they found that those with more positive age beliefs were significantly more likely to show improvements in both cognition and walking speed, even after accounting for factors such as age, sex, education, chronic disease, depression, and length of follow-up.

The findings build on Levy’s stereotype embodiment theory, which posits that age stereotypes absorbed from culture – through a range of domains including social media and advertisements – eventually become self-relevant and biologically consequential. Levy’s prior studies have found negative age beliefs predict poorer memory, slower walking speed, higher cardiovascular risk, and biomarkers associated with Alzheimer’s disease.

The current study shows that those who have assimilated more positive age beliefs often show improvement, Levy said.

“Our findings suggest there is often a reserve capacity for improvement in later life,” she said. “And because age beliefs are modifiable, this opens the door to interventions at both the individual and societal level.”

The improvements were not limited to people who started out with impairments. Even among participants who had normal cognitive or physical function at baseline, a substantial proportion improved over time. That challenges the assumption that later-life gains reflect only people getting better after being sick or rebounding from earlier setbacks, the authors said.

The authors hope their findings will reverse the popular perception that continuous decline is inevitable and encourage policy makers to increase their support for preventive care, rehabilitation, and other health-promoting programs for older persons that draw on their potential resilience.

Source: EurekAlert!

Why People With Autism May Be More Likely To Get Parkinson’s Disease

Dopamine transporters in the brain could be early biomarkers for the potential development of Parkinson’s disease

Photo by Peter Burdon on Unsplash

Researchers at the University of Missouri may have uncovered a clue explaining why young adults with autism are roughly six times more likely to develop Parkinson’s disease later in life.

In a recent study, the researchers found that some young adults with autism show abnormalities in dopamine transporters – proteins in the brain that recycle unused dopamine – on brain scans that are typically used to diagnose older adults with Parkinson’s disease.

Future research could help determine whether the health of dopamine transporters could be an early warning sign of Parkinson’s disease developing later in life.

“While the loss of these dopamine transporters can be biomarkers for Parkinson’s disease, no one had ever thought to look at them in the context of young adults with autism, so hopefully this work can help us explore if there is a potential link going forward,” David Beversdorf, a professor in the School of Medicine and College of Arts and Science, said. “There has been previous work looking into the total amount of dopamine in the brains of people with autism, but we took a new approach by looking at abnormalities in terms of how dopamine is processed in a specific part of the brain called the basal ganglia via these dopamine transporters.”

Dopamine under the spotlight

Dopamine is a neurotransmitter involved in numerous body functions, such as memory, pleasure, motivation, behaviour and attention. Of particular interest to Beversdorf, a clinician at the Thompson Center for Autism and Neurodevelopment, is that dopamine also helps control muscle movement as well as cognition.

Beversdorf, who collaborated with lead author Nanan Nuraini on the study, originally wanted to know whether certain repetitive behaviours common in some young adults with autism, such as hand-flapping or rocking back and forth, were linked with abnormalities in dopamine transporters.

While he did not notice patterns in that regard, what he found surprised him.

Beversdorf looked at Dopamine Transporter (DaT) brain scans of 12 young adults with autism.

Four different nuclear medicine specialists examined the scans. All of them agreed that two of the 12 young adults had abnormal dopamine transporters and that eight appeared normal. They disagreed on the remaining two.

“Since these DaT scans are typically used to diagnose or evaluate older adults with Parkinson’s disease, the appearance of abnormalities in some young adults with autism was very surprising, so we should look into this topic more going forward,” Beversdorf said. “While it’s too early to jump to conclusions, hopefully our work raises awareness about the importance of monitoring the brain health of young adults with autism as they age.”

Next, Beversdorf hopes to study a broader range of people with autism by conducting more DaT scans across different age groups.

“The earlier we can identify those who might be at greater risk for getting Parkinson’s disease down the road, the sooner we can discuss preventative measures, including whether certain medications could potentially slow down the progression of disease,” Beversdorf said.

Source: University of Missouri

COVID Lockdowns Found to Set Back Children’s Development by Years

Even when controlling for age and family background, COVID’s impact was evident

Photo by Kelly Sikkema on Unsplash

The COVID pandemic disrupted children’s ability to self-regulate, according to research from three UK universities just published in the journal Child Development.

The study by Lancaster University, East Anglia and Durham reveals that the pandemic hampered children’s ability to regulate their behaviour, stay focused and adapt to new situations – skills known collectively as executive functions.

The greatest impact was seen among pupils who were in reception when the first lockdowns began – a crucial stage at four or five when youngsters normally learn to socialise, follow routines and navigate the busy world of the classroom. In the UK, formal primary schooling then continues with Year 1, starting at age five or six.

These children showed less growth in their self-regulatory and cognitive flexibility scores over time compared to a second group of children who were in preschool when the pandemic started.

The research team say these children may still be feeling the effects years later.

How the research happened

Scientists were already running a long-term study tracking youngsters from toddlerhood to early school years when the COVID pandemic hit.

They followed 139 children aged between two-and-a-half and six-and-a-half years old over several years, including 94 families who joined the study before COVID struck.

This meant that they had a rare baseline of children’s abilities before the pandemic began, which allowed them to track exactly how development changed during and after the lockdowns.

Using a standardised assessment called the Minnesota Executive Function Scale, they were able to measure the same cognitive skills at regular intervals.
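
With repeated scores per child, the natural analysis is a longitudinal growth model that gives each child their own trajectory and tests whether growth rates differ between the reception and preschool cohorts. The sketch below shows what such a comparison can look like in Python with statsmodels; the file and column names are invented for illustration, and this is not necessarily the authors’ exact specification.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Long-format data: one row per child per assessment.
# Illustrative columns: child_id, age_years, cohort ("reception"/"preschool"),
# ef_score (executive function score at that assessment).
df = pd.read_csv("ef_longitudinal.csv")  # hypothetical file

# Mixed-effects growth model: random intercept and slope per child.
# The age_years:cohort interaction tests whether the rate of growth
# in executive function differs between the two cohorts.
model = smf.mixedlm(
    "ef_score ~ age_years * cohort",
    data=df,
    groups=df["child_id"],
    re_formula="~age_years",
)
result = model.fit()
print(result.summary())
```

A significant negative interaction for the reception cohort would correspond to the slower growth in executive function that the study reports.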

Dr Eleanor Johns from Lancaster University’s Department of Psychology said: “We began this study to understand how children’s executive function develops across early childhood, and we saw clear, steady growth between 2.5 and 6.5 years of age. However, because our longitudinal study spanned the COVID-19 pandemic, we also had a unique opportunity to examine how this unprecedented disruption affected the children we were already following.

“We found that children who had just started school when the first lockdown began showed a slower rate of growth in executive function compared to those who were preschool age. Starting school is a major developmental transition, as children learn new routines, adapt to classroom rules, and develop self-regulation alongside their peers. When schools closed almost overnight, those opportunities were suddenly removed.”

The research revealed that:

  • Individual differences in executive function abilities were remarkably stable. Children who had stronger skills at two-and-a-half years old tended to remain ahead at six-and-a-half years.
  • Children from lower socio-economic households consistently scored lower, echoing long-standing research on the impact of maternal education and home environment.
  • Even when controlling for age and family background, COVID’s impact was evident. Children who were in reception at the start of the pandemic made more modest improvements in executive function compared to those still in preschool.

Dr Johns said: “Our findings suggest that the structured school environment and regular interaction with peers play a crucial role in supporting the development of executive function. When those experiences were disrupted, children’s executive function developed more slowly than that of younger children who were still in preschool.”

The researchers say their work highlights a generation of children who may need more support from teachers, schools and health services in coming years.

Sources: Lancaster University and University of East Anglia