Date: March 6, 2026

AI Tools for Cancer Rely on Shaky Shortcuts

Small cell lung cancer cells (green and blue) that metastasised to the brain in a laboratory mouse recruit brain cells called astrocytes (red) for their protection. Credit: Fangfei Qu

Artificial intelligence tools are increasingly being developed to predict cancer biology directly from microscope images, promising faster diagnoses and cheaper testing. But new research from the University of Warwick, published in Nature Biomedical Engineering, suggests that many of these systems may be using visual shortcuts rather than true biology – raising concerns that some AI pathology tools are currently too unreliable for real-world patient care.

“It’s a bit like judging a restaurant’s quality by the queue of people waiting to get in: it’s a useful shortcut, but it’s not a direct measure of what’s happening in the kitchen,” says Dr Fayyaz Minhas, Associate Professor and principal investigator of the Predictive Systems in Biomedicine (PRISM) Lab in the Department of Computer Science, University of Warwick, and lead author of the study.

“Many AI pathology models are doing the same thing, relying on correlations between biomarkers or on obvious tissue features, rather than isolating biomarker-specific signals. And when conditions change, these shortcuts often fall apart.”

To reach this conclusion, the researchers analysed more than 8000 patient samples across four major cancer types – breast, colorectal, lung and endometrial – and compared the performance of leading machine learning approaches. While the models often achieved high headline accuracy, the team found this frequently came from statistical “shortcuts.”

For example, instead of detecting mutations in the cancer-associated BRAF gene, a model might learn that BRAF mutations often occur alongside another clinical feature such as microsatellite instability (MSI). The system then learns to use this combination of cues to predict BRAF status rather than learning the causal BRAF signal itself – meaning accurate cancer predictions work only when these biomarkers co-occur and become unreliable when they do not.

Kim Branson, SVP Global Head of Artificial Intelligence and Machine Learning, GSK and co-author says, “We’ve found that predicting a BRAF mutation by looking at correlated features like MSI is often like predicting rain by looking at umbrellas – it works, but it doesn’t mean you understand meteorology.

“Crucially, if a model cannot demonstrate information gain above a simple pathologist-assigned grade, we haven’t advanced the field; we’ve just automated a shortcut. The roadmap for the next generation of pathology AI isn’t necessarily bigger models; it’s stricter evaluation protocols that force algorithms to stop cheating and learn the hard biology.”

When performance of AI models was assessed within stratified patient subgroups, such as only high-grade breast cancers or only MSI-positive tumours, accuracy fell substantially, revealing that the models were dependent on shortcut signals that disappear once confounding factors are controlled.
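The effect of this kind of stratified evaluation can be illustrated with a toy example. The sketch below is entirely synthetic (it is not the study's data or code): it builds a cohort in which BRAF status co-occurs with MSI roughly 90% of the time, then scores a "shortcut" predictor that infers BRAF purely from MSI. The shortcut looks strong on the full cohort but collapses to chance inside the MSI-positive subgroup, where the confound no longer varies.

```python
import random

random.seed(0)

# Entirely synthetic cohort (illustrative only, not the study's data):
# BRAF mutation status co-occurs with MSI status in ~90% of samples.
cohort = []
for _ in range(10_000):
    braf = random.random() < 0.5
    msi = braf if random.random() < 0.9 else not braf
    cohort.append((braf, msi))

def balanced_accuracy(samples, predict):
    """Mean of sensitivity and specificity for a BRAF predictor."""
    pos = [msi for braf, msi in samples if braf]
    neg = [msi for braf, msi in samples if not braf]
    tpr = sum(predict(m) for m in pos) / len(pos)      # sensitivity
    tnr = sum(not predict(m) for m in neg) / len(neg)  # specificity
    return (tpr + tnr) / 2

# A "shortcut" model: predicts BRAF mutation purely from the correlated MSI cue.
shortcut = lambda msi: msi

overall = balanced_accuracy(cohort, shortcut)
msi_positive_only = balanced_accuracy([s for s in cohort if s[1]], shortcut)

print(f"full cohort: {overall:.2f}")                      # ~0.90, looks impressive
print(f"MSI-positive subgroup: {msi_positive_only:.2f}")  # 0.50, chance level
```

Within the subgroup the confound is constant, so the shortcut predictor gives the same answer for every sample and has no discriminative power at all – the pattern the Warwick team reports when models are evaluated within stratified patient subgroups.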

For certain prediction tasks, the performance advantage of deep learning over human-derived clinical information was modest. AI systems achieved accuracy scores of just over 80% when predicting biomarkers, compared with around 75% using tumour grade alone – a measure already assessed by pathologists.

Machine learning methods can still prove valuable for research, drug development candidate screening and for clinical triaging, screening, or supplementary decision support. However, the researchers argue that future AI tools must move beyond correlation-based learning and adopt approaches that explicitly model biological relationships and causal structure.

They also call for stronger evaluation standards, including subgroup testing and comparison against simple clinical baselines, before looking at deployment in routine care.

Dr Minhas concludes, “This research is not a condemnation of AI in pathology. It is a wake-up call. Current models may perform well in controlled settings but rely on statistical shortcuts rather than genuine biological understanding. Until more robust evaluation standards are in place, these tools should not be seen as replacements for molecular testing, and it is essential that clinicians and researchers understand their limitations and use them with appropriate caution.”

Source: University of Warwick

How Food Shortages Reprogram the Immune Response to Infection

Human neutrophils visualised under a confocal microscope with cell membrane (red) and nucleus (blue). When faced with an infection during food scarcity, stress hormones trigger an immune response dependent on neutrophils, abundant cells that act as immediate, short-lived defenders. Credit: Thai Tran, National Institute of Arthritis and Musculoskeletal and Skin Diseases

When food is scarce, stress hormones direct the immune system to operate in “low power” mode to preserve immune function while conserving energy, according to researchers at Weill Cornell Medicine. This reconfiguration is crucial to combating infections amid food insecurity.

“Both famine and infectious disease have been with us throughout our evolutionary history and often occurred at the same time. Yet little is known about how nutrition affects the immune system,” said senior author Dr Nicholas Collins, an assistant professor of immunology, and a member of the Jill Roberts Institute for Research in Inflammatory Bowel Disease and the Friedman Center for Nutrition at Weill Cornell.

The answer could be important in helping those who are food insecure and face the risk of infectious diseases every day. “Mounting an immune response against infections requires a lot of energy. We have discovered a coordinated system that upholds immune function by shifting the composition and metabolism of immune cells,” Dr Collins said.

The study, published in Immunity, found that mice on a calorie-restricted diet fought off infection as well as mice that were fully fed, but did so while using very little glucose. This was possible thanks to glucocorticoids, stress hormones known for their role in regulating blood glucose. The researchers determined that glucocorticoids acted like master conductors, reorganizing immune cells and their energy usage to provide a survival advantage.

The research was co-led by Luisa Menezes-Silva, a visiting graduate student from the University of São Paulo, Brazil; Dr Mingeum Jeong, a postdoctoral associate; and Dr Seong-Ji Han, a research associate, all in the Collins lab at Weill Cornell.

Shifting Priorities

To understand the complex interactions involved in an immune response during scarcity, Dr Collins and his team put mice on a 50% restricted-calorie diet and then exposed the animals to bacteria that infect the gut. The mice that were fed a standard diet experienced a metabolic crash – their blood glucose levels and body weight plummeted.

The researchers had expected this would happen to all the animals because mounting an immune response can consume up to 30% of the entire body’s fuel reserves. But in the calorie-restricted mice, the immune system appeared to be functioning perfectly well without using much glucose.

To unravel this enigma, the researchers inventoried the immune cells of the infected animals and discovered that T cells, which normally target invading microbes, were depleted in the calorie-restricted mice. Instead, short-lived neutrophils, which serve as the body’s first responders to infection, were ramped up to twice the normal amount and had measurably enhanced pathogen-killing abilities. The cells seemed to be operating in energy-saving mode, consuming much less glucose than neutrophils from well-fed animals.

“So, this hormone rewires the immune system to eliminate the infection while keeping blood sugar from dropping, which rescues the calorie-restricted animals from malnutrition,” said Dr Collins.

Stress Hormones Lead the Charge

The researchers are breaking new ground by outlining how a sudden fall in food intake triggers glucocorticoid levels to rise, resulting in two major shifts. First, the body repositions certain immune cells – especially naïve T cells – into the bone marrow, which becomes a kind of “safe house” until the cells are needed. Second, during an infection, glucocorticoids tilt the immune response away from energy-intensive T cells toward neutrophils, abundant cells that act as immediate, short-lived defenders.

Beyond clearing a current infection, glucocorticoids prepare the immune system for repeat encounters with infectious agents. While the hormones direct killer T cells to stand down and neutrophils to step up, they also ensure memory T cells are preserved for future confrontations.

“Glucocorticoids reduce the immune cells that use up the most energy, while saving those that are critical for protection against future infections,” Dr Collins said. “So, these hormones are involved in every step of the infection-fighting process.”

“Since glucocorticoids are induced not only by nutrient restriction but also by any form of stress, our findings might have broader applicability,” said Dr Collins.

In the meantime, he and his team plan to explore what causes the system to fail when the degree and duration of calorie restriction are more severe. “We looked at reduced food intake over three weeks,” he said. “But when you cross the threshold into malnutrition, the whole system breaks down.” Understanding this collapse could inform better strategies to prevent infectious disease and infection-driven malnutrition in vulnerable populations.

Source: Weill Cornell Medicine

Brain Stimulation can Nudge People to Behave Less Selfishly

Alternating current stimulation in the frontal and parietal lobes of the brain promoted altruistic choices

Photo by ROCKETMANN TEAM

Stimulating two brain areas, nudging them to fire together in the same rhythm, made people more likely to behave altruistically, according to a study published February 10th in the open-access journal PLOS Biology by Jie Hu from East China Normal University in China and colleagues from the University of Zurich in Switzerland.

As parents raise their kids, they often work to teach them to be kind and to share, to think about other people and their needs – to be altruistic. This unselfish attitude is critical if a society is going to function. And yet, while some people grow up to devote themselves to others, others remain markedly selfish.

To understand what brain areas and connections might underlie individual differences in altruism, the researchers asked 44 participants to make 540 decisions in a Dictator Game, each time choosing how to split a sum of money between themselves and a partner. Depending on the split, the participant could end up with more or less money than their partner, and the amounts varied from trial to trial. As the participants played the game, the researchers stimulated their brains with transcranial alternating current stimulation over the frontal and parietal lobes. The stimulation was designed to make the brain cells in those areas fire together in repetitive patterns, entraining them to either gamma or alpha oscillation rhythms.

The authors found that during the alternating current stimulation designed to enhance the synchrony of gamma oscillations in the frontal and parietal lobes, the participants were slightly more likely to make an altruistic choice and offer more money to someone else – even when they stood to make less money than their partner. Using a computational model, the researchers showed that the stimulation nudged the participants’ unselfish preferences, making them consider their partner more when they weighed each monetary offer. The authors note that they did not directly record brain activity during the trials, and so future studies should combine brain stimulation with electroencephalography to show the direct effect of the stimulation on neural activity. But the results suggest that altruistic choices could have a basis in the synchronized activity of the frontal and parietal lobes of the brain.

Coauthor Christian Ruff states, “We identified a pattern of communication between brain regions that is tied to altruistic choices. This improves our basic understanding of how the brain supports social decisions, and it sets the stage for future research on cooperation – especially in situations where success depends on people working together.”

Coauthor Jie Hu notes, “What’s new here is evidence of cause and effect: when we altered communication in a specific brain network using targeted, non-invasive stimulation, people’s sharing decisions changed in a consistent way – shifting how they balanced their own interests against others’.”

Coauthor Marius Moisa concludes, “We were struck by how boosting coordination between two brain areas led to more altruistic choices. When we increased synchrony between frontal and parietal regions, participants were more likely to help others, even when it came at a personal cost.”

Provided by PLOS

Scientists Engineer ‘Living Eye Drop’ to Support Corneal Healing

Photo by Victor Freitas on Pexels

University of Pittsburgh School of Medicine researchers have developed an early-stage, experimental “living eye drop” that uses a naturally occurring eye bacterium to support corneal wound healing.

The proof-of-concept study, published in Cell Reports, demonstrates that the harmless eye-dwelling microbe Corynebacterium mastitidis can be genetically modified to secrete an anti-inflammatory therapeutic that promotes healing following corneal injury in a mouse model.

“This is the first demonstration that a microbe that lives on the ocular surface could be engineered to deliver a therapeutic that improves eye health,” said senior author Anthony St. Leger, associate professor of ophthalmology and of immunology and a faculty member of the UPMC Vision Institute. “It opens the door to the idea of ‘living medicine’ for the eye – something you apply once, and it stays, protects and helps the tissue heal.”

Because tears continually wash medications away, treating ocular surface disease often requires multiple daily applications of eye drops. This can limit the effectiveness of therapies for conditions such as corneal abrasions or dry eye disease.

To explore an alternative delivery method, the Pitt team engineered C. mastitidis, a benign bacterium that naturally resides under the eyelid, to continuously secrete the cytokine interleukin-10 (IL-10). In mice, corneas that were gently scratched and treated with the engineered bacteria healed faster than those treated with unmodified bacteria or saline. When the IL-10 receptor was blocked, this benefit disappeared – confirming the therapeutic effect was IL-10-dependent.

The researchers also created a version of the microbe that releases human IL-10, which improved wound closure in lab-grown cells of the type that makes up the outermost layer of the human cornea and reduced inflammatory signaling in human immune cells. These studies offer an initial indication that the approach could eventually be adapted for use in people, though substantial development remains.

“What makes this exciting is that the system is modular,” St. Leger explained. “We built it so you can swap in different genes – different cytokines, growth factors or other proteins – to tailor the therapy to specific eye diseases.”

Though promising, the technology is still in early development. The researchers note that many steps must be completed before any clinical translation is possible, including developing built-in “off switches” to safely and reliably remove or deactivate the engineered bacteria after they are no longer needed.

Source: University of Pittsburgh

Addressing Nursing Challenges in South Africa Through Practical Training and Ongoing Development

Photo by Thirdman

By Donald McMillan, MD at Allmed

The South African healthcare system is currently facing a period of intense pressure. Between staffing shortages and a rise in medico-legal claims, the gap between basic nursing education and the actual demands of patient care is a major concern. To improve patient safety and support our healthcare workers, we must focus on practical, hands-on experience and constant skill building.

Why nursing challenges matter in South Africa

Nursing errors are rarely the fault of one person. In South Africa, they are usually the result of a system under strain. Nurses are dealing with overcrowded wards, long shifts, and a very high number of patients with complex conditions like HIV and TB. When staff are exhausted and overworked, the risk of making a mistake increases.

These errors have a massive impact. For patients and their families, they lead to a loss of trust. For hospitals, they lead to expensive legal battles. South Africa is currently dealing with billions of Rands in medical claims – money that could otherwise be spent on better equipment and hiring more people. If we want a stronger healthcare system, we must reduce the risks that lead to these errors in the first place.

Hands-on training makes the difference

Nursing education has traditionally leaned heavily on theoretical learning, but knowing the theory of a procedure is very different from doing it in a busy hospital. Practical, skills-based training is what helps a nurse transition safely from the classroom to the ward.


One of the most effective tools for this is simulation-based training. This involves using specialised training rooms that look like real hospital wards, complete with advanced mannequins that can mimic medical emergencies. Here, nurses can practise critical skills like inserting drips, reading ECGs, or managing emergency care in a safe environment. This allows them to build confidence and “muscle memory” before they ever treat a real patient. This type of training is essential for preparing nurses for the high-pressure reality of South African clinics.

Continuous professional development builds confidence

Medicine is always changing. New treatment guidelines, technologies, and medicines are introduced all the time, changing the way care is delivered. Continuous Professional Development (CPD) helps nurses keep pace with these changes, ensuring that their skills remain relevant, their knowledge stays up to date, and their patients receive the best possible care throughout every stage of their careers.

However, CPD is about more than just following rules; it is about building professional confidence. When nurses have the chance to learn new things and specialise in areas like intensive care or pharmacology, they feel more capable and valued. In a country where many nurses choose to work overseas, providing these opportunities for growth at home is a great way to keep our best talent in South Africa.

A systemic approach for better care

Enhancing the quality of nursing care in South Africa requires a coordinated, multi-stakeholder approach. Training institutions, hospital administrators, and regulatory bodies must collaborate to create an ecosystem that supports the nurse at every career stage. This systemic approach should focus on three specific areas:

  • Integrated mentorship: Establishing formal programmes where expert clinicians provide real-time bedside teaching to new graduates.
  • Accredited upskilling: Providing accessible pathways for nurses to specialise in critical areas such as ICU, neonatal care, and oncology.
  • Technological alignment: Utilising digital tools to track competency levels and identify specific areas where additional training is required.

By making practical training and ongoing learning a priority, we do more than just prevent mistakes. We empower our nurses to be the skilled professionals they want to be. When nurses are competent and confident, they provide better care, which helps rebuild public trust and makes the South African healthcare system stronger for everyone.