Author: ModernMedia

New Study Highlights the Hidden Risk of Polypharmacy for Older Adults

Photo by Kampus Production

Inappropriate polypharmacy – the excessive or unnecessary use of multiple medications – is a major driver of emergency hospital admissions among adults aged 65 and over, according to a new study from the Department of Life Sciences at the University of Bath.

The researchers hope their findings will pave the way for the development of a digital tool – such as an app – to proactively identify older adults at risk of medication-related harm and intervene before a hospital visit becomes necessary.

The study, published in Age and Ageing, is the first of its kind to use data-driven methods to explore how potentially inappropriate polypharmacy contributes to short-term hospitalisation in older adults. With this population growing rapidly and facing increased risks of complications from being hospitalised, the findings reinforce concerns in geriatric care over the dangers of overprescribing.

The hidden dangers of overprescribing

Older adults often take multiple medications to manage chronic conditions such as diabetes, hypertension and arthritis. This can lead to prescribing cascades, where side effects from one drug are treated with additional medications, creating a cycle of escalating complexity and risk.

For instance, a patient might be prescribed a drug for pain management, develop high blood pressure as a side effect and then receive another medication to manage that new symptom. Over time, this can lead to a complex web of prescriptions, carrying the risk of harmful interactions.

PhD researcher Robert Olender, who led the study at Bath under the supervision of Dr Prasad Nishtala (primary supervisor, Department of Life Sciences) and Dr Sandipan Roy (secondary supervisor, Department of Mathematics), said: “With more older adults on complex drug regimens, we need proactive ways to reduce preventable emergency hospitalisations.”

Though the new research is focused on data from the UK, polypharmacy among older adults is known to be a growing problem globally, with studies from countries that include the US, Australia, New Zealand and across Europe consistently linking polypharmacy to increased risks of hospitalisation, adverse drug reactions and reduced quality of life.

In an earlier study based on a New Zealand dataset, also carried out by the team at Bath, a strong association was found in older people between a high drug burden, alcohol consumption and smoking and an increased risk of 30-day hospitalisation.

Using machine learning to predict hospitalisation

The study used a large UK dataset to develop three machine learning models capable of predicting 30-day emergency hospitalisation in older adults with around 75% accuracy.

A key variable in these models was the Drug Burden Index (DBI), which measures the cumulative effect of medications with sedative and anticholinergic properties. Anticholinergics are a class of drug used to treat various chronic conditions such as depression, urinary incontinence and chronic obstructive pulmonary disease (COPD).
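
The article does not spell out how the DBI is computed. In the published index, each sedative or anticholinergic drug contributes its daily dose divided by the sum of that dose and the drug's minimum recommended daily dose, so no single drug contributes more than 1.0. A minimal Python sketch, with entirely illustrative drug names and doses:

```python
def drug_burden_index(drugs):
    """Cumulative Drug Burden Index: each sedative or anticholinergic
    drug contributes dose / (dose + minimum recommended daily dose),
    so a single drug can never add more than 1.0."""
    return sum(d["dose"] / (d["dose"] + d["min_dose"]) for d in drugs)

# Hypothetical regimen; names, doses and minimum doses are illustrative only.
regimen = [
    {"name": "oxybutynin", "dose": 10.0, "min_dose": 5.0},  # anticholinergic
    {"name": "diazepam",   "dose": 5.0,  "min_dose": 2.0},  # sedative
]
print(round(drug_burden_index(regimen), 2))  # 1.38
```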

The cumulative effects of these drugs consistently emerged as one of the strongest predictors of a person being at risk of emergency hospitalisation. Other predictors included impaired mobility, a history of fractures and falls, smoking and excessive alcohol consumption.

What makes this study unique is its focus on a previously underexplored dataset and age group, offering new insights into a long-standing issue. While the dangers of polypharmacy are well known, this research highlights the link between polypharmacy and short-term hospitalisation. It also lays the groundwork for a potential tool to identify at-risk patients and guide them toward safer care.

From research to real-life impact

The research team envisions an app for clinicians that uses a simple questionnaire to assess a patient’s risk of hospitalisation. Questions might include current prescriptions, lifestyle factors (e.g. smoking and alcohol use) and chronic conditions like cancer or hypertension. The tool would then generate a risk score, allowing clinicians to make informed decisions in real time.
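
The study's models themselves are not described here, so the following is only a hedged sketch of the idea behind such a tool: questionnaire answers become a feature row, and a classifier trained on historical outcomes returns a 30-day risk score. The feature names, coefficients and synthetic data are all assumptions for illustration, not the study's.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

rng = np.random.default_rng(0)
n = 5000

# Synthetic stand-in for the UK dataset (all values illustrative).
# Columns: DBI, impaired mobility, fracture/fall history, smoker, heavy alcohol use.
X = rng.random((n, 5))
X[:, 1:] = (X[:, 1:] > 0.7).astype(float)        # binary flags for the last four columns
logits = (1.8 * X[:, 0] + 0.9 * X[:, 1] + 0.7 * X[:, 2]
          + 0.5 * X[:, 3] + 0.5 * X[:, 4] - 2.5)
y = (rng.random(n) < 1 / (1 + np.exp(-logits))).astype(int)  # simulated 30-day admissions

model = GradientBoostingClassifier(random_state=0).fit(X, y)

# One patient's questionnaire answers become a feature row and then a risk score.
patient = np.array([[0.9, 1.0, 1.0, 0.0, 0.0]])  # high DBI, poor mobility, past falls
print(f"30-day admission risk: {model.predict_proba(patient)[0, 1]:.0%}")
```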

Such a tool could serve as a low-cost, high-impact intervention to keep patients safe and create savings for the NHS. By identifying high-risk patients early, clinicians may adjust medication regimens, encourage physical activity or address modifiable lifestyle factors – simple steps that could significantly reduce an individual’s risk of an emergency hospital admission.

While the app could be developed relatively quickly, integrating it into clinical workflows would require regulatory approval and trials. However, the potential benefits – fewer hospital admissions, improved patient safety, and reduced healthcare costs – make this a compelling investment, the researchers believe.

The team hopes such a tool would raise awareness among healthcare professionals, particularly those in primary care, community pharmacies and hospices, where early intervention could help prevent emergency hospital admissions.

Source: University of Bath

A Revolutionary ‘Single Shot’ Malaria Vaccine Delivery System

Oxford researchers have developed programmable microcapsules to deliver vaccines in stages, potentially eliminating the need for booster shots and increasing immunisation coverage in hard-to-reach communities.

Photo by Mufid Majnun on Unsplash

A team of scientists at the University of Oxford has developed an innovative vaccine delivery system that could allow a full course of immunisation – both initial and booster doses – to be delivered in just one injection. In preclinical trials, the technology provided strong protection against malaria, matching the efficacy of traditional multi-dose vaccination regimens.

Luca Bau, Senior Researcher from the Institute of Biomedical Engineering, said: ‘Reducing the number of clinic visits needed for full vaccination could make a major difference in communities where healthcare access is limited. Our goal is to help remove the barriers that stand in the way of people benefiting from life-saving medical innovations.’

The findings offer hope for a simpler, more effective approach to immunisation, particularly in regions where access to follow-up healthcare is limited.

A new weapon in the fight against preventable diseases

The research, published in Science Translational Medicine, addresses a major challenge in global health: ensuring people return for all required vaccine doses. Missed boosters are one of the biggest barriers to achieving full immunisation, leaving millions vulnerable to preventable infectious diseases.

To tackle this, the Oxford team developed tiny biodegradable capsules that can be co-injected with the first vaccine dose and programmed to release the booster dose weeks or months later. In a mouse model, this “single shot” strategy using the R21 malaria vaccine protected against the disease nearly as effectively as the standard two-dose schedule.

Simple, scalable, and injectable

The microcapsules are made using a patented chip-based microfluidics system that is compatible with existing pharmaceutical production methods. This means the technology can be scaled up rapidly for clinical use and eventual deployment in the field.

Romain Guyon, Post-Doctoral Scientist, inventor of the technology and lead author on the study, said: ‘Our approach solves three of the biggest problems in delayed vaccine delivery: how to make it programmable, injectable, and scalable. The microcapsules are precisely engineered to act as a tiny, timed-release vault, allowing us to dictate exactly when the booster dose is released. We believe this could be a gamechanger not just for malaria but for many other vaccines requiring multiple doses or other complex therapeutic regimens.’

The capsules are made from an approved biodegradable polymer (PLGA) and filled with the R21 malaria vaccine. Once injected, the priming dose works immediately, while the capsules burst within the body to release the booster after a set delay. Researchers were able to fine-tune this delay from two weeks to several months.

Looking ahead

The team is now working to adapt the manufacturing process in preparation for early-stage human trials, and the technology is attracting interest from pharmaceutical partners and global health organisations.

Anita Milicic, Associate Professor at the Jenner Institute, Nuffield Department of Medicine, said: ‘This is the exciting first step in proving that it is possible to administer the full immunisation complement through a single injection. We now turn to the next challenge: adapting and refining the approach for translation into the clinic, towards ultimately delivering a real-world impact.’

If successful, this technology could revolutionise vaccination campaigns, particularly in areas where logistics and healthcare access make booster schedules impractical. With 20.5 million children missing routine vaccinations in 2022 alone, the implications of a truly single-dose vaccine could be enormous.

Source: University of Oxford

How Much Infants Cry is Mostly Down to Genetics

Photo by William Fortunato

How much an infant cries is largely determined by genetics, and there is probably not much that parents can do about it. That is the conclusion of a new Swedish twin study from Uppsala University and Karolinska Institutet, in which researchers investigated how genetics and environment influence infants’ crying duration, sleep quality and ability to settle during the first months of life.

The study, which was recently published in JCPP Advances, is based on questionnaire responses from parents of 1000 twins spread across Sweden. The parents were asked questions about their children’s sleep, crying and ability to settle when the twins were two months old and then again at five months old. The researchers were interested in finding out how genetics and environment influence these behaviours during the first months of life – something that no study has done before.

The clearest results were seen when the researchers analysed how long the children cried per day.

“What we found was that crying is largely genetically determined. At the age of two months, the children’s genetics explain about 50 per cent of how much they cry. At five months of age, genetics explain up to 70 per cent of the variation. For parents, it may be a comfort to know that their child’s crying is largely explained by genetics, and that they themselves have limited options to influence how much their child cries,” says Charlotte Viktorsson, postdoctoral fellow in psychology and lead author of the study.

The parents who participated in the study were asked how long their children cried, how often they woke up at night, and how long it took for them to settle. There was large individual variation between the children: some woke up as many as 10 times per night. The figures below show averages.

2 months:
Crying duration (per 24 hours): about 72 minutes
Wakeups: 2.2 times per night
Time until settled: about 20 minutes

5 months:
Crying duration (per 24 hours): about 47 minutes
Wakeups: 2.1 times per night
Time until settled: about 14 minutes

The remaining variation, which cannot be explained by genetics, was attributed to what the researchers call the ‘unique environment’ – factors in the children’s environment or life situation that are unique to each child and cannot be identified precisely from the questionnaire responses.

Charlotte Viktorsson is the lead author of the study that investigated how genetics and environment influence infants’ crying, sleep, and ability to settle during the first months of life. Photo: Mikael Wallerstedt

Twin studies reveal the importance of genetics

Participants were recruited by letters sent to families with twins aged 1–2 months, identified from the population register. To capture how much of a behaviour is genetically determined, the researchers compared identical (monozygotic) twins with fraternal (dizygotic) twins. The advantage of studying twins is that they share important factors such as home environment, family situation and socio-economic status. If identical twins are more similar to each other than fraternal twins in a certain trait, such as how much they cry, this is taken as an expression of the importance of genetics for that trait.
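
The article stops short of the arithmetic, but in the classical twin design heritability is estimated from the gap between identical- and fraternal-twin correlations (Falconer's formula). A minimal sketch, with correlations invented to mimic the roughly 70 per cent heritability reported above:

```python
def ace_from_twins(r_mz, r_dz):
    """Falconer decomposition from twin correlations:
    A (heritability)       = 2 * (r_MZ - r_DZ)
    C (shared environment) = 2 * r_DZ - r_MZ
    E (unique environment) = 1 - r_MZ
    """
    return 2 * (r_mz - r_dz), 2 * r_dz - r_mz, 1 - r_mz

# Correlations invented for illustration, not the study's published values.
a, c, e = ace_from_twins(r_mz=0.70, r_dz=0.35)
print(f"heritability {a:.0%}, shared environment {c:.0%}, unique environment {e:.0%}")
# heritability 70%, shared environment 0%, unique environment 30%
```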

The environment plays a role in infants’ time until settled

Using the same method, the researchers also analysed the number of times the children woke up at night. Here, genetics played less of a role. The number of awakenings during the night was mainly influenced by environmental factors, which can include sleep routines and the environment in which the child sleeps.

In the questionnaire, the parents were also asked to state how long it took from the child being put to bed until they were asleep.

“How rapidly the infant settles was primarily due to the environment at 2 months of age, but by 5 months, their genetics had gained some significance. This reflects the rapid development that occurs in infants, and may indicate that parents’ efforts in getting their child to settle may have the greatest impact in the first months,” says Charlotte Viktorsson, who led the study.

However, it is difficult to draw conclusions about which interventions are effective based on this type of observational study.

“Although we cannot see which specific environmental factors influence the number of awakenings during the night, or how long it takes until the child settles, this study points out a direction for future studies with a focus on sleep routines,” she says.

The researchers have followed the twins up to 36 months of age, allowing them to see how sleep and crying change as the children get older. The current study is thus the first in a series based on this data.

Source: Uppsala University

Can African Countries Meet 2030 Childhood Immunisation Goals?

Researchers analysed 1 million records from national health surveys in 38 African countries and found progress in childhood immunisation coverage – but many countries, including South Africa, may still fall short of global targets

Maps of childhood immunisation coverage in African countries at regional level for 2020.

Image credit: Nguyen PT et al., 2025, PLOS Medicine, CC-BY 4.0

In the last two decades, childhood immunisation coverage improved significantly across most African countries. However, at least 12 countries, including South Africa, are unlikely to achieve global targets for full immunisation by 2030, according to a new study published July 29th in the open-access journal PLOS Medicine by Phuong The Nguyen of Hitotsubashi University, Japan, and colleagues.

Vaccines are one of the most effective ways to protect children from deadly diseases, yet immunisation coverage is still suboptimal in many African countries. Monitoring progress in childhood immunisation at the national and local levels is essential for refining health programmes and achieving global targets in these countries.

In the new study, researchers used childhood immunisation data contained in approximately 1 million records from 104 nationally representative Demographic and Health Surveys (DHS) conducted in 38 African countries between 2000 and 2019. Using modelling techniques, they estimated immunisation coverage trends through 2030 and assessed disparities across geographic regions and between socioeconomic groups.
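
The paper's modelling techniques are not described in this summary; purely as a hedged illustration of the underlying idea, projecting survey-based coverage to 2030 can be sketched as fitting a trend on the log-odds scale, which keeps projections between 0% and 100%. The survey years and coverage values below are invented:

```python
import numpy as np

def project_coverage(years, coverage, target_year=2030):
    """Fit a straight line to coverage on the logit scale and extrapolate;
    the inverse transform keeps the projection between 0 and 1."""
    logit = np.log(coverage / (1 - coverage))
    slope, intercept = np.polyfit(years, logit, 1)
    return 1 / (1 + np.exp(-(slope * target_year + intercept)))

# Invented survey estimates for one hypothetical region (not the study's data).
years = np.array([2000, 2005, 2010, 2015, 2019])
coverage = np.array([0.55, 0.62, 0.70, 0.74, 0.78])
print(f"projected coverage in 2030: {project_coverage(years, coverage):.0%}")  # ~87%
```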

The data showed overall improvements in immunisation coverage between 2000 and 2019, and the models forecast that, if current trends continue, most countries will meet or exceed targets of 80% or 90% coverage for vaccines against tuberculosis, measles, polio, diphtheria, pertussis (whooping cough) and tetanus. However, 12 of the 38 countries are not on track to meet full immunisation goals, including high-development nations like South Africa, Egypt and Congo-Brazzaville. The study also pinpointed significant socioeconomic inequalities, with gaps in coverage of up to 58% between wealth quintiles. While these disparities were present across all countries, most are projected to shrink by 2030 – except in Nigeria and Angola, where inequalities are expected to persist or grow.

“These achievements are likely the result of sustained progress driven by decades of national and sub-national initiatives along with international support aimed at prioritising immunisation,” the authors say. “However, progress towards full immunisation coverage remains slow in 12 African countries examined. In most African nations, challenges related to vaccine affordability, accessibility, and availability remain major obstacles, driven by weak primary healthcare systems and limited resources.”

The authors add, “This study shows that while childhood immunisation coverage has improved in Africa, progress is uneven. Many countries and regions remain off track to meet global targets by 2030.”

The authors conclude, “Conducting this study reinforced how critical reliable sub-national data is for identifying communities being left behind. We hope the findings will help inform more equitable and targeted immunisation strategies.”

Provided by PLOS

Pre-pregnancy Hypoglycaemia Linked with Higher Risk of Preterm Birth and Other Adverse Outcomes

In a study of nearly 5 million Chinese women, these links varied according to body mass index

Image by stanias from Pixabay

An analysis of data from more than 4.7 million Chinese women showed that those who had low blood sugar levels prior to conception were more likely to have certain adverse pregnancy outcomes – such as their baby being born preterm or with low birth weight. Hanbin Wu of the Chinese University of Hong Kong, in collaboration with the National Research Institute for Family Planning, presents these findings on July 29th in the open-access journal PLOS Medicine.

Prior research has shown that women who are hyperglycaemic before or during pregnancy are more likely to face adverse pregnancy outcomes, as are women who are hypoglycaemic during pregnancy.

However, few studies have explored whether hypoglycaemia detected before pregnancy is associated with adverse pregnancy outcomes for women without pre-existing diabetes. To help fill this gap, Wu and colleagues retrospectively analysed data on 4 866 919 Chinese women from the National Free Preconception Checkup Project, a free health service for women planning to conceive. Using data from 2013 to 2016, they examined associations between preconception hypoglycaemia and pregnancy outcomes.

A total of 239 128 of the women had preconception hypoglycaemia. Compared to those with normal preconception blood sugar, they had a higher risk of certain adverse pregnancy outcomes, such as preterm birth, low birth weight, or birth defects. Women with hypoglycaemia tended to be younger than those with normal blood sugar levels and were more likely to have BMIs in the “underweight” category.

However, the adverse pregnancy risks associated with preconception hypoglycaemia varied for women with different BMIs. For instance, underweight women had a higher risk of miscarriage, while overweight women had a lower risk of their baby being large for gestational age.

On the basis of these findings, the researchers suggest that screening for preconception hypoglycaemia could be explored for its potential to improve pregnancy outcomes. Further research could also address some limitations of this study, for example by including women from other countries and gathering more information on patients’ gestational complications.

The authors state, “In addition to paying attention to women with preconception hyperglycemia, our findings call for increased concern for women with hypoglycemia in preconception glycemic screening. These findings emphasize the importance of preconception examination in preventing and managing reproductive health risks for all women planning to conceive, and also highlight the necessity of comprehensive screening and coordinated interventions for abnormal FPG (fasting plasma glucose) prior to and during pregnancy, which is crucial for advancing the intervention window and mitigating the risk of adverse pregnancy outcomes.”

Provided by PLOS

Research Supports Continued Use of Nasogastric Tube After Oesophageal Cancer Surgery

To the researchers’ surprise, it was not without risk to omit the tube after this surgery. Illustration: Jakob Hedberg

In the largest Nordic study to date concerning oesophageal cancer surgery, the researchers found clear evidence that decompression with a nasogastric tube is associated with fewer serious complications. Their results challenge a trend of declining use of the nasogastric tube after major surgical procedures. The study was led from Uppsala University and has now been published in The Lancet Regional Health – Europe.

A number of small studies had previously suggested that it is safe to abandon the tradition of leaving in a decompressing – but for many patients unpleasant – nasogastric tube after surgery to remove oesophageal cancer (gullet cancer). The tube is plastic and runs from the nose down to the stomach; in this context its purpose is to relieve pressure on the newly operated area. When the question was discussed in a Nordic research collaboration, it was concluded that these smaller studies lacked sufficient statistical power to justify a change in care. Subsequently, a randomised trial was carried out at 12 university hospitals across Sweden, Norway, Denmark and Finland, in which patients were randomised to have or not have a decompressing nasogastric tube in their oesophagus following this type of surgery.

Patients without the tube experienced leakage

To the researchers’ surprise, omitting the tube after this surgery was not without risk: more patients without the tube experienced leakage from the anastomosis created during the operation. Leakage must be treated immediately, often with interventions under general anaesthesia, resulting in suffering for the patient and a longer hospital stay.

Although no differences in survival rates or other complications were found, this new knowledge may help to reduce suffering for patients in the future.

“Oesophageal cancer is an uncommon form of cancer, with only about 200 operations of this type being performed per year in Sweden. National and international cooperation is therefore absolutely necessary in order to conduct sufficiently large trials to answer the research questions we have. The fact that in just over two years, almost 450 patients have been recruited for the trial surpassed our expectations and represents a great success for this network,” says Jakob Hedberg, surgical oncologist, associate professor at Uppsala University and consultant surgeon at Uppsala University Hospital who is also principal investigator for the study.

“Strong interest has been shown at international conferences where our preliminary results have been presented, and the principle of building surgical care on solid evidence has allowed us to provide the best care to our patients. Another important effect of this successful collaboration is that we can build more clinical trials within the Nordic network which has now been consolidated. In fact, the next clinical trial is already under development,” says Jakob Hedberg.

Source: Uppsala University

Good Prognosis for Men with Prostate Cancer Treated According to Guidelines

Credit: Darryl Leja, National Human Genome Research Institute, National Institutes of Health

Most men who are treated for prostate cancer according to modern guidelines have good survival rates and the majority of these men will die of causes other than prostate cancer. This is revealed in a new study from Uppsala University published in the Journal of the National Comprehensive Cancer Network.

“We were surprised by how much life expectancy affected the prognosis. This shows the importance of a thorough assessment of the general health of a man with newly diagnosed prostate cancer. The patient’s life expectancy has a substantial impact on the choice of appropriate treatment strategy,” says Marcus Westerberg, researcher at the Department of Surgical Sciences at Uppsala University, who led the study.

In prostate cancer, the disease progression often takes decades and the risk of dying from prostate cancer therefore depends on both the characteristics of the cancer and life expectancy based on the man’s age and other diseases at the time of diagnosis. Recommendations in guidelines and care programmes are therefore also based on both cancer characteristics and life expectancy. This means that the recommended initial treatment can range from active monitoring for low-risk cancer to combinations of local and systemic treatment for high-risk cancer.

High average age at disease onset

As the average age at diagnosis of prostate cancer is often high and the cancer often progresses very slowly, it is particularly important to know the long-term risk of death from prostate cancer in order to choose the best treatment for patients. Previously, not much has been known about this.

“We wanted to fill that knowledge gap, so we looked at outcomes up to 30 years after the men were diagnosed. In all cases, we had information about the characteristics of the cancer, treatment and the patient’s life expectancy based on age and comorbidity,” says Westerberg.

The researchers used data from the Prostate Cancer Database Sweden (PCBase), which contains information from the National Prostate Cancer Register (NPCR) and other health data registers. They focused on men who had received the recommended treatment for prostate cancer that had not spread in the body. Using statistical modelling, the researchers estimated the lifetime risk of dying from prostate cancer and other causes.

11 per cent risk of dying of cancer

For men with low-risk cancer and short life expectancy (less than 10 years), the risk of dying from prostate cancer was 11% and the risk of dying from other causes was 89% within 30 years of diagnosis.

For men with high-risk cancer (e.g. stage T3, PSA 30 ng/mL and Gleason score 8) and long life expectancy (over 15 years), the risk of dying from prostate cancer was 34% and the risk of dying from other causes was 55% within 30 years of diagnosis.
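
These figures can be read as cumulative incidences under competing risks: dying of another cause removes a man from the pool at risk of prostate cancer death, which is why the two risks in each group sum to at most 100%, with the remainder still alive at 30 years. A toy discrete-time sketch, with invented yearly hazards tuned to roughly reproduce the low-risk, short-life-expectancy figures above:

```python
# Toy competing-risks calculation; the yearly hazards are invented,
# not taken from the study.
h_cancer, h_other = 0.0185, 0.15   # low-risk cancer, short life expectancy
alive, ci_cancer, ci_other = 1.0, 0.0, 0.0

for year in range(30):
    ci_cancer += alive * h_cancer          # dies of prostate cancer this year
    ci_other  += alive * h_other           # dies of another cause first
    alive     *= 1 - h_cancer - h_other    # survives to the next year

print(f"30-year risk: prostate cancer {ci_cancer:.0%}, other causes {ci_other:.0%}")
# 30-year risk: prostate cancer 11%, other causes 89%
```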

“We hope that our results will be used to provide a realistic picture of the prognosis for men with prostate cancer. Our study shows that most men who receive the recommended treatment have a good prognosis,” Westerberg concludes.

Life expectancy was based on age and comorbidity. Examples of low-risk cancers are stage T1, PSA 5 ng/mL and Gleason score 6. Examples of high-risk cancers are stage T3, PSA 30 ng/mL and Gleason score 8.

Source: Uppsala University

Hibernation ‘Superpowers’ May Be Hidden in Human DNA

Photo by Sangharsh Lohakare on Unsplash

Animals that hibernate are incredibly resilient. They can spend months without food or water, muscles refusing to atrophy, body temperature dropping to near freezing as their metabolism and brain activity slow to a crawl. When they emerge from hibernation, they recover from dangerous health changes similar to those seen in type 2 diabetes, Alzheimer’s disease, and stroke.

New genetic research suggests that hibernating animals’ superpowers could lie hidden in human DNA – with clues on how to unlock them, perhaps one day leading to treatments that could reverse neurodegeneration and diabetes.

Two studies describing the results are published in Science.

The genetics of metabolism and obesity

A gene cluster called the “fat mass and obesity (FTO) locus” plays an important role in hibernators’ abilities, the researchers found. Intriguingly, humans have these genes too. “What’s striking about this region is that it is the strongest genetic risk factor for human obesity,” says Chris Gregg, PhD, professor in neurobiology and human genetics at University of Utah Health and senior author on the studies. But hibernators seem able to use genes in the FTO locus in new ways to their advantage.

The team identified hibernator-specific DNA regions that are near the FTO locus and that regulate the activity of neighbouring genes, tuning them up or down. The researchers speculate that adjusting the activity of neighbouring genes, including those in or near the FTO locus, allows hibernators to pack on the pounds before settling in for the winter, then slowly use their fat reserves for energy throughout hibernation.
 
Indeed, the hibernator-specific regulatory regions outside of the FTO locus seem crucial for tweaking metabolism. When the researchers mutated those hibernator-specific regions in mice, they saw changes in the mice’s weight and metabolism. Some mutations sped up or slowed down weight gain under specific dietary conditions; others affected the ability to recover body temperature after a hibernation-like state or tuned overall metabolic rate up or down. 

Intriguingly, the hibernator-specific DNA regions the researchers identified weren’t genes themselves. Instead, the regions were DNA sequences that contact nearby genes and turn their expression up or down, like an orchestra conductor fine-tuning the volume of many musicians. This means that mutating a single hibernator-specific region has wide-ranging effects extending far beyond the FTO locus, explains Susan Steinwand, research scientist in neurobiology at U of U Health and first author on one of the studies.  “When you knock out one of these elements – this one tiny, seemingly insignificant DNA region – the activity of hundreds of genes changes,” she says. “It’s pretty amazing.”
 
Understanding hibernators’ metabolic flexibility could lead to better treatments for human metabolic disorders like type 2 diabetes, the researchers say. “If we could regulate our genes a bit more like hibernators, maybe we could overcome type 2 diabetes the same way that a hibernator returns from hibernation back to a normal metabolic state,” says Elliott Ferris, MS, bioinformatician at U of U Health and first author on the other study.

Uncovering the regulation of hibernation

Finding the genetic regions that may enable hibernation is akin to searching for needles in a massive haystack of DNA. To narrow down the candidates, the researchers used multiple independent whole-genome technologies to ask which regions might be relevant for hibernation, and then looked for overlap between the results from each technique.

First, they looked for sequences of DNA that most mammals share but that had recently changed in hibernators. “If a region doesn’t change much from species to species for over 100 million years but then changes rapidly and dramatically in two hibernating mammals, then we think it points us to something that is important for hibernation, specifically,” Ferris says.

To understand the biological processes that underlie hibernation, the researchers identified genes whose activity turns up or down during fasting in mice, a state that triggers metabolic changes similar to hibernation. Next, they found the genes that act as central coordinators, or “hubs,” of these fasting-induced changes to gene activity.

Many of the DNA regions that had recently changed in hibernators also appeared to interact with these central coordinating hub genes. Because of this, the researchers expect that the evolution of hibernation requires specific changes to the controls of the hub genes. These controls comprise a shortlist of DNA elements that are avenues for future investigation.
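
The pipeline is not described beyond this outline. Purely as a hedged sketch of the "looking for overlap" step, intersecting two independent genome-wide screens can be reduced to overlapping interval sets; all coordinates below are invented:

```python
def overlaps(a, b):
    """True if two (chrom, start, end) intervals overlap."""
    return a[0] == b[0] and a[1] < b[2] and b[1] < a[2]

# Invented candidate regions from two hypothetical independent screens.
accelerated_in_hibernators = [("chr16", 53_700_000, 53_705_000),
                              ("chr2", 1_000, 2_000)]
contacts_fasting_hub_genes = [("chr16", 53_703_000, 53_710_000)]

shortlist = [a for a in accelerated_in_hibernators
             if any(overlaps(a, b) for b in contacts_fasting_hub_genes)]
print(shortlist)   # regions supported by both lines of evidence
```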

Awakening human potential

Most of the hibernator-associated changes in the genome appeared to “break” the function of specific pieces of DNA, rather than confer a new function. This hints that hibernators may have lost constraints that would otherwise prevent extreme flexibility in the ability to control metabolism. In other words, it’s possible that the human “thermostat” is locked to a narrow range of continuous energy consumption. For hibernators, that lock may be gone.

Hibernators can reverse neurodegeneration, avoid muscle atrophy, stay healthy despite massive weight fluctuations, and show improved ageing and longevity. The researchers think their findings show that humans may already have the genetic code needed for similar hibernator-like superpowers – if we can bypass some of our metabolic switches.

“Humans already have the genetic framework,” Steinwand says. “We just need to identify the control switches for these hibernator traits.” By learning how, researchers could help confer similar resilience to humans.

Source: University of Utah Health

Reducing the Risk of Anaphylaxis in Children’s Peanut Allergy Desensitisation

Photo by Corleto on Unsplash

Oral immunotherapy helps many children with peanut allergy – but for some, it can also trigger severe allergic reactions. In the journal Allergy, a team led by Young-Ae Lee explains what might be behind these differences and how treatment could become more personalised.

Peanut allergy is one of the most common – and most dangerous – food allergies. Tiny amounts of the protein-rich legumes can be enough to cause allergic reactions like itching and swelling, or even life-threatening anaphylaxis. For a long time, the only solution was to avoid peanuts as vigilantly as possible. Since many foods may contain traces of peanuts, that’s still a major challenge, especially for parents of affected children. Emergency medication must always be close at hand.

Recently, oral desensitisation has become available for children with peanut allergies. “Some children respond well to this treatment, but others don’t benefit at all,” says Professor Young-Ae Lee, Group Leader of the Molecular Genetics of Allergic Diseases lab at the Max Delbrück Center. “In some cases, the therapy – based on gradually increasing doses of peanut allergens – can even trigger anaphylactic reactions.”

A team led by Lee and Professor Kirsten Beyer, Head of the Pediatric Allergy Clinical Research Center at Charité – Universitätsmedizin Berlin, has now investigated why children respond so differently to the therapy and how to make it safer and more effective. Their study, published in Allergy, was led by first author Dr Aleix Arnau-Soler, a scientist in Lee’s lab. “We looked for molecular changes in the immune systems of children undergoing oral immunotherapy – and we found them,” explains Arnau-Soler.

Gut immune cells play a key role

For their study, the researchers analysed blood samples from 38 children, with an average age of seven, who were undergoing oral desensitisation for peanut allergy at Charité. The team measured levels of immunoglobulins, which are allergy-related antibodies, and cytokines, which are inflammatory messengers, before and after therapy.

They also assessed how much peanut protein each child could tolerate before and after treatment – essentially, how successful the desensitisation was. To delve deeper, they used modern omics technologies to identify which genes in the children’s immune cells were activated when they were exposed to peanut proteins in the lab.

“Children who responded well to the therapy already had a less reactive immune system before treatment began. Their blood showed lower levels of immunoglobulins and cytokines,” explains Arnau-Soler. These findings could help identify in advance which children are most likely to benefit from desensitisation – and those who are at higher risk of side effects.

The team also found consistent differences in gene expression and DNA methylation patterns between children who responded well and those who didn’t. Methylation plays a key role in regulating gene activity. “These differences were particularly pronounced in certain immune cells that are rarely found in the blood, but more common in the gut, where they perform important functions,” says Arnau-Soler. These included both specialised T cells, part of the adaptive immune system, and cells involved in the body’s innate defences.

New biomarkers pave the way for personalised therapy

“Our results open the door to personalised approaches to treating peanut allergy – which affects three per cent of all children in industrialised countries – more effectively and safely in the future,” says Lee. “We now have potential biomarkers to find out how well a child will respond to the therapy and what risks are associated with it in each individual case, even before the therapy begins.” It may soon be possible to tailor the length of treatment and the amount of peanut allergen given to each child’s unique immune profile.

The team is currently working to validate their findings in a follow-up study. They also plan to further investigate the gut-associated immune cells found in blood. “At the same time, we’re developing a predictive model so that in the future we can use a simple blood test to better tailor oral desensitisation to the individual child,” adds Arnau-Soler. That could make peanut allergy far less frightening for families.

Source: Max Delbrück Center for Molecular Medicine

Artificial Sweetener May Be a Greater Diabetes Risk than Sugar

Photo by Breakingpic on Pexels

An Australian study has found that drinking just one can of artificially sweetened soft drink a day may increase the risk of developing type 2 diabetes by 38 per cent.

Surprisingly, that increase is even larger than the one associated with sugar-sweetened beverages, such as regular soft drinks, where the risk was found to be 23 per cent higher.

The research, conducted by a team from Monash University together with RMIT University and the Cancer Council Victoria, followed more than 36 000 Australian adults over nearly 14 years.

The study – led by Distinguished Professor Barbora de Courten from Monash University and RMIT University, Associate Professor Allison Hodge from the Cancer Council Victoria, and Monash PhD student Robel Hussen Kabthymer – was published in Diabetes & Metabolism and adds to growing global concern about the health effects of both sugary and artificially sweetened drinks.

“Drinking one or more of these beverages each day – whether sweetened with sugar or artificial substitutes – was linked to a significantly higher chance of developing type 2 diabetes,” said Mr Hussen Kabthymer.

Professor de Courten, senior author on the study, said the findings challenge the common assumption that artificially sweetened beverages are a safer choice.

“Artificial sweeteners are often recommended to people at risk of diabetes as a healthier alternative, but our results suggest they may pose their own health risks,” she said.

While the link between sugary drinks and diabetes could largely be explained by obesity, the connection between artificially sweetened drinks and type 2 diabetes remained strong even after adjusting for body weight, suggesting a potentially direct effect on metabolism.
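
The article does not name the statistical model behind this adjustment. A standard approach in long-running cohorts is a proportional-hazards regression with body mass index included as a covariate, so the drink-exposure effect is estimated holding BMI constant. A minimal sketch on synthetic data using the lifelines library, where every variable name and effect size is an assumption:

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 5000

# Synthetic cohort (illustrative only): exposure, confounder, follow-up, event.
df = pd.DataFrame({
    "diet_drink_daily": rng.integers(0, 2, n),      # 1 = at least one can per day
    "bmi": rng.normal(27, 4, n),
})
# True exposure effect: exp(0.32) ~ 1.38, echoing the reported 38% higher risk.
hazard = 0.01 * np.exp(0.32 * df["diet_drink_daily"] + 0.08 * (df["bmi"] - 27))
df["years"] = rng.exponential(1 / hazard)
df["diabetes"] = (df["years"] <= 14).astype(int)    # events within ~14 years
df["years"] = df["years"].clip(upper=14.0)          # administrative censoring

# The exposure coefficient is estimated with BMI held constant.
cph = CoxPHFitter().fit(df, duration_col="years", event_col="diabetes")
print(cph.summary[["coef", "exp(coef)"]])           # exp(coef) ~ hazard ratio
```

In the real study the model would also account for diet, exercise, education and health history, as the article notes below.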

Professor de Courten said the findings have important implications for public health policy.

“We support measures like sugary drink taxes, but our study shows we also need to pay attention to artificially sweetened options. These are often marketed as better for you, yet they may carry their own risks. Future policies should take a broader approach to reducing intake of all non-nutritive beverages.”

The study analysed data from the long-running Melbourne Collaborative Cohort Study, also known as Health 2020, involving participants aged 40–69 years, and adjusted for diet, exercise, education, and health history.

Source: Monash University