Chronic Pain May Increase Hypertension Risk in Adults

Depression resulting from pain may be a contributing factor in the development of high blood pressure, finds a new study

Credit: Pixabay CC0

Chronic pain in adults may increase their risk of high blood pressure, with the location and extent of the pain, as well as the presence of depression, acting as contributing factors, according to new research published in Hypertension, an American Heart Association journal.

An analysis of health data for more than 200 000 adults in the US found that those who reported chronic pain throughout their bodies were more likely to develop high blood pressure than people who reported no pain, short-term pain or pain limited to specific areas.

“The more widespread their pain, the higher their risk of developing high blood pressure,” said lead study author Jill Pell, MD, CBE, Professor of Public Health at the University of Glasgow. “Part of the explanation for this finding was that having chronic pain made people more likely to have depression, and then having depression made people more likely to develop high blood pressure. This suggests that early detection and treatment of depression, among people with pain, may help to reduce their risk of developing high blood pressure.”

High blood pressure, or hypertension, occurs when the force of blood pushing against the walls of blood vessels is too high, increasing the risk of heart attack or stroke. Hypertension, which spans stage one and stage two and includes blood pressure measures from 130/80mmHg to 140/90mmHg or higher, affects nearly half of all adults in the US and is a leading cause of death in the US and around the world, according to the 2025 joint American Heart Association/American College of Cardiology guideline endorsed by 11 other organisations.

According to previous research, chronic musculoskeletal pain – pain in the hip, knee, back or neck/shoulder that lasts for at least three months – is the most common type of pain in the general population. This study investigated the associations between the type, location and extent of pain throughout the body and the development of high blood pressure.

Inflammation and depression are both known to raise the risk of high blood pressure; however, no prior studies have examined the extent to which the link between pain and high blood pressure is mediated through inflammation and depression, Pell said.

In this study, participants completed a baseline questionnaire and provided information about whether they had experienced pain in the last month that interfered with their usual activities. They noted if the pain was in their head, face, neck/shoulder, back, stomach/abdomen, hip, knee or all over their body. If they reported pain, they indicated whether pain had persisted for more than three months.

Depression was gauged based on participants’ responses to a questionnaire that asked about the frequency of depressed mood, disinterest, restlessness or lethargy in the previous two weeks. Inflammation was measured with blood tests for C-reactive protein (CRP).

After an average follow-up of 13.5 years, the analysis found:

  • Nearly 10% of all participants developed high blood pressure.
  • Compared to people who did not have pain, people with chronic widespread pain had the highest risk of high blood pressure (75% increased risk), while short-term pain was associated with a 10% higher risk and chronic localized pain was linked with a 20% higher risk.
  • When comparing sites of pain to people without pain, the analysis showed that chronic, widespread pain was associated with a 74% higher risk of developing high blood pressure; chronic abdominal pain with a 43% higher risk; chronic headaches with a 22% higher risk; chronic neck/shoulder pain with a 19% higher risk; chronic hip pain with a 17% higher risk; and chronic back pain with a 16% higher risk.
  • Depression accounted for 11.3%, and inflammation for 0.4%, of the association between chronic pain and high blood pressure, together explaining 11.7% of the link.

“When providing care for people with pain, health care workers need to be aware that these patients are at higher risk of developing high blood pressure, either directly or via depression. Recognising pain could help detect and treat these additional conditions early,” Pell said.

Daniel W. Jones, MD, FAHA, chair of the 2025 American Heart Association/American College of Cardiology High Blood Pressure Guideline and dean and professor emeritus of the University of Mississippi School of Medicine in Jackson, Mississippi, said, “It is well known that experiencing pain can raise blood pressure in the short term; however, we have known less about how chronic pain affects blood pressure. This study adds to that understanding, finding a correlation between the number of chronic pain sites and hypertension risk, and suggesting that the association may be mediated by inflammation and depression.”

Jones, who was not involved in this research, suggests further exploration of the relationship through randomised controlled trials of approaches to pain management and blood pressure, especially the use of nonsteroidal anti-inflammatory drugs (NSAIDs) such as ibuprofen, which may also raise blood pressure.

“Chronic pain needs to be managed within the context of the patients’ blood pressure, especially in consideration of the use of pain medication that may adversely affect blood pressure,” said Jones.

The study’s limitations include that participants were middle- and older-aged adults who were mainly white people of British origin; therefore, the study’s findings may not be generalizable to people from other racial or ethnic groups, living in other countries or adults in other age groups. In addition, the information about levels of pain was self-reported, and the study relied on clinical diagnostic coding, a one-time pain assessment and two blood pressure measurements.

Study details, background and design:

  • The study reviewed data from the UK Biobank, a large population-based study that recruited more than 500 000 adults who were ages 40-69 when they joined the study between 2006 and 2010. Participants lived in England, Scotland and Wales.
  • This analysis included 206 963 adults. The average age of the participants was 54 years; 61.7% were women, and 96.7% were white adults.
  • Among all participants, 35.2% reported experiencing chronic musculoskeletal pain. Of those with chronic pain, 62.2% reported pain at one site of the body; 34.9% reported pain at two to three musculoskeletal sites; and 3.2% reported pain at four sites.
  • When compared with participants who reported no pain, participants reporting pain were more likely to be women, to have an unhealthy lifestyle, a larger waist circumference, a higher body mass index (BMI) and more long-term health conditions, and to live in areas with higher unemployment, lower home and car ownership and more overcrowding.
  • The researchers adjusted for factors associated with both pain and high blood pressure, including self-reported smoking status, alcohol consumption, physical activity, total sedentary time, sleep duration, and fruit and vegetable intake.
  • UK Biobank data was collected at the participants’ baseline appointment through a touch-screen questionnaire, interview, physical measurements (height, weight, BMI, waist circumference, blood pressure measurement) and blood samples taken for cholesterol and blood sugar (hemoglobin A1c).
  • The participants’ hospital records identified incident cases of high blood pressure, defined using standard diagnostic codes from the International Statistical Classification of Diseases and Related Health Problems (ICD-10).
  • The study’s follow-up period for each participant ran from the baseline date until the earliest of the following: a recorded diagnosis of high blood pressure, the participant’s death, or the end of available follow-up records.

Co-authors, disclosures and funding sources are listed in the manuscript.

Social Media Use Drives Distrust Among Gen Z Teenage Girls

Photo by Freestocks on Unsplash

Social media use in adolescence is linked to delayed bedtimes, negative self-image and, especially among teenage girls, greater distrust, shows a new study from University College London. In turn, these changes are associated with more symptoms of depression and anxiety, risk of self-harm, and suicidal behaviours several years later. 

Published in Social Psychiatry and Psychiatric Epidemiology, the study examined how use of social media on the cusp of adolescence (age 11) was indirectly associated with a range of psychiatric symptoms, including psychological distress, self-harm and suicidal behaviours, in late adolescence (age 17). 

The study found three mechanisms linking social media use in early adolescence to small overall increases in subsequent mental health problems. Both boys and girls who were using social media from early on (at age 11) tended to sleep a little later on average, and had more negative thoughts about their physical appearance at age 14, compared to those who had not used social media. Crucially, teenage girls who had been using social media at age 11 reported greater distrust of other people at age 14.  

The three key mechanisms, which involved later bedtimes, more negative perceptions of body image, and distrust, mediated the association between early social media use and subsequent mental health problems. These small but significant relationships held true even after adjusting for socioeconomic and demographic factors, any maternal mental health problems, and children’s prior mental health difficulties (at age 7).  

The findings were based on data from the UK’s nationally representative Millennium Cohort Study, which was designed to track the lives of around 19 000 children born in 2000 to 2001 (and who belong to ‘Gen Z’, that is, children born between 1997 and 2012).  

During 2011-2012, at around age 11, the participants were asked: “How often do you visit a social networking website on the internet, such as Facebook or Bebo?”. Around three years later, they were followed up and asked about their usual bedtime, their trust in others, and their self-perception. A range of mental health challenges were subsequently tracked another three years later, at age 17. 

Lead author, Dr Dimitris Tsomokos (UCL Institute of Education) said: “These findings suggest that interpersonal distrust was a significant driver of psychiatric symptoms among Gen Z girls who used social media from early adolescence. 

“This distrust of others may be a particularly female response to the pressures of social media, which can sadly be fertile ground for social comparison, cyberbullying and perceived exclusion.” 

“We know that teenage girls display more empathetic concern and tend to place higher value on reciprocal relationships, and perhaps this is what drives greater distrust among them.” 

As policymakers and parents grapple with how to navigate technology use in childhood, the study’s authors recommend greater intervention in early adolescence, focused on fostering a sense of trust and social safety. They believe this can help mitigate the negative impacts of social media usage on young people’s long term mental health. 

Source: University College London

Low-dose Colchicine May Reduce Risk of Heart Attack and Stroke

Researchers found that low-dose colchicine can be used to reduce heart attacks and strokes in high-risk patients

Photo by Towfiqu Barbhuiya on Unsplash

A widely-used, inexpensive gout drug could reduce heart attacks and strokes in people with cardiovascular disease, according to a new Cochrane review.

The review examined the effects of low doses of colchicine, a drug used to treat gout, and found no increase in serious side effects.

Cardiovascular disease is often driven by chronic low-grade inflammation, which contributes to recurrent cardiovascular events such as heart attacks and strokes. Colchicine has anti-inflammatory properties that make it a promising option for people with heart disease. 

A promising effect on cardiovascular risk

The review included 12 randomised controlled trials involving nearly 23 000 people with a history of heart disease, heart attack or stroke. The studies looked at patients who took colchicine for at least six months, at doses of 0.5mg once or twice a day. Most participants were male (~80%), and mean ages across the trials ranged from 57 to 74 years. Half received colchicine, while the other half received either a placebo or no additional treatment alongside their usual care.

Overall, those taking low-dose colchicine were less likely to experience a heart attack or stroke. For every 1000 people treated, there were 9 fewer heart attacks and 8 fewer strokes compared with those not taking the drug. While no increase in serious adverse events was identified, patients who took colchicine were more likely to have stomach or digestive side effects; these were usually mild and short-lived.

Among 200 people with cardiovascular disease – where we would normally expect around seven heart attacks and four strokes – using low-dose colchicine could prevent about two of each. Reductions like this can make a real difference for patients who live with ongoing, lifelong cardiovascular risk.

– Dr Ramin Ebrahimi, co-lead author from the University Medicine Greifswald, Germany

A new use for a long-established medicine

As cardiovascular diseases are the leading cause of death globally, colchicine presents a promising inexpensive and accessible option for secondary prevention in high-risk patients.

These results come from publicly funded trials repurposing a very old, low-cost drug for an entirely new use. They show the power of academic research to reveal treatment opportunities that traditional drug development often overlooks.

 – Lars Hemkens, senior author from the University of Bern, Switzerland

The evidence is less clear on whether colchicine affects overall death rates or the need for procedures such as coronary revascularisation. The studies did not report whether the drug improves quality of life or reduces hospital stays. The authors stress that further research is needed in these areas.

Source: Cochrane

Asymptomatic Colonisers Drive the Spread of Drug-resistant Infections in Hospitals

The computer model improves on traditional methods like contact tracing by inferring asymptomatic carriers in the spread of antibiotic-resistant infections

Photo by Hush Naidoo Jade Photography on Unsplash

A new analytical tool can improve a hospital’s ability to limit the spread of antibiotic-resistant infections over traditional methods like contact tracing, according to a new study led by researchers at Columbia University Mailman School of Public Health and published in the peer-reviewed journal Nature Communications. The method infers the presence of asymptomatic carriers of drug-resistant pathogens in the hospital setting, which are otherwise invisible.

Antimicrobial resistance (AMR) is an urgent threat to human health. In 2019, 5 million deaths were associated with an AMR infection globally.

The inference framework developed by Columbia Mailman School researchers is the first to combine several data sources – patient mobility data, clinical culture tests, electronic health records, and whole-genome sequence data – to predict the spread of an AMR infection in the hospital setting. In the study, the researchers used five years of real-world data from a New York City hospital. They focused on carbapenem-resistant Klebsiella pneumoniae (CRKP), an AMR bacterium with a high mortality rate. The framework draws on the four data sources to model the spread of CRKP infections, from individual to individual over time.

Levels of CRKP colonisation in healthcare facilities vary by location but can reach up to 22% of patients. However, hospitals do not routinely screen for CRKP, and surveillance relies on testing patients who are either symptomatic or suspected of coming into contact with symptomatic patients, overlooking asymptomatic colonisers.

“Many antimicrobial-resistant organisms colonise people without causing disease for long periods of time, during which these agents can spread unnoticed to other patients, healthcare workers, and even the general community,” says the study’s first author, Sen Pei, PhD, assistant professor of environmental health sciences at Columbia Mailman School. “Our inference framework better accounts for these hidden carriers.”

The researchers used the inference framework to estimate CRKP infection probabilities despite limited data on infections. They found that combining the four data sources led to more accurate carrier identification. Furthermore, using data simulations, they found that the framework was more successful at preventing the spread of infections after isolating carriers than traditional approaches based on an individual’s time in the hospital, the number of people they came in contact with, and/or whether the people they came in contact with were identified as having infections.

Using the inference model, isolating 1% of patients on the first day of each week (10–13 patients per week) reduces positive cases by 16% and colonisation by 15%, while isolating 5% of patients (50–65 patients per week) reduces positive cases by 28% and colonisation by 23%. For comparison, under contact tracing – a typical approach in clinical settings (ie, screening close contacts of positive patients) – isolating 1% of patients reduces positive cases by 10% and colonisation by 8%, and isolating 5% reduces positive cases by 20% and colonisation by 16%.

The new study builds on a study in PNAS that introduced a method that more accurately predicts the likelihood that individuals in hospital settings are colonised with methicillin-resistant Staphylococcus aureus (MRSA) than existing approaches. The new study is a significant advance over the previous study because it now includes patient-level electronic health records and whole-genome sequence data, which allows more precise identification of silent spreaders. While the inference model improves on traditional methods, it remains challenging to eliminate AMR pathogens in hospitals due to their widespread community circulation, limited hospital surveillance, and high false-negative rates in clinical culture tests. However, there is room for improvement; a future study aims to look at the spread of AMR using ultra-dense sequencing.

Source: Columbia University Mailman School of Public Health

Trial Results Show the Value of Patient Navigation in Humanising HIV Care

Eastern Cape HIV Programme demonstrates success in resource-constrained setting

Photo by Pexels on Pixabay

A new randomised controlled trial conducted in the Eastern Cape has shown that adding structured patient navigation to same-day antiretroviral therapy (ART) can make a meaningful difference for people newly diagnosed with HIV. The trial found that patients who received support from trained navigators were far more likely to stay in care and keep their viral load low over six months. Those with navigator support had a 79% retention rate, compared with 64% under standard care.

Navigated patients were also more likely to remain in care with a suppressed viral load (below 50 copies per millilitre): 64%, compared to just 39% without this extra support(1). Patient navigation combines personal support, such as home or virtual check-ins and WhatsApp reminders, with practical help like linking people to services and monitoring their progress. It was especially effective for people who started treatment on the same day as their diagnosis.

“This approach humanises HIV care. It builds a bridge between the clinic and the community, helping patients stay connected to treatment and ultimately saving lives,” said lead author Siyakudumisa Nontamo, Facility Team Lead: Care & Treatment Programme at TB HIV Care.

In August 2024 the Human Sciences Research Council released findings from the Sixth South African HIV Prevalence, Incidence, and Behaviour Survey (SABSSM VI) for the Eastern Cape. The results show that HIV prevalence in the province has stabilised, moving from 15.9% in 2017 to 13.7% in 2022, equating to an estimated 980 000 people living with HIV, down from about 1 million in 2017. Access to treatment has improved significantly: ART coverage increased from 67.8% in 2017 to 83.5% in 2022, meaning about 723 000 people in the province are now receiving treatment. However, gaps remain among young people: only 70.9% of adolescents and youth aged 15-24 living with HIV are on ART, compared to 84.8% of adults aged 25-49. Among females, coverage is much lower for young women (68.7%) than for women aged 25-49 (88.2%). ART use also varies across districts, ranging from 69.4% in Nelson Mandela Bay to 92.0% in Alfred Nzo(2). Nationally, the proportion of people living with HIV who are on ART rose to 80.9% in 2022, up from 63.7% in 2017.

Despite major advances in antiretroviral therapy, retention in care remains a persistent challenge within South Africa’s HIV programme, especially in rural provinces such as the Eastern Cape. Many patients initiate treatment but later disengage due to stigma, transport difficulties, and limited ongoing support. The study shows that low-cost, human-centred interventions can significantly strengthen treatment outcomes. The trial, titled “Impact of Patient Navigation on Retention in Care and HIV Viral Load Suppression Among Newly Diagnosed Persons Living with HIV in the Eastern Cape,” compared standard HIV care to an approach where trained patient navigators provided ongoing support to patients starting antiretroviral therapy (ART). Beyond improved retention and viral suppression, the trial also showed that patients supported by navigators experienced fewer deaths and dropouts, with substantially lower losses to follow-up and reduced mortality than those receiving standard care, ultimately strengthening HIV programmes(1).

Patient navigation, in particular, helps bridge the gap by pairing practical healthcare coordination with empathy and community-based follow-up. Navigators assist patients with managing appointments, maintaining adherence, and accessing psychosocial services, thereby fostering trust, continuity, and sustained engagement in care. This approach aligns with South Africa’s national HIV strategy, which prioritises differentiated, patient-centred models of care to achieve the UNAIDS 95-95-95 targets.

At scale, TB HIV Care’s programmes are grounded in person-centred, integrated service models that reflect the real lives and needs of people affected by HIV and TB. This study reinforces TB HIV Care’s belief that support beyond clinic walls is essential for achieving lasting impact. In the 2024/25 reporting period, the organisation reached more than 1.9 million people with HIV testing services and initiated 27,873 individuals on ART, achieving a 95% viral suppression rate among clients in care.

“By bridging the gap between diagnosis and ongoing care, patient navigation aligns with our outreach for key populations and our shift toward holistic service delivery. We look forward to translating this evidence into practice, ensuring fewer people fall through the cracks and more sustain treatment success”, said Professor Harry Hausler, CEO at TB HIV Care.

Additional findings from the Sixth South African HIV Prevalence, Incidence, and Behaviour Survey (SABSSM VI) for the Eastern Cape.

  • In the Eastern Cape, HIV remains most common among adults aged 25-49, with a prevalence of 27.7%, and women in this age group are especially affected at 35.4% compared to 17.1% for men.
  • The survey also found geographic differences: HIV prevalence among men was highest in urban areas (8.7%), while among women it was highest in rural informal or tribal areas (19.8%).
  • By district, prevalence was highest in Chris Hani (14.4%), Amathole (14.1%), Alfred Nzo (13.9%), and lowest in Nelson Mandela Bay (9.7%).
  • At a national level, the survey showed that 81.4% of all people living with HIV were virally suppressed. The survey found encouraging progress in the Eastern Cape, where viral load suppression (VLS) among people living with HIV rose to 79.3% in 2022, up from 66.3% in 2017. However, children aged 0-14 years had much lower suppression levels, at 61.4%. Among people aged 15-49 years living with HIV, 78.6% were virally suppressed. Within this group, women had far higher suppression rates (83.9%) than men (65.4%).

About the Randomised Controlled Trial

The randomised controlled trial involved participants from HIV testing sites in the O.R. Tambo District (Flagstaff, Mthatha Gateway, and Tsolo Clinics). It was approved by the Eastern Cape Health Research Committee and Walter Sisulu University’s Ethics Committee. The study was supported by the Chemical Industries Education and Training Authority (CHIETA) and the South African Medical Research Council’s Strategic Health Innovation Partnerships (SHIP).

References:

  1. Nontamo, S., Kamsu, G.T., Ndebia, E.J., et al. Impact of Patient Navigation on Retention in Care and HIV Viral Load Suppression Among Newly Diagnosed Persons Living with HIV in the Eastern Cape – South Africa.
  2. Human Sciences Research Council. Sixth South African HIV Prevalence, Incidence, and Behaviour Survey (SABSSM VI).

Study Untangles the Complex Relationship Between Cannabis and Binge Drinking

Photo by Pavel Danilyuk on Pexels

Binge drinking is most common among younger adults, and using cannabis during late adolescence or early adulthood is known to increase the risk of engaging in binge drinking. Now, new research from the Arizona State University Department of Psychology shows that this increase in risk of binge drinking from cannabis use varies with age, peaking around age 20.

“We found that during ages 18 to 20, cannabis motivates people to binge drink more often, while later in adulthood, around age 24, it motivates them to binge drink less. This dichotomy has consequences for prevention and treatment efforts,” said Jack Waddell, assistant professor of psychology at ASU and first author on the study.

The study used cannabis use and alcohol consumption data from the National Consortium on Alcohol and Neurodevelopment in Adolescence, a long-term study of over 500 participants with sites in California, Oregon, North Carolina and Pennsylvania. The work was published in Alcohol Clinical and Experimental Research.

Not just one substance

Waddell described the interaction of cannabis use and alcohol consumption as a complex relationship. 

He has previously found that individuals who use both alcohol and cannabis report higher rates of substance use disorder than those who use just one. Yet, he has also found that many individuals who use both alcohol and cannabis perceive using them together as being protective against some of the negative consequences of excessive drinking.

In the current study, he and his collaborators expected using cannabis to consistently increase the likelihood of the study participants engaging in binge drinking, not for it to flip from enabling excessive drinking in late teens and early 20s to blunting it around age 24.

“People are reducing their binge drinking but they’re switching to cannabis. This can be viewed positively from a harm-reduction standpoint, but it is important to understand that there are still a lot of risks associated with cannabis use,” Waddell said.

Digging into the dynamics of substance use

Waddell wants to understand how people end up using more than one substance, and to do this, he plans to study how people think about and use substances on a day-to-day basis.

“What is it that motivates the transition from using one substance to more than one? Is it someone’s affective experiences – their emotions and moods – whenever they’re using alcohol or cannabis that makes them want to add the other? Is it the social environment?” he asked.

Going forward, Waddell plans to use technology-enhanced momentary assessments, which are questionnaires or check-ins delivered by push notification on an app or text message, to study people’s behavior in the moment. 

Having a finer-grained level of access to how different kinds of substance use interact with and influence each other will lead to better treatment and prevention strategies.

No Increased Safety Risk for Obese Patients Undergoing Shoulder Replacement Surgery

Underweight patients may face higher risk of poor outcomes after surgery

Source: Pixabay CC0

Higher BMI is not linked to increased risk of death or other complications following shoulder replacement surgery, according to a new study by Epaminondas Markos Valsamis from the University of Oxford, UK, and colleagues publishing November 20th in the open-access journal PLOS Medicine.

Joint replacement surgeries – including hip, knee and shoulder replacements – can significantly improve quality of life. Many patients with obesity are denied these procedures despite a lack of formal recommendations from national organisations. Evidence on the risks of joint replacement surgery in patients with obesity is limited and mixed.

In this study, researchers analysed more than 20 000 elective shoulder replacement surgeries performed across the UK and Denmark to see whether BMI was associated with death or other complications.

Compared to patients with a healthy BMI (21.75 kg/m2), patients with obesity (BMI 40 kg/m2) had a 60% lower risk of death within the year following surgery. Those considered underweight (BMI <18.5 kg/m2) had a slightly higher risk of death. The findings do not support restricting patients with a high BMI from elective shoulder replacement surgery, despite evidence that some hospitals are beginning to impose such restrictions.

One main limitation of this study was the small sample size of the underweight population (131 for the UK data, 70 for the Denmark data). However, this was a large study that consistently showed a lower risk of death and complications in patients with obesity undergoing shoulder replacement surgery across multiple outcomes and two countries. The results can help patients, surgeons, and policymakers make informed decisions about who should be considered fit for these surgeries.

Lead author Epaminondas Markos Valsamis says, “Shoulder replacements offer patients the opportunity for excellent pain relief and improved quality of life. Our research shows that patients with a higher BMI do not have poorer outcomes after shoulder replacement surgery.”

Senior author Professor Jonathan Rees adds, “While BMI thresholds have been used to limit access to joint replacement surgery, our findings do not support restricting higher BMI patients from accessing shoulder replacement surgery.”

Provided by PLOS

New Metric Better Predicts Which Drug-induced Liver Injury Patients Need Transplant

Patients who took herbal or dietary supplements found to have lowest likelihood of survival

Photo by Myriam Zilles on Unsplash

A newly developed tool, called the DILI-Inpt prognostic score, can predict patients with drug-induced liver injury who are unlikely to survive without a liver transplant.

In study results published in Clinical Gastroenterology and Hepatology, the DILI-Inpt prognostic score outperformed existing systems in identifying which hospitalised patients with severe idiosyncratic drug-induced liver injury were unlikely to recover on their own.

“We have struggled for many years to identify which patients with severe DILI may need to be evaluated for emergency liver transplantation, versus recovery with supportive care,” said Robert Fontana, MD, Michigan Medicine hepatologist, professor of internal medicine and the study’s senior author.

“The stakes are high. And it is made even more difficult by the small number of prior cases we have seen. This study provides important data for all of us to use and help manage our patients.”

The acronym DILI refers to idiosyncratic drug-induced liver injury, an uncommon condition caused by a variety of drugs and herbal and dietary supplements.

While most patients who experience such liver injuries recover after discontinuation of the culprit drugs, some advance to acute liver failure and may require liver transplantation.

The DILI-Inpt prognostic score aims to better assess such patients so that they can be more quickly sent to a liver transplant centre or placed on the waiting list.

This study used data from 305 adults enrolled between 1998 and 2019 in a national database of acute liver failure and acute liver injury patients. The drugs that induced liver injuries in these patients varied and included antimicrobials (42.6%), herbal-dietary supplements (16%) and psychoactive drugs (9.8%).

After 21 days, 110 patients (36%) spontaneously survived – ie, recovered on their own after discontinuing the drug – while 115 required liver transplant and 80 died. For these 305 patients, a variety of test results were analysed, including total bilirubin, serum ALT and creatinine values.

Using multivariable logistic regression modelling, the DILI-Inpt prognostic score was developed to predict which patients were most likely to require liver transplant and were at highest risk of death. The area under the receiver operating characteristic curve (AUROC) for the DILI-Inpt prognostic score was 0.86, significantly higher than that of the MELD score (0.79) and the King’s College Criteria (0.63).

These results suggest that the DILI-Inpt prognostic score, which is composed of two readily available blood tests (total bilirubin and INR values) and two clinical parameters (encephalopathy grade and use of herbal products), better predicts which patients will not spontaneously survive than these existing scoring systems.
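As a rough illustration of the kind of modelling described above – fitting a multivariable logistic regression on a handful of predictors and evaluating it by AUROC – the sketch below uses entirely synthetic data and generic fitted coefficients, not the published DILI-Inpt model. The four predictor columns simply mirror the score's stated inputs (total bilirubin, INR, encephalopathy grade, herbal product use); all numbers are invented for demonstration.

```python
# Illustrative sketch only: synthetic data, not the published DILI-Inpt model.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 305  # same cohort size as the study, but the data here are simulated

# Hypothetical predictors mirroring the score's stated inputs
X = np.column_stack([
    rng.lognormal(2.5, 0.5, n),   # total bilirubin
    rng.lognormal(0.8, 0.3, n),   # INR
    rng.integers(0, 5, n),        # encephalopathy grade (0-4)
    rng.integers(0, 2, n),        # herbal/dietary supplement use (0/1)
])

# Simulated outcome: 1 = did not spontaneously survive (transplant or death).
# Made to depend loosely on the predictors so the model has signal to find.
logit = 0.02 * X[:, 0] + 1.0 * X[:, 1] + 0.6 * X[:, 2] + 0.5 * X[:, 3] - 5.0
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit))).astype(int)

# Fit the logistic model and score it by AUROC, as the study did
model = LogisticRegression(max_iter=1000).fit(X, y)
probs = model.predict_proba(X)[:, 1]
auroc = roc_auc_score(y, probs)
print(f"AUROC: {auroc:.2f}")
```

In practice such a score would be validated on held-out patients rather than in-sample, and its AUROC compared against existing systems (MELD, King's College Criteria) on the same cohort.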

Of note, the diagnosis of drug-induced liver injury is frequently delayed or missed because of the condition’s low incidence and the need to exclude more common causes of liver injury.

Since patients with severe DILI have a low likelihood of spontaneous recovery, there is an urgent need to quickly identify which patients might require liver transplant.

“Another important finding in our study was that patients with herbal and dietary supplement hepatotoxicity had the lowest likelihood of survival and that the proportion of herbal cases was increasing over time in the United States,” Fontana said.

“Our data indicates that further research as to why and how botanical products may lead to potentially severe liver injury in otherwise healthy people is needed.”

Source: University of Michigan

Study Reveals the Dual Role for a Protein Critical for Healing Nerve Damage

Sarm1 appears to be essential for regeneration

Source: CC0

Nerve damage can be an unfortunate side effect from an accident, illness or even certain treatments, like chemotherapy. Fortunately, the peripheral nervous system can heal itself to a certain extent, albeit very slowly. Researchers are still trying to understand this natural healing process in order to improve it. A recent study published in Science Translational Medicine sheds new light on this.

This mouse-based study from the University of Michigan adds to the evidence regarding a specific protein inside of the nerves, called Sarm1, that seems key for regeneration. Previous studies have revealed that when Sarm1 is activated, it sets off the degenerative process in nerves. The thinking has been that for conditions like chemotherapy-induced peripheral neuropathy, diabetes, or nerve trauma, blocking Sarm1 would beneficially block the breakdown of nerves.

But what else would blocking Sarm1 affect?

“We know that nerve breakdown after an injury is quite efficient, and the breakdown is what Sarm1 controls. So, there must be a biological reason for this breakdown to be so quick and efficient,” said Ligia B. Schmitd, PhD, of the Department of Cell and Developmental Biology, lead author of the study.

Schmitd is a research fellow in the lab of Roman Giger, PhD, co-senior author with Ashley Kalinski of the University of South Carolina.

Using mice bred to lack Sarm1 and subjecting them to peripheral nerve injury, the team observed drastic changes to the distal nerve environment, including fewer blood-borne immune cells and, as a result, reduced nerve inflammation.

“These cells are important because they have to enter the injured nerve to clean up all of the debris,” said Schmitd.

More importantly, their study revealed a critical effect on Schwann cells, which line and support the peripheral nerves.

Normally following an injury, Schwann cells will convert to a repair state in which they express different genes and proteins to migrate and proliferate in order to regrow the axon, the long projecting portion of the neuron.

But without Sarm1, “the Schwann cells are just stuck there,” said Schmitd.

In essence, Sarm1 controls both nerve degeneration and regeneration through its effect on Schwann cells.

The team also noted that a lack of Sarm1 seemed to boost the nerve’s efforts to regrow, but without activating the repair Schwann cells, these efforts were much less efficient.

“For a long time, we’ve thought that simply preventing nerve breakdown would be a good thing. What our study now shows is that this early breakdown also sends powerful signals to Schwann cells and immune cells that are needed for efficient repair, so any future therapy that targets Sarm1 will have to preserve that delicate balance between protection and regeneration,” said Giger, professor in the Department of Cell and Developmental Biology.

Schmitd notes that the study needs to be done in other animal models and with other proteins involved in nerve repair, “but if this proves to be an important mechanism for triggering the repair Schwann cell state, then down the road, fixing this response could help humans regenerate peripheral nerves.”

Source: University of Michigan Medicine

Study Links Food Insecurity to Tumour Growth in Paediatric Neuroblastoma

How food insecurity may biologically intensify neuroblastoma growth, bridging social determinants of health and cancer biology 

Image Credit: Justine Ross, Michigan Medicine

Neuroblastoma remains one of the deadliest childhood malignancies, accounting for a disproportionate number of paediatric cancer deaths worldwide.

Despite major therapeutic advances, survival rates remain lower for children from socioeconomically disadvantaged families, a pattern long observed and poorly understood at the biological level.

Extending earlier National Institutes of Health Children’s Oncology Group findings that linked poverty to poorer survival in paediatric cancers, investigators at University of Michigan Health C.S. Mott Children’s Hospital set out to develop the first experimental model to test how social determinants might influence tumour biology itself.

The team, led by Erika Newman, MD, Section Head of Pediatric Surgery and Associate Director for Health Equity at the Rogel Comprehensive Cancer Center, developed an innovative murine cancer model that simulated food insecurity by intermittently varying access to chow, mirroring the unpredictable nutrition many families experience.

The study, recently published in Communications Biology, used established, validated neuroblastoma xenograft models to observe how this stressor affected tumour growth and biologic responses.

The results were striking: the experimental group exposed to food insecurity developed significantly larger and bulkier tumours, accompanied by persistent elevation of stress hormones (corticosterone) and activation of tumour survival pathways.

“Our work builds on decades of clinical evidence linking poverty and food insecurity to poorer cancer outcomes,” said Newman.

“We set out to define the biology behind those disparities, to show how social conditions can become embedded in the body and influence how tumours grow.”

The work provides a translational framework linking social determinants of health to molecular pathways of cancer progression, paving the way for future studies that explore how interventions addressing nutrition and stress might improve treatment response.

“This model gives us a scientific bridge between social context and cancer biology,” stated Newman.

“It shows that the environments our patients live in – access to food, stability, and safety – are not background conditions. They are part of the biology we must confront if we want equitable outcomes.”

The research arrives at a moment of renewed concern over federal nutrition programs, with potential SNAP benefit interruptions amid government budget negotiations.

Newman emphasises that these findings reinforce the urgency of policies ensuring consistent food access for vulnerable children and families.

Newman stresses that health care must account for the realities in which families live.

She calls for systematic screening of social determinants like food insecurity and economic strain within paediatric and oncology practices, ensuring that medical care addresses both biologic and social drivers of outcome disparities.

Source: University of Michigan Medicine