Author: ModernMedia

Study Identifies Risk Factors in Heavy Drinkers for Advanced Liver Disease

Photo by Pavel Danilyuk on Pexels

A new study finds that heavy drinkers with diabetes, high blood pressure or a high waist circumference are as much as 2.4 times as likely to develop advanced liver disease.

Why do some heavy drinkers develop advanced liver disease while others do not? The answer may lie in three common underlying medical conditions, according to a new study published in Clinical Gastroenterology and Hepatology from Keck Medicine of USC. The research found that heavy drinkers with diabetes, high blood pressure or a high waist circumference are as much as 2.4 times more likely to develop advanced liver disease.

“The results identify a very high-risk segment of the population prone to liver disease and suggest that preexisting health issues may have a large impact on how alcohol affects the liver,” said Brian P. Lee, MD, MAS, a hepatologist and liver transplant specialist with Keck Medicine and principal investigator of the study. 

Diabetes, high blood pressure and a high waist circumference (89 cm for women; 101 cm for men), which is associated with obesity, belong to a cluster of five health conditions known as cardiometabolic risk factors, which influence an individual’s risk for heart attack and stroke.

Cardiometabolic risk factors have been linked to the buildup of fat in the liver (also known as metabolic dysfunction-associated steatotic liver disease), which can lead to fibrosis, or scarring of the liver. These risk factors affect more than one in three Americans, and cardiometabolic health has been worsening among the population, especially among those under 35, according to Lee.  

Alcohol also causes fat buildup in the liver, and alcohol consumption has been on the rise since the COVID-19 pandemic, said Lee. Due to the prevalence of both cardiometabolic risk factors and drinking in the United States, Lee and his fellow researchers undertook the study to investigate which cardiometabolic risk factors predisposed the liver to damage from alcohol. 

They analysed data from the National Health and Nutrition Examination Survey, a large national survey of more than 40 000 participants, looking at the intersection of heavy drinking, individual cardiometabolic risk factors and the incidence of significant liver fibrosis – liver scarring that can lead to liver failure.

For the study, heavy drinking was characterised as 1.5 drinks (20 grams of alcohol) a day for women and two drinks (30 grams) a day for men.

Researchers discovered that heavy drinkers with either diabetes or a high waist circumference were 2.4 times more likely to develop advanced liver disease, and those with high blood pressure were 1.8 times more likely. They found that the other two cardiometabolic risk factors – high triglycerides and low HDL (high-density lipoprotein) cholesterol – had less significant correlations to liver disease.
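The study’s cut-offs and risk multipliers can be summarised in a short sketch. This is purely illustrative, not a clinical tool: the function and field names below are invented for this example, and only the thresholds and multipliers come from the article.

```python
# Illustrative only: encodes the thresholds and risk multipliers reported in
# the article. Not a clinical tool; all names are invented for this sketch.

HIGH_WAIST_CM = {"female": 89, "male": 101}            # high waist circumference
HEAVY_DRINKING_G_PER_DAY = {"female": 20, "male": 30}  # ~1.5 / 2 drinks a day

def is_heavy_drinker(sex, alcohol_g_per_day):
    return alcohol_g_per_day >= HEAVY_DRINKING_G_PER_DAY[sex]

def liver_risk_multiplier(sex, alcohol_g_per_day, waist_cm, diabetes, hypertension):
    """Rough multiplier for advanced liver disease risk among heavy drinkers:
    2.4x for diabetes or high waist circumference, 1.8x for high blood pressure."""
    if not is_heavy_drinker(sex, alcohol_g_per_day):
        return 1.0
    if diabetes or waist_cm >= HIGH_WAIST_CM[sex]:
        return 2.4
    if hypertension:
        return 1.8
    return 1.0

# A heavy-drinking man (35 g/day) with a 105 cm waist falls in the 2.4x group.
print(liver_risk_multiplier("male", 35, 105, diabetes=False, hypertension=False))  # 2.4
```

In reality the published estimates are adjusted association measures rather than multipliers to be applied mechanically, so treat this strictly as a restatement of the article’s numbers.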

While the study did not analyse why these three cardiometabolic risk factors are more dangerous for the liver, Lee speculates that the conditions share a common pathway to fat buildup in the liver that, when combined with the extra fat deposited by excessive alcohol, can cause significant damage.

Lee stresses that the study does not imply it is safe for those without these three cardiometabolic risks to consume large amounts of alcohol. “We know that alcohol is toxic to the liver and all heavy drinkers are at risk for advanced liver disease,” he said.  

Lee hopes that the study results will encourage people to consider their individual health and risk profile when making decisions about alcohol consumption. He would also like to see practitioners offer more personalized health screenings and interventions for those who drink with cardiometabolic risk factors so that liver damage among this high-risk group can be caught early and treated.  

Source: University of Southern California – Health Sciences

Progress and Challenges in the Development of Brain Implants

Pixabay CC0

In a paper recently published in The Lancet Digital Health, a scientific team led by Stanisa Raspopovic from MedUni Vienna looks at the progress and challenges in the research and development of brain implants. New achievements in the field of this technology are seen as a source of hope for many patients with neurological disorders and have been making headlines recently. As neural implants have an effect not only on a physical but also on a psychological level, researchers are calling for particular ethical and scientific care when conducting clinical trials.

The research and development of neuroprostheses has entered a phase in which experiments on animal models are being followed by tests on humans. Only recently, reports of a paraplegic patient in the USA who was implanted with a brain chip as part of a clinical trial caused a stir. With the help of the implant, the man can control his wheelchair, operate the keyboard on his computer and use the cursor in such a way that he can even play chess. About a month after the implantation, however, the patient noticed that the precision of the cursor control was decreasing and that a delay was developing between his thoughts and the computer’s actions.

“The problem could be partially, but not completely, resolved – and illustrates just one of the potential challenges for research into this technology,” explains study author Stanisa Raspopovic from MedUni Vienna’s Center for Medical Physics and Biomedical Engineering, who published the paper together with Marcello Ienca (Technical University of Munich) and Giacomo Valle (ETH Zurich). “The questions of who will take care of the technical maintenance after the end of the study and whether the device will still be available to the patient at all after the study has been cancelled or completed are among the many aspects that need to be clarified in advance in neuroprosthesis research and development, which is predominantly industry-led.”

Protection of highly sensitive data

Neuroprostheses establish a direct connection between the nervous system and external devices and are considered a promising approach in the treatment of neurological impairments such as paraplegia, chronic pain, Parkinson’s disease and epilepsy. The implants can restore mobility, alleviate pain or improve sensory functions. However, as they form an interface to the human nervous system, they also have an effect on a psychological level: “They can influence consciousness, cognition and affective states and even free will. This means that conventional approaches to safety and efficacy assessment, such as those used in clinical drug trials, are not suitable for researching these complex systems. New models are needed to comprehensively evaluate the subjective patient experience and protect the psychological privacy of the test subjects,” Raspopovic points out.

The special technological features of neuroimplants, in particular the ability to collect and process neuronal data, pose further challenges for clinical validation and ethical oversight. Neural data is considered particularly sensitive and requires an even higher level of protection than other health information. Unsecured data transmission, inadequate data protection guidelines and the risk of hacker attacks are just some of the potential vulnerabilities that require special precautions in this context. “The use of neural implants cannot be reduced to medical risks,” summarises Stanisa Raspopovic. “We are only in the initial phase of clinical studies on these technological innovations. But questions of ethical and scientific diligence in dealing with this highly sensitive topic should be clarified now and not only after problems have arisen in test subjects or patients.”

Source: Medical University of Vienna

Link between Early and Long-term Side Effects from Prostate Cancer Radiotherapy

Credit: Darryl Leja National Human Genome Research Institute National Institutes Of Health

Men undergoing radiation therapy for prostate cancer who experience side effects early in treatment may face a higher risk of developing more serious long-term urinary and bowel health issues, according to a new study led by investigators from the UCLA Health Jonsson Comprehensive Cancer Center.

The study found that patients who experienced moderate acute urinary side effects in the first three months after radiation were nearly twice as likely to develop late urinary complications years later compared to those without early symptoms. Similarly, patients with early bowel side effects had nearly double the risk of chronic bowel issues.

The findings, published in The Lancet Oncology, highlight the importance of developing strategies to better manage acute toxicities to help improve long-term outcomes and quality of life for patients.

“Men with prostate cancer are living longer than ever, and our goal is to reduce the risk of late toxicities, such as difficulty urinating or rectal bleeding, that can impact a patient’s quality of life for years,” said Dr Amar Kishan, executive vice chair of radiation oncology and senior author of the study. “This study highlights innovations we’re developing, such as using smaller treatment margins in prostate radiation to minimize early side effects, that can lead to lasting benefits by also reducing the risk of long-term complications for patients.”

Radiation therapy is a key treatment for localised prostate cancer, often involving higher doses to better control the disease. While this approach is effective, it can also harm nearby healthy tissues, causing acute and late-term side effects.

Acute toxicity refers to side effects that occur during treatment or within the first three months after it ends, and they are typically temporary. Common urinary side effects include increased frequency of urination, difficulty urinating and discomfort during urination. Bowel-related side effects may include softer stools or diarrhea, as well as rectal discomfort during bowel movements.

Late toxicity, on the other hand, can appear months or even years later and can last for years. Late urinary toxicities include narrowing of the urethra and having blood in the urine. Late bowel toxicities include having blood in the stool or having an ulcer in the wall of the rectum. These issues often can have a bigger impact on a person’s quality of life compared to acute side effects.

While both acute and late toxicities are caused by radiation’s effect on healthy tissues, the connection between the two hasn’t been well-studied, particularly using large-scale data. 

To better understand this relationship, the researchers analysed data from over 6500 patients from six randomised phase 3 clinical trials that shared detailed, individual-level data on short-term and long-term side effects affecting the urinary and bowel systems.

The researchers found patients with moderate or worse early side effects were more likely to experience severe late effects, even years after treatment. Men with early urinary or bowel issues were also more likely to report significant drops in their ability to manage daily activities and overall quality of life.

For urinary toxicity, experiencing acute toxicity increased the rate of late toxicity from 7.5% to 12.5%, and for bowel toxicity, experiencing acute toxicity increased the rate of late toxicity from 12.7% to 22.5%.

The odds of a clinically significant decline in urinary quality of life were 1.4 times as high for men who had moderate acute urinary toxicity. The odds of a clinically significant decline in bowel quality of life were 1.5 times as high for men who had moderate acute bowel toxicity.
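As a rough arithmetic check, the unadjusted relative risks implied by the rates quoted above can be computed directly. These are back-of-envelope figures and differ from the adjusted odds ratios the study reports:

```python
# Raw relative risks from the late-toxicity rates quoted in the article.
# Unadjusted back-of-envelope figures, not the study's own estimates.

def relative_risk(rate_without_acute, rate_with_acute):
    return rate_with_acute / rate_without_acute

urinary_rr = relative_risk(7.5, 12.5)   # late urinary toxicity rates, %
bowel_rr = relative_risk(12.7, 22.5)    # late bowel toxicity rates, %
print(round(urinary_rr, 2), round(bowel_rr, 2))  # 1.67 1.77
```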

“These results show that acute toxicities following prostate radiotherapy are associated with late toxicities months and years later,” said first author Dr John Nikitas, oncology resident at UCLA Health. “This underscores the importance of measures that reduce the risk of acute toxicities because they may also potentially improve long-term outcomes and quality of life for patients.”

Kishan emphasised the potential impact of newer techniques to reduce both acute and late toxicities:

“Reducing early side effects through advanced techniques like MRI-guided radiation, which allows for more precise targeting of tumours, and urethral-sparing methods, which use spacers between the prostate and rectum to protect surrounding tissues, could potentially help lower the risk of lasting side effects.”

However, more studies are needed to determine if specific strategies to reduce early side effects will improve long-term outcomes and whether treating short-term side effects early can help prevent long-term complications.

Source: University of California – Los Angeles Health Sciences

UTI Pain Stems from Hypersensitivity in Nerves for Bladder Fullness

Photo by Jan Antonin Kolar on Unsplash

New insights into what causes the painful and disruptive symptoms of urinary tract infections (UTIs) could offer hope for improved treatment. Nearly one in three women will experience UTIs before the age of 24, and many elderly people and those with bladder issues from spinal cord injuries can experience multiple UTIs in a single year.

Findings from a new study led by Flinders University’s Dr Luke Grundy and SAHMRI’s Dr Steven Taylor show that UTIs cause the nerves in the bladder to become hypersensitive, resulting in the extremely painful and frequent urge to urinate, pelvic pain, and burning pain while urinating.

“We found that UTIs, caused by bacterial infections such as E. coli, can significantly alter the function and sensitivity of the nerves that usually detect bladder fullness, a phenomenon known as ‘bladder afferent hypersensitivity’,” says Dr Grundy, from the College of Medicine and Public Health.

“The study was the first of its kind to explore the impact of UTIs on the sensory signals that travel from the bladder to the brain, and the direct link this response has to causing bladder pain and dysfunction.”

A normal bladder will expand to store up to two cups of urine for several hours. Once full, the bladder’s nervous system will signal that it is time to urinate, or empty the bladder.

In the study, described in Brain, Behavior, & Immunity – Health, researchers analysed how UTIs cause sensory nerves that respond to bladder distension to become hypersensitive, sending signals of bladder fullness even when the bladder is not yet full.

“Our findings show that UTIs cause the nerves in the bladder to become overly sensitive, which means that even when the bladder is only partly filled, it can trigger painful bladder sensations that would signal for the need to urinate,” he says.

“We think that these heightened sensory responses may serve as a protective mechanism, alerting the body to the infection and prompting more frequent urination to expel the bacteria.”

Building on previous research, the new study reveals a deeper understanding of how UTIs affect bladder function and the nervous system, and raises important questions about the role of bladder hypersensitivity in the development of UTI-related symptoms.

“Our findings go further in identifying the significant changes that occur during UTIs and provide a clearer picture of the mechanisms behind the painful and disruptive bladder sensations often associated with these infections,” says Dr Grundy.

The study also suggests that better understanding and targeting of bladder afferent hypersensitivity could improve treatment options for patients suffering from recurrent UTIs or other bladder conditions where sensory dysfunction plays a role.

“Theoretically we should be able to find a way to address hypersensitive nerves in the bladder and reduce or eliminate the painful and debilitating symptoms of a UTI,” he adds. This would improve quality of life whilst antibiotics are taking care of the infection.

Researchers are striving to address the limited treatments available for bladder pain by exploring how the findings may translate into clinical practice and improve the management of UTIs in patients.

Source: Flinders University

What are the Best Methods to Treat Rotator-cuff Tears?

Photo by Kampus Production

Rotator-cuff disorders are the most common cause of shoulder symptoms. Tears of the rotator cuff can result from a substantial traumatic injury or can occur slowly over time. Most degenerative tears occur in the dominant arm of adults over the age of 40, and their prevalence increases with age. A variety of treatments are available.

In a recent publication in the New England Journal of Medicine, University of Michigan Health professor and chair of Physical Medicine and Rehabilitation, Nitin Jain, MD, MSPH, talks through the different treatments available for rotator-cuff tears to help bring together a better understanding of how to treat the issue for patients and providers.

Nonoperative treatment, such as physical therapy, is the typical approach to treating rotator-cuff tears. However, surgery is considered in certain patients whose rotator-cuff tears don’t resolve with nonoperative treatments. Topical treatments also exist, such as glyceryl trinitrate (nitroglycerine) gel, which appears to have the greatest success among them.

Symptoms of rotator-cuff tears

“Patients with nontraumatic or degenerative rotator-cuff tears typically experience an onset of shoulder pain that seems to have no cause,” said Jain. 

“However, it is not uncommon for tears to be asymptomatic and become slowly painful over time, or even cause no pain at all.”

Jain says there are some activities that make the injury more painful as the tear worsens. This can include sleeping on your shoulder, overhead activities and/or lifting items above your shoulder level.

“Rotator-cuff tears may also grow over time, but there’s a lack of correlation between patient symptoms and the size and thickness of the tear,” explained Jain.

The tear usually affects your active range of motion and arm strength, which providers assess using certain protocols when searching for a diagnosis.

Rehabilitation and physical therapy for rotator-cuff tears 

This is the most common form of treatment for rotator-cuff tears, says Jain: “It is recommended that as the first line of specialist referral, patients seek care from a physical medicine and rehabilitation doctor (physiatrist) or sports medicine doctor.

“Rehabilitation and physical therapy routines address areas such as periscapular muscle weakness, scapular posture, and rotator-cuff muscle strength and endurance.”

In observational studies, more than 80% of patients who received supervised physical therapy reported reduced pain and improved function between six months and a year later. However, the trial populations consisted of patients with various types of rotator-cuff injuries, and advanced imaging was not required to confirm their diagnoses.

“One of the biggest factors in a successful rehabilitation was trust from patients that their physical therapy routine would improve their rotator-cuff condition,” said Jain. “The more patients leaned into the physical therapy routine, the better their outcomes were.”

Other nonpharmacologic therapies for rotator-cuff tears

Evidence suggests that psychosocial distress and depression are associated with shoulder pain and reduced function in patients with rotator-cuff tears.

“Despite this, though, there isn’t much data supporting psychosocial interventions in the treatment of rotator-cuff disorders, even though they show benefit in the treatment of other musculoskeletal disorders such as lower back pain,” said Jain.

In addition to the lack of data for psychological interventions for rotator-cuff repairs, there’s also a lack of high quality trials supporting the use of manual therapy, massage therapy, acupuncture, therapeutic ultrasonography, transcutaneous electrical nerve stimulation, shock-wave therapy or pulsed-electromagnetic-field therapy.

Topical and oral medications and injections for rotator-cuff tears

There isn’t a lot of evidence supporting the use of topical medications in treating rotator-cuff disorders. The topical treatment with the best outcomes so far has been glyceryl trinitrate.

In a small randomised trial, it showed short-term benefits in the treatment of rotator-cuff disorders, but the trial also found a considerable bias towards this treatment among its participants.

Topical nonsteroidal anti-inflammatory drugs such as diclofenac and ketoprofen have also been effective in providing pain relief in chronic musculoskeletal pain and tendinitis, and have a better safety profile than oral nonsteroidal anti-inflammatory drugs.

“But high quality evidence supporting their use in rotator-cuff disorders is still lacking,” explained Jain.

For oral medications, randomised research trials have shown that oral nonsteroidal anti-inflammatory drugs (NSAIDs) reduced pain, although modestly, in patients with rotator-cuff disorders. 

“Opioid drugs are generally not recommended due to risks associated with their use and lack of evidence of superiority to nonopioid therapy in a variety of musculoskeletal conditions,” said Jain.

Jain says acetaminophen hasn’t been studied specifically in rotator-cuff disorders, but what has been studied has shown little or no benefit regarding pain or function.

“Rigorous evidence is lacking to inform the use of pain-modulating drugs such as gabapentin, duloxetine, and pregabalin, specifically regarding the nonoperative treatment of rotator-cuff disorders,” said Jain.

Injection of a glucocorticoid, together with a local anaesthetic, has been reported to provide symptomatic pain relief in patients with rotator-cuff disorders.

Small trials have shown short-term pain relief, lasting about four weeks, from this method. The injections are performed in the subacromial space for those with subacromial impingement syndrome. Some centres use ultrasound guidance to administer this treatment, which can reduce the risk of an inadvertent injection into the tendon.

Surgical interventions for rotator-cuff injuries

“Surgical interventions are not the initial recommendation when it comes to rotator-cuff repairs. However, they may be considered in some patients whose condition does not improve with conservative treatment,” said Jain.

Observational data suggest that surgery is associated with better function and reduced pain in patients who are under 65 years of age and have smaller tears.

Surgical repairs are mostly performed arthroscopically, repairing the torn tendon and resecuring it to the humerus to allow tendon-to-bone healing, with a low incidence of complications, explains Jain.

The hypothesis that surgical intervention can reduce the progression of muscle degradation has led some experts to recommend early surgical intervention, but data are still lacking on outcomes of early surgery compared with later surgery.

Source: Michigan Medicine – University of Michigan

Could the Contraceptive Pill Reduce the Risk of Ovarian Cancer?

Photo by Reproductive Health Supplies Coalition on Unsplash

It’s a little pill with big responsibilities. But despite its primary role to prevent pregnancy, the contraceptive pill (or ‘the Pill’) could also help reduce the risk of ovarian cancer, according to new research from the University of South Australia.

Screening for risk factors of ovarian cancer using artificial intelligence, UniSA researchers found that the oral contraceptive pill reduced the risk of ovarian cancer by 26% among women who had ever used the Pill, and by 43% for women who had used the Pill after the age of 45.

The study, published in the International Journal of Gynecologic Cancer, also identified some biomarkers associated with ovarian cancer risk, including several characteristics of red blood cells and certain liver enzymes in the blood, with lower body weight and shorter stature associated with a lower risk of ovarian cancer.

Researchers also found that women who had given birth to two or more children had a 39% reduced risk of developing ovarian cancer compared to those who had not had children.
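For readers translating these headline figures: a percent risk reduction corresponds to a risk ratio of one minus the reduction. The mapping below is simple arithmetic restating the article’s numbers, not an additional output of the study:

```python
# Converts the reported percent risk reductions into risk ratios.
def risk_ratio(percent_reduction):
    return 1 - percent_reduction / 100

for label, reduction in [("ever used the Pill", 26),
                         ("last used the Pill after 45", 43),
                         ("two or more children", 39)]:
    # e.g. "ever used the Pill: risk ratio 0.74"
    print(f"{label}: risk ratio {risk_ratio(reduction):.2f}")
```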

UniSA researcher Dr Amanda Lumsden says understanding risks and preventative factors for ovarian cancer is key for improved treatment and outcomes.

“Ovarian cancer is notoriously diagnosed at a late stage, with about 70% of cases only identified when they are significantly advanced,” Dr Lumsden says.

“Late detection contributes to a survival rate of less than 30% over five years, in comparison to more than 90% for ovarian cancers that are caught early. That’s why it’s so important to identify risk factors.

“In this research, we found that women who had used the oral contraceptive pill had a lower risk of ovarian cancer. And those who had last used the Pill in their mid-40s, had an even lower level of risk.

“This poses the question as to whether interventions that reduce the number of ovulations could be used as a potential target for prevention strategies for ovarian cancer.”

Supported by the Medical Research Future Fund (MRFF), the study used artificial intelligence to assess the data of 221 732 females (aged 37-73 at baseline) in the UK Biobank.

Machine learning specialist, UniSA’s Dr Iqbal Madakkatel, says the study shows how AI can help to identify risk factors that may otherwise have gone undetected.

“We included information from almost 3000 diverse characteristics related to health, medication use, diet and lifestyle, physical measures, metabolic, and hormonal factors, each measured at the start of the study,” Dr Madakkatel says.
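The study’s actual pipeline is not published here, so the sketch below is only a toy illustration of the general idea of screening thousands of baseline characteristics against an outcome and ranking them. It uses a plain correlation score on invented data; the real study applied machine-learning models to almost 3000 features.

```python
# Toy feature screen: rank candidate features by absolute Pearson correlation
# with a binary outcome. All data and feature names below are invented.

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var_x = sum((x - mx) ** 2 for x in xs)
    var_y = sum((y - my) ** 2 for y in ys)
    return cov / (var_x ** 0.5 * var_y ** 0.5)

def screen(features, outcome):
    """Return feature names sorted by |correlation| with the outcome, strongest first."""
    scores = {name: abs(pearson(vals, outcome)) for name, vals in features.items()}
    return sorted(scores, key=scores.get, reverse=True)

# Hypothetical mini-dataset: six participants, three candidate features.
outcome = [1, 1, 0, 0, 1, 0]  # 1 = outcome present
features = {
    "pill_ever_used": [0, 0, 1, 1, 0, 1],
    "height_cm": [170, 168, 160, 158, 172, 161],
    "tea_intake": [3, 1, 2, 3, 1, 2],  # noise
}
print(screen(features, outcome))  # ['pill_ever_used', 'height_cm', 'tea_intake']
```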

Source: University of South Australia

Review of Research Finds No Link between Sickle Cell Trait and Sudden Death

Expert panel’s findings refute attribution of sudden death to sickle cell trait

Photo by National Cancer Institute on Unsplash

A systematic literature review found no evidence to support that physical exertion without rhabdomyolysis (muscle breakdown) or heat injury can cause sudden death for individuals with sickle cell trait (SCT), nor is there any high-level evidence that SCT causes acute pain crises. These results were published in the American Society of Hematology’s flagship journal, Blood, and informed the Society’s updated position statement on SCT.

“SCT has long been misunderstood, fuelling widespread misinformation and medically inaccurate claims that it can lead to sudden death. This misconception has been especially prominent in cases of Black men with SCT,” said Belinda Avalos, MD, ASH president. “In light of the pervasive, widely publicized, and harmful nature of this myth, the Society aims to further promote accurate information to protect and empower affected communities.”

Individuals with SCT have one copy of the gene associated with sickle cell disease (SCD). SCD is a blood disorder characterised by misshapen blood cells that can cause blockages, leading to infections and episodes of severe pain, often referred to as acute pain crises. Unlike SCD, SCT – which affects over 100 million people worldwide, including 8 to 10% of Black Americans – is not a disease. Individuals with SCT do not go on to develop SCD and generally do not experience any related health complications.  

“To date, this is the most authoritative and definitive systematic review on this subject,” said study author Michael R. DeBaun, MD, MPH, professor of pediatrics and medicine at Vanderbilt University School of Medicine and founder and director of the Vanderbilt-Meharry Sickle Cell Disease Center of Excellence. “This review shows that any primary, secondary, or tertiary cause of death attributable to SCT is not a diagnosis substantiated by the medical evidence.”

ASH convened an expert panel of hematologists and forensic pathologists to systematically review all existing available research to answer two primary questions: 1) Do uncomplicated acute pain crises occur in people with SCT? and 2) Can physical activity above baseline result in sudden death among those individuals?

The experts conducted a multi-database search for English-language studies on SCT and pain crises or mortality, identifying 1474 such citations. Only seven of those studies reported original data, included laboratory testing for SCT in individuals, and addressed the two primary research questions.

Of these studies, none assessed acute pain crises in individuals with SCT compared to those with SCD, and only one described death in individuals reported to have SCT. This study of active-duty U.S. soldiers found only that SCT was associated with a higher risk of heat-related exertional rhabdomyolysis, or muscle breakdown, but not a higher risk of death from any cause. After the implementation of precautions to prevent heat and environmental-related injury in military personnel, the race-adjusted risk of death was no different in individuals with SCT compared to individuals without SCT.

“In the absence of two medical conditions that we are all at risk for, exertional rhabdomyolysis or crush injuries leading to rhabdomyolysis, individuals with SCT are not susceptible to sudden death. Even under these extreme environmental conditions, unexplained sudden death cannot be attributed to SCT,” said Dr. DeBaun. Taken together, these findings demonstrate that “in individuals with SCT, the likelihood of SCT alone or pain crises being the root cause of sudden death is medically impossible,” he added.

While conducting this systematic review, the experts found several studies in which the presence of sickled blood cells at autopsy was cited as evidence of death by acute pain crisis in individuals with SCT. However, the experts did not find any studies that had human data to support this hypothesis, nor any clinical descriptions sufficient to make a diagnosis of an acute pain crisis immediately preceding death.

“Medicine, even in the post-mortem setting, is science,” said corresponding study author Lachelle D. Weeks, MD, PhD, assistant professor of medicine at Harvard Medical School and physician-scientist in the division of population sciences at Dana-Farber Cancer Institute. “Our diagnoses have to make sense and be backed by medical evidence. Given the findings of this study, we owe it to individuals with SCT to ensure that post-mortem examinations check for evidence of rhabdomyolysis and other medical or traumatic causes of death.”

The review had some limitations, most notably a lack of high quality, peer-reviewed direct evidence. To help mitigate this challenge, panel members were encouraged to consider indirect evidence when reviewing abstracts and judged evidence certainty following the GRADE (Grading of Recommendations, Assessment, Development and Evaluation) framework. However, given this paucity of data, the experts hope this review prompts additional SCT research.

Following the results of this study, ASH revised its position statement on SCT, which states that listing “sickle cell crisis” or “sickle cell trait” as a cause of death on an autopsy report for an individual with sickle cell trait is medically inaccurate and without medical evidence of causation. To read the updated statement and learn more about ASH’s advocacy efforts in this area, visit https://hematology.org/advocacy.

Source: American Society of Hematology

Why Antibiotics can Fail Even against Non-resistant Bacteria

Drug-resistant Salmonella. Credit: CDC

Antibiotics are indispensable for treating bacterial infections. But why are they sometimes ineffective, even when the bacteria are not resistant? In their latest study published in the journal Nature, researchers from the University of Basel challenge the conventional view that a small subset of particularly resilient bacteria are responsible for the failure of antibiotic therapies.

In certain infectious diseases caused by bacteria, antibiotics are less effective than expected. One example is infections caused by Salmonella bacteria, which can lead to illnesses such as typhoid fever. For many years, researchers believed that a small subset of dormant bacteria was the main problem in fighting infections. These so-called persisters can survive antibiotic treatment and cause relapses later. Researchers worldwide have been working on new therapies aimed at targeting and eliminating these “sleeping” bacteria.

In a new study, Professor Dirk Bumann’s team from the Biozentrum of the University of Basel challenges the prevailing concept that persisters are the cause of antibiotic ineffectiveness. “Contrary to widespread belief, antibiotic failure is not caused by a small subset of persisters. In fact, the majority of Salmonella in infected tissues are difficult to kill,” explains Bumann. “We have been able to demonstrate that standard laboratory tests of antimicrobial clearance produce misleading results, giving a false impression of a small group of particularly resilient persisters.”

Nutrient starvation increases Salmonella resilience

The researchers investigated antimicrobial clearance in both Salmonella-infected mice and tissue-mimicking laboratory models. The body’s defense mechanisms against bacteria often include reducing the availability of nutrients. The researchers have now shown that this nutrient starvation is, in fact, the main reason Salmonella bacteria survive antibiotic treatment. They assume that the same applies to other bacterial pathogens.

“Under nutrient-scarce conditions, bacteria grow very slowly,” says Bumann. “This may seem good at first, but is actually a problem because most antibiotics only gradually kill slowly growing bacteria.” As a result, the drugs are much less effective, and relapses can occur even after prolonged therapy.

Real-time analyses reveal misconception

The scientists used an innovative method to monitor antibiotic action in single bacteria in real time. “We demonstrated that nearly the entire Salmonella population survives antibiotic treatment for extended periods, not just a small subset of hyper-resilient persisters,” says first author Dr Joseph Fanous.

A major issue with the standard methods used worldwide for decades is their indirect and delayed measurement of bacterial survival, leading to distorted results. “Traditional tests underestimate the number of surviving bacteria,” explains Fanous. “And they falsely suggest the presence of hyper-resilient subsets of persisters that do not actually exist.” This misinterpretation has influenced research for many years.

Novel tools for antibiotics research

These findings could fundamentally change antibiotics research. “Our work underlines the importance of studying bacterial behaviour and antibiotic effects live and under physiologically relevant conditions,” emphasises Bumann. “In a few years, modern methods like real-time single-cell analysis will hopefully become standard.” Shifting the focus from persisters to the impact of nutrient starvation is an important step toward more effective therapies against difficult-to-treat infections.

The project is part of the National Center of Competence in Research (NCCR) “AntiResist”. The research consortium aims to develop innovative strategies to combat bacterial infections. Dirk Bumann is one of the directors of the NCCR “AntiResist”.

Source: University of Basel

Could the Key to IBS Treatment Lie in the Brain?

Irritable bowel syndrome. Credit: Scientific Animations CC4.0

Although irritable bowel syndrome (IBS) affects about a tenth of the global population, its underlying causes and mechanisms remain unclear, and treatments therefore focus on symptom management. At Tokyo University of Science (TUS), Japan, Professor Akiyoshi Saitoh and his research group have spent the past decade exploring this topic. Their latest study, published in the British Journal of Pharmacology, found that a class of drugs called opioid delta-receptor (DOP) agonists may help alleviate IBS symptoms by targeting the central nervous system rather than acting directly on the intestine.

One of the main motivations for this study was the growing evidence closely linking IBS to psychological stress. Saitoh’s group aimed to address this potential root cause by first developing a novel animal model of the condition. In a 2022 study, they created a mouse model repeatedly exposed to psychological stress – using a method called chronic vicarious social defeat stress (cVSDS) – in which the mice developed symptoms similar to a type of IBS called IBS-D: overly active intestines and heightened sensitivity to abdominal pain, even though their organs showed no physical damage. The cVSDS model involves having the subject mouse repeatedly witness a territorial, aggressive mouse defeating a cage mate, inducing indirect chronic stress.

Using the cVSDS model, the researchers sought to determine whether DOP receptors in the brain, which are closely linked to pain and mood regulation, could serve as a promising drug target for treating stress-induced IBS. To achieve this, they performed a series of detailed experiments to observe the effects of DOP agonists on IBS symptoms and chemical signaling in the brain. Some experiments involved measuring the speed of a charcoal meal through the intestine to assess gastrointestinal motility and evaluate the impact of stress or treatments on bowel movement speed, along with directly measuring neurotransmitter concentrations using in vivo brain microdialysis. This revealed that re-exposure to VSDS increased glutamate levels in the insular cortex, but these elevated levels were normalised with DOP agonists.

According to the results, the administration of DOP agonists helped relieve abdominal pain and regulated bowel movements in cVSDS mice. Interestingly, applying the DOP agonists directly to a specific brain region called the insular cortex had similar effects on IBS symptoms as systemic treatment. “Our findings demonstrated that DOP agonists acted directly on the central nervous system to improve diarrhoea-predominant IBS symptoms in mice, and suggest that the mechanism of action involves the regulation of glutamate neurotransmission in the insular cortex,” highlights Saitoh.

Taken together, the continued research by Saitoh’s group on this topic could pave the way for effective treatments for IBS. “DOP agonists could represent a groundbreaking new IBS treatment that not only improves IBS-like symptoms but also provides anti-stress and emotional regulation effects. In the future, we would like to conduct clinical developments with the goal of expanding the indication of DOP agonists for IBS, in addition to depression,” remarks Saitoh.

Compared to currently available IBS treatments, such as laxatives, antidiarrhoeals, analgesics, and antispasmodics, targeting the underlying stress with DOP agonists may offer a more definitive solution with minimal adverse effects. Further clarification of the roles of stress and brain chemistry in the development of IBS will be essential in achieving this much-needed medical breakthrough. If these prospects hold, future studies could translate the group’s findings to humans, bringing relief to those affected by IBS.

Source: Tokyo University of Science

Slow Traffic Pushes Commuters to Choose Fast Food

Photo by Why Kei on Unsplash

Ever notice how much more tempting it is to pick up fast food for dinner after being stuck in traffic? It’s not just you. New research shows that traffic delays significantly increase visits to fast food restaurants, leading to unhealthier eating.

“In our analysis focusing on Los Angeles County, unexpected traffic delays beyond the usual congestion led to a 1% increase in fast food visits. That might not sound like a lot, but it’s equivalent to 1.2 million more fast food visits per year in LA County alone. We describe our results as being modest but meaningful in terms of potential for changing unhealthy food choices,” said study author Becca Taylor, assistant professor at the University of Illinois Urbana-Champaign.
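As a back-of-envelope check (my own arithmetic, not a calculation from the study), the two quoted figures together imply a baseline of roughly 120 million fast food visits per year in LA County, since 1.2 million extra visits correspond to a 1% increase:

```python
# Back-of-envelope: baseline visit count implied by the quoted figures.
extra_visits_per_year = 1.2e6  # additional fast food visits per year in LA County
increase_fraction = 0.01       # the reported 1% increase

implied_baseline = extra_visits_per_year / increase_fraction
print(f"Implied baseline: {implied_baseline:,.0f} fast food visits per year")
```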

Taylor and her co-authors had access to more than two years of daily highway traffic data for Los Angeles, along with data showing how many cell phone users entered fast food restaurants over the same period.

With these data, the team created a computational model showing a causal link between unexpected traffic slow-downs and fast food visits. This pattern held at various time scales, including 24-hour cycles and by the hour throughout a given day. When analysed by the day, traffic delays of just 30 seconds per mile were enough to spike fast-food visits by 1%.

“It might not be intuitive to imagine what a 30-second delay per mile feels like,” Taylor said. “I think of it as the difference between 10 a.m. traffic and 5 p.m. traffic.”

When the researchers broke the day into hour-long segments, they found a significantly greater number of fast food visits when traffic delays hit during the evening rush hour. At the same time, grocery store visits declined slightly.

“If there’s traffic between 5 and 7 p.m., which happens to be right around the evening meal time, we see an increase in fast food visits,” Taylor said.

“Drivers have to make a decision about whether to go home and cook something, stop at the grocery store first, or just get fast food.”

Considering every major city has both traffic and fast food restaurants lining highway feeder roads, it’s not a stretch to extrapolate the pattern beyond Los Angeles.

Taylor and her co-authors say the link between traffic and unhealthy food choices is just one more reason policymakers around the country and the globe should prioritize infrastructure reforms to ease congestion.

“Our results contribute to the literature suggesting time constraints are really important to the food choices people make. Any policies aimed at loosening time constraints – and traffic is essentially lost time – could help battle unhealthy eating,” Taylor said. “That could mean improvements in infrastructure to mitigate traffic congestion, expanding public transport availability, and potentially increasing work from home opportunities.”

Source: University of Illinois College of Agricultural, Consumer and Environmental Sciences