Month: July 2022

Psychological Resilience Aids Faster Return to Walking after Hip Fracture

Carers help an old man to walk
Photo by Kampus Productions on Pexels

In a study of 210 community-dwelling older adults who had surgery following hip fracture, participants who reported feeling high levels of psychological resilience were later able to walk faster and longer than those feeling less resilient. The results, published in the Journal of the American Geriatrics Society, will help the development of interventions such as targeted exercise programmes.

Walking capacity is a critically important outcome following hip fracture: it serves both as a surrogate measure of functional ability and overall physical health and as a predictor of future survival, institutionalisation, social interaction and community engagement. Moreover, poor recovery after hip fracture causes considerable suffering for patients and imposes a financial burden on the social and health care sector. In 29%–50% of cases, older adults with hip fracture do not regain their pre-fracture level of functioning within a year of the fracture. Poorer pre-fracture function, greater cognitive impairment, greater co-morbidity burden, poorer social support and poorer nutrition have been identified as predictors of diminished post-fracture recovery.

Psychological resilience was measured with a questionnaire at the start of the study, and the researchers assessed walking capacity at the start as well as 16 weeks later.

“We believe these results support opportunities to improve walking capacity following hip fracture in older adults by devising multicomponent interventions combining targeted exercise with psychological resilience-enhancing programmes,” said corresponding author Richard H. Fortinsky, PhD, of the University of Connecticut School of Medicine and the UConn Center of Aging.

Source: Wiley

Antibiotics Exposure in Childhood Linked to Later Allergies and Asthma

Young girl sneezing
Photo by Andrea Piacquadio on Unsplash

Early exposure to antibiotics kills healthy bacteria in the digestive tract, possibly leading to asthma and allergies, according to a series of experiments in mouse models.

The experiments, reported in Mucosal Immunology, have provided the strongest evidence so far that the long-observed connection between antibiotic exposure in early childhood and later development of asthma and allergies is causal.

“The practical implication is simple: avoid antibiotic use in young children whenever you can because it may elevate the risk of significant, long-term problems with allergy and/or asthma,” said senior author Martin Blaser at Rutgers University.

In the study, the researchers noted that antibiotics, which are “among the most used medications in children, affect gut microbiome communities and metabolic functions. These changes in microbiota structure can impact host immunity.”

In the first part of the experiment, five-day-old mice received water, azithromycin or amoxicillin. After the mice matured, researchers exposed them to a common allergen derived from house dust mites. Mice that had received either of the antibiotics, especially azithromycin, exhibited elevated rates of immune responses – ie, allergies.

The second and third parts of the experiment tested whether early exposure to antibiotics (but not later exposure) causes allergies and asthma by killing some healthy gut bacteria that support proper immune system development.

Lead author Timothy Borbet first transferred bacteria-rich faecal samples from the first set of mice to a second set of adult mice with no previous exposure to any bacteria or germs. Some received samples from mice given azithromycin or amoxicillin in infancy. Others received normal samples from mice that had received water.

Mice that received antibiotic-altered samples were no more likely than other mice to develop immune responses to house dust mites, just as people who receive antibiotics in adulthood are no more likely to develop asthma or allergies than those who don’t.

Things were different, however, for the next generation. Offspring of mice that received antibiotic-altered samples reacted more to house dust mites than those whose parents received samples unaltered by antibiotics, just as mice that originally received antibiotics as babies reacted more to the allergen than those that received water.

“This was a carefully controlled experiment,” said Blaser. “The only variable in the first part was antibiotic exposure. The only variable in the second two parts was whether the mixture of gut bacteria had been affected by antibiotics. Everything else about the mice was identical.”

Blaser added that “these experiments provide strong evidence that antibiotics cause unwanted immune responses to develop via their effect on gut bacteria, but only if gut bacteria are altered in early childhood.”

Source: Rutgers University

Novel Drug Shown to Repair Damage after Stroke

MRI images of the brain
Photo by Anna Shvets on Pexels

A pioneering new study from the University of Cincinnati shows promise that a new drug may help repair damage caused by strokes. The preclinical study appears in the journal Cell Reports.

Currently, there are no FDA-approved drugs to repair the damage caused by a stroke. The study found that the new drug, NVG-291-R, enables nervous system repair and significant functional recovery in an animal model of severe ischaemic stroke. Deleting the gene for the drug’s molecular target showed a similar effect on neural stem cells. The drug has also proven to be safe and well tolerated in volunteers with multiple sclerosis.

“We are very excited about the data showing significant improvement in motor function, sensory function, spatial learning and memory,” said Agnes (Yu) Luo, PhD, associate professor at UC and the study’s senior author.

Prof Luo said the drug would be a “substantial breakthrough” if the early results translate into clinical settings. Further study and validation of the results by independent groups will be needed to determine whether the drug is similarly effective at repairing the damage of ischaemic strokes in human patients. Additional studies will be needed to establish whether NVG-291-R effectively repairs damage caused by haemorrhagic strokes in both animal models and human patients.

“Most therapies being researched today primarily focus on reducing the early damage from stroke,” Assoc Prof Luo said. “However, our group has focused on neurorepair as an alternative and now has shown that treatment with NVG-291-R not only results in neuroprotection to reduce neuronal death but also robust neuroreparative effects.”

The drug proved to be effective even when treatment began as late as seven days after the stroke’s onset.

“The only current FDA-approved drug for treatment of stroke does not repair damage and must be administered within 4.5 hours of stroke onset,” Luo said. “Most therapies being researched need to be applied within 24–48 hours of a stroke’s onset. A product that works to repair damage from stroke even a week after symptom onset would change the paradigm for stroke treatment.”

Jerry Silver, PhD, co-author of the study and professor of neurosciences at CWRU’s School of Medicine, said the study showed the drug repaired damage in at least two ways: creating new neuronal connections and enhancing migration of new neurons derived from neuronal stem cells to the damage site.

“NVG-291-R’s ability to enhance plasticity was demonstrated by using staining techniques that clearly showed an increase in axonal sprouting to the damaged part of the brain,” Prof Silver said. “This enhanced plasticity is an excellent validation of the same powerful mechanisms that we and other researchers were able to demonstrate using NVG-291-R in spinal cord injury.”

Source: University of Cincinnati

Can a Banana per Day Keep the Oncologist Away?

Photo by Mike Dorner on Unsplash

A trial in people with hereditary colon cancer has shown that daily supplements of resistant starch, equivalent to a slightly green banana per day, had a major preventative effect against many cancers. Published in Cancer Prevention Research, the findings showed that, while bowel cancers were unaffected, the supplement reduced cancers in other parts of the body by more than half.

This effect was particularly pronounced for upper gastrointestinal cancers including oesophageal, gastric, biliary tract, pancreatic and duodenum cancers. What is even more remarkable is that the effects lasted for 10 years after the participants stopped taking the supplements.

The CAPP2 trial involved almost 1000 patients with Lynch syndrome from around the world and revealed that a regular dose of resistant starch, also known as fermentable fibre, taken for an average of two years, cut their risk for many cancers.

The present study is a planned double-blind 10-year follow-up, supplemented with comprehensive national cancer registry data for up to 20 years in 369 of the participants.

A previous study as part of the same trial and published in The Lancet revealed that aspirin reduced cancer of the large bowel by 50%.

“We found that resistant starch reduces a range of cancers by over 60%. The effect was most obvious in the upper part of the gut,” explained Professor John Mathers at Newcastle University. “This is important as cancers of the upper GI tract are difficult to diagnose and often are not caught early on.

“Resistant starch can be taken as a powder supplement and is found naturally in peas, beans, oats and other starchy foods. The dose used in the trial is equivalent to eating a daily banana; before they become too ripe and soft, the starch in bananas resists breakdown and reaches the bowel where it can change the type of bacteria that live there.

“Resistant starch is a type of carbohydrate that isn’t digested in your small intestine; instead it ferments in your large intestine, feeding beneficial gut bacteria – in effect, it acts like dietary fibre in your digestive system. This type of starch has several health benefits and fewer calories than regular starch. We think that resistant starch may reduce cancer development by changing the bacterial metabolism of bile acids, reducing the types of bile acids that can damage our DNA and eventually cause cancer. However, this needs further research.”

Professor Sir John Burn, from Newcastle University and Newcastle Hospitals NHS Foundation Trust who ran the trial with Prof Mathers, said: “When we started the studies over 20 years ago, we thought that people with a genetic predisposition to colon cancer could help us to test whether we could reduce the risk of cancer with either aspirin or resistant starch.

“Patients with Lynch syndrome are at high risk as they are more likely to develop cancers, so finding that aspirin can reduce the risk of large bowel cancers, and resistant starch the risk of other cancers, by half is vitally important.

“Based on our trial, NICE now recommends aspirin for people at high genetic risk of cancer; the benefits are clear – aspirin and resistant starch work.”

Between 1999 and 2005, nearly 1000 participants began taking resistant starch in powder form, aspirin, or a placebo every day for two years.

At the end of the treatment stage, there was no overall difference between those who had taken resistant starch or aspirin and those who had not. However, the research team anticipated a longer-term effect and designed the study for further follow-up.

During follow-up, only five new cases of upper GI cancers were diagnosed among the 463 participants who had taken the resistant starch compared with 21 among the 455 who were on the placebo.

The team are now leading the international trial, CaPP3, involving more than 1800 participants with Lynch syndrome to look at whether smaller, safer doses of aspirin can be used to help reduce the cancer risk.

Source: Newcastle University

Shortage of Blood Test Tubes Prompts Saving Efforts

Blood sample being drawn
Photo by Hush Naidoo Jade Photography on Unsplash

A pandemic-related shortage of a mundane item, ‘blue top’ blood test tubes used to collect blood samples from patients, has caused headaches for health systems worldwide.

But it may also have a silver lining: A lesson in how to reduce unneeded medical tests, whether or not there’s a shortage, according to a new study published in JAMA Internal Medicine.

The shortage gave researchers a chance to see if alerting doctors at the moment they’re placing an order could encourage them to seek a test only when results will immediately affect care.

In the new study, an alert led to a nearly immediate 29% drop in orders for one common test. The reduced level persisted for months.

“This shows that small interventions can make a big difference, and suggests the potential for other types of low-value care to benefit from a similar intervention,” says lead author Madison Breeden, MD, who conducted the study during her year as chief resident of Quality and Patient Safety. She’s already exploring whether the approach might reduce unnecessary antibiotic prescriptions.

Breeden and her colleagues describe what happened in spring 2021 when University of Michigan Health supply chain and pathology experts began worrying about a potential shortage of ‘blue top’ tubes. The pandemic had created very high demand for the chemical the tubes contain: sodium citrate, which stabilises blood samples until a laboratory team can analyse three blood clotting-related properties, called PT, INR and PTT.

After emailing all providers, U-M Health added a ‘best practices alert’ to doctors’ test-ordering electronic system. They could still order PT/INR/PTT tests, but were asked for “thoughtful restraint in reflexive ordering.”

The alert began popping up a month before the FDA issued an official shortage notice and the issue got widespread attention. The shortage continues today and has grown to include other types of tests.

The researchers looked at what happened for six months after the alert began appearing at U-M Health, and compared it with data from six months before.

“There are very important reasons to order this test in some patients, for instance before an operation or when managing certain conditions and treatments,” Breeden explains. “But it may also be part of a standard order set that’s put in during an emergency department visit and continues to be ordered repeatedly after the patient is admitted to the hospital, even though the results won’t change their care.” For such patients, a one-time test might be indicated, but not repeated testing.

Busy doctors entering orders for tests don’t tend to think about the supplies and people power needed to carry out those tests, Breeden notes. In the face of a shortage, or of strong evidence that a test is often over-ordered, an alert could help prioritize the tests for those who need them most.

Canadian experts have actually flagged PT/INR/PTT tests as a target for reducing unnecessary care, through the Choosing Wisely program. So has the American Society for Clinical Laboratory Science, a medical professional group.

Evolution of Lactose Tolerance Driven by Starvation and Disease

Source: Pixabay CC0

The ability to digest lactose is thought to have evolved in concert with milk entering the diet where dairy farming was commonplace, but a new study published in Nature paints a much grimmer picture: starvation and disease appeared to drive its spread through European populations.

Five thousand years ago virtually all humans were (like all other mammals) lactose intolerant, losing the ability to produce lactase to digest milk after weaning and suffering bloating, gas and diarrhoea as adults when they drank milk.

Professor George Davey Smith, Director of the MRC Integrative Epidemiology Unit at the University of Bristol and a co-author of the study, said: “However, a genetic trait called lactase persistence has evolved multiple times over the last 10 000 years and spread in various milk-drinking populations in Europe, central and southern Asia, the Middle East and Africa. Today, around one third of adults in the world are lactase persistent.”

Using archaeological data, genetic samples and computer modelling, the team demonstrated that the lactase persistence genetic trait was not common until around 1000 BCE, nearly 4000 years after it was first detected around 4700–4600 BCE.

“The lactase persistence genetic variant was pushed to high frequency by some sort of turbocharged natural selection. The problem is, such strong natural selection is hard to explain,” added Professor Mark Thomas, study co-author from University College London.

In order to establish how lactose persistence evolved, Professor Richard Evershed, the study’s leader, assembled a database of over 7000 animal fat residues from 13 181 pottery fragments. His findings showed that milk was used extensively in European prehistory, dating from the earliest farming nearly 9000 years ago, but its use waxed and waned in different areas and times.

To understand how this relates to the evolution of lactase persistence, the research team, led by Prof Thomas, tracked the presence of the lactase persistence gene using ancient DNA sequences from more than 1700 prehistoric European and Asian individuals. The variant first appeared around 5000 years ago, reached appreciable frequencies 2000 years later, and is very common today. Next, his team developed a new statistical approach to examine how well changes in milk use through time explain the natural selection for lactase persistence. Surprisingly, no association was found, challenging the long-held view that the extent of milk use drove lactase persistence evolution.

Professor George Davey Smith’s team, probing the UK Biobank data comprising genetic and medical information for more than 300 000 living individuals, found only minimal differences in milk drinking behaviour between genetically lactase persistent and non-persistent people. Critically, the large majority of people who were genetically lactase non-persistent experienced no short- or long-term negative health effects when they consumed milk.

Professor Davey Smith added: “Our findings show milk use was widespread in Europe for at least 9000 years, and healthy humans, even those who are not lactase persistent, could happily consume milk without getting ill. However, drinking milk in lactase non-persistent individuals does lead to a high concentration of lactose in the intestine, which can draw fluid into the colon, and dehydration can result when this is combined with diarrhoeal disease.”

This can have implications for individuals who are unwell, according to Prof Davey Smith. “If you are healthy and lactase non-persistent, and you drink lots of milk, you may experience some discomfort, but you are not going to die of it. However, if you are severely malnourished and have diarrhoea, then you’ve got life-threatening problems. When their crops failed, prehistoric people would have been more likely to consume unfermented high-lactose milk – exactly when they shouldn’t.”

To test these ideas, Prof Thomas’s team applied indicators of past famine and pathogen exposure into their statistical models. Their results clearly supported both explanations – the lactase persistence gene variant was under stronger natural selection when there were indications of more famine and more pathogens.

The authors concluded: “Our study demonstrates how, in later prehistory, as populations and settlement sizes grew, human health would have been increasingly impacted by poor sanitation and increasing diarrhoeal diseases, especially those of animal origin. Under these conditions consuming milk would have resulted in increasing death rates, with individuals lacking lactase persistence being especially vulnerable. This situation would have been further exacerbated under famine conditions, when disease and malnutrition rates are increased. This would lead to individuals who did not carry a copy of the lactase persistence gene variant being more likely to die before or during their reproductive years, which would push the population prevalence of lactase persistence up.

“It seems the same factors that influence human mortality today drove the evolution of this amazing gene through prehistory.”

Source: University of Bristol

SARS-CoV-2 Variants are Evolving to Evade Human Interferons

SARS-CoV-2 infecting a human cell
Infected cell covered with SARS-CoV-2 viruses. Source: NIAID

Researchers have investigated how antiviral proteins called interferons interact with SARS-CoV-2. The study, published in PNAS, focuses on how the innate immune system defends against this coronavirus, which appears to be adapting to evade this interferon response.

The study was the result of a collaborative effort including the laboratories of Mario Santiago, PhD, associate professor of medicine, and Eric Poeschla, MD, professor of medicine, both at the University of Colorado School of Medicine.

While the adaptive arm of the immune system robustly deals with infection by generating antibodies and T cells, the innate arm forms an earlier, first line of defence by recognising conserved molecular patterns in pathogens.

“SARS-CoV-2 just recently crossed the species barrier into humans and continues to adapt to its new host,” said Prof Poeschla. “Much attention has deservedly focused on the virus’s serial evasions of neutralising antibodies. The virus seems to be adapting to evade innate responses as well.”

The type I interferon system is a major player in antiviral defence against all kinds of viruses. Virus-infected cells release type I interferons (IFN-α/β), which warn the body of the intrusion. Secreted interferons cause susceptible cells to express powerful antiviral mechanisms that limit viral growth and spread. The interferon pathway could significantly reduce the levels of virus initially produced by an infected individual.

“They are clinically viable therapeutic agents that have been studied for viruses like HIV-1 for years,” explained Prof Santiago. “Here we looked at up to 17 different human interferons and found that some interferons, such as IFN-α8, more strongly inhibited SARS-CoV-2. Importantly, later variants of the virus have developed significant resistance to their antiviral effects. For example, substantially more interferon would be needed to inhibit the omicron variant than the strains isolated during the earliest days of the pandemic.”

The data suggest that COVID clinical trials of interferons, dozens of which are registered, may need to be interpreted in light of which variants were circulating when each study was conducted. The researchers say that future work to decipher which of SARS-CoV-2’s multitude of proteins might be evolving to confer interferon resistance could help in that direction.

Source: University of Colorado Anschutz Medical Campus

Clinical Trial of VR App Effective in Reducing Phobia Symptoms

Photo by JESHOOTS.COM on Unsplash

A New Zealand trial of a smartphone app combining virtual reality (VR) with cognitive behavioural therapy (CBT) showed a 75% reduction in phobia symptoms after six weeks of the treatment programme. The results, published in the Australian and New Zealand Journal of Psychiatry, suggest an easily available treatment for the nearly one in 12 people who suffer common phobias such as fear of heights or spiders.

The trial, led by Associate Professor Cameron Lacey, from the Department of Psychological Medicine, involved phobia patients using a smartphone app treatment programme called ‘oVRcome’, which combines VR 360-degree video exposure therapy using a headset alongside more traditional CBT. The company provides a simple headset into which users insert their own smartphone, turning it into a display.

“The improvements they reported suggests there’s great potential for the use of VR and mobile phone apps as a means of self-guided treatment for people struggling with often-crippling phobias,” Associate Professor Lacey says.

“Participants demonstrated a strong acceptability of the app, highlighting its potential for delivering easily accessible, cost-effective treatment at scale, of particular use for those unable to access in-person exposure therapy to treat their phobias.”

A total of 129 people took part in the six-week randomised, controlled trial, over May–December 2021, with a 12-week follow-up. Participants needed to be aged between 18 and 64 years and have a fear of flying, heights, needles, spiders or dogs. Weekly questionnaires were emailed to record their progress, with access made available to a psychologist for any adverse effects.

For all phobias, participants showed comparable improvements in the Severity Measures for Specific Phobia scale. The average severity score decreased from 28/40 (moderate to severe symptoms) to 7/40 (minimal symptoms) after six weeks. There were no participant withdrawals due to intervention-related adverse events.

“The oVRcome app involves what’s called ‘exposure therapy’, a form of CBT exposing participants to their specific phobias in short bursts, to build up their tolerance to the phobia in a clinically-approved and controlled way,” Assoc Professor Lacey explained.

“Some participants reported significant progress in overcoming their phobias after the trial period, with one feeling confident enough to now book an overseas family holiday, another lining up for a COVID vaccine and another reporting they now felt confident not only knowing there was a spider in the house but that they could possibly remove it themselves.”

The programme used standard CBT components including psychoeducation, relaxation, mindfulness, cognitive techniques, exposure through VR, and a relapse prevention model. Participants were able to select their own exposure levels to their particular phobia from a large library of VR videos.

“This means the levels of exposure therapy could be tailored to an individual’s needs, which is a particular strength. The more traditional in-person exposure treatments for specific phobias have a notoriously high dropout rate due to discomfort, inconvenience and a lack of motivation in people seeking out fears to expose themselves to. With this VR app treatment, triallists had increased control in exposure to their fears, as well as control over when and where exposure occurs,” said Assoc Professor Lacey.

The cost-effective availability of the app and headsets and the fact that multiple phobias were tested at once made this a novel trial, the researchers said. Most comparative VR studies to date have investigated high-end VR devices which are only available in research and limited clinical settings. One Dutch study examined a low-cost VR Dutch-language program using animated imagery that demonstrated improvement in fear-of-height symptoms, however this study only examined a single type of specific phobia.

Associate Professor Lacey says public demand to take part in the trial was unprecedented, demonstrating the increasing need and desire for phobia treatment in the community.

Source: University of Otago

Metastasis and Atherosclerosis Share an Underlying Mechanism

Source: Wikimedia CC0

Researchers have identified a key signalling molecule for cancer metastasis, one which is already known for its involvement in atherosclerosis, suggesting a possible approach for treating both diseases simultaneously. The discovery was published in the International Journal of Cancer.

To become a malignant, metastasising cancer, tumour cells undergo a series of transformations involving interactions with the immune system. Growing evidence suggests that inflammation of blood vessel-lining endothelial cells is a key process in tumour progression to metastasis.

A team of researchers led by Professor Kyoko Hida at Hokkaido University have discovered that, in malignant tumours, endothelial cells accumulate low-density lipoprotein (LDL) and neutrophils. Neutrophils are immune suppressor cells which are known to contribute to tumour progression.

Previous work by the team had revealed that blood vessels in malignant tumours express a high level of proteoglycans, and it is known that cancerous tissue is inflamed – similar to what is seen in atherosclerosis.

The research team showed that metastasising tumours, in contrast to non-metastasising ones, accumulate proteoglycan molecules; these, in turn, attach LDL to the walls of blood vessels, where it becomes oxidised. The blood vessel-lining endothelial cells of metastasising tumours also express high levels of its receptor, LOX-1. This, they found, causes these cells to produce inflammation signals that attract neutrophils. Using a mouse model, they showed that suppression of LOX-1 can significantly reduce tumour malignancy, and that LOX-1 overexpression causes an increase in the signalling molecules that attract neutrophils.

This sequence of interactions observed in malignant tumours is not novel: it occurs in atherosclerosis. “Atherosclerosis and cancer appear to be completely different diseases, but they share several common pathophysiological features in the blood vessels,” said Prof Hida.

Though some questions remain, especially on the mechanism by which neutrophils contribute to cancer malignancy, this study is the first to explicitly demonstrate the mechanistic commonalities between cardiovascular disease and cancer progression and to trace the mechanism involving LDL accumulation and LOX-1 expression in in vivo tumour tissue.

“Our present study focused on the importance of LOX-1 in endothelial cells as a common factor between cancer and atherosclerosis,” Prof Hida explained. “The presence of neutrophils in tumours is a telltale sign of tumour progression.”

The study also points to a promising approach for treating and preventing malignant cancer (and cardiovascular disease) by targeting neutrophil recruitment to endothelial cells. Prof Hida concluded: “The number of patients with cancer who die not of cancer, but of cardiovascular events, is increasing. Targeting the LOX-1/oxidised LDL axis might be a promising strategy for the treatment of the two diseases concomitantly.”

Source: Hokkaido University