Year: 2023

For Stroke Recovery, Deep Brain Stimulation may Aid Rehabilitation

Deep brain stimulation illustration. Credit: NIH

A first-in-human trial of deep brain stimulation (DBS) for post-stroke rehabilitation patients has shown that using DBS to target the dentate nucleus – which regulates fine-control of voluntary movements, cognition, language, and sensory functions in the brain – is safe and feasible.

The EDEN trial (Electrical Stimulation of the Dentate Nucleus for Upper Extremity Hemiparesis Due to Ischemic Stroke) also shows that the majority of participants (9 of 12) demonstrated improvements in both motor impairment and function. Importantly, the study found that participants with at least minimal preservation of distal motor function at enrolment showed gains that almost tripled their initial scores.

Published in Nature Medicine, these findings build on more than a decade of preclinical work led by principal investigators Andre Machado, MD, PhD, and Kenneth Baker, PhD, at Cleveland Clinic.

“These findings are reassuring for patients, as the participants in the study had been disabled for more than a year and, in some cases, three years after stroke. This gives us a potential opportunity for much needed improvements in rehabilitation in the chronic phases of stroke recovery,” said Dr Machado, who patented the DBS method used in stroke recovery. “The quality-of-life implications for study participants who responded to therapy have been significant.”

“We saw patients in the study regain levels of function and independence they did not have before enrolling in the research,” Dr Machado said. “This was a smaller study and we look forward to expanding as we have begun the next phase.”

The completed EDEN trial enrolled 12 individuals with chronic, moderate-to-severe hemiparesis of the upper extremity as a result of a unilateral middle cerebral artery stroke 12-to-36 months prior. There were no major complications throughout the study. Nine of the 12 participants improved to a degree that is considered meaningful in stroke rehabilitation.

Source: Cleveland Clinic

Why Blood Vessel Linings go Wrong and Contribute to Plaque Growth

Source: Wikimedia CC0

University of Virginia Health researchers probing the causes of coronary artery disease have identified why the blood vessel linings that usually secure plaques, stopping them from breaking loose, sometimes instead contribute to plaque buildup. The discovery, published in Circulation: Genomic and Precision Medicine, provides new targets for scientists looking for better ways to treat and prevent the disease.

“Smooth muscle cells that make up the bulk of our blood vessels play important roles in coronary artery disease. They undergo pathological transformations as the disease develops inside our arteries,” said researcher Mete Civelek, of the University of Virginia School of Medicine’s Center for Public Health Genomics and the Department of Biomedical Engineering.

“Our results point to a previously underappreciated role for metabolic pathways during this pathological transformation,” he said.

Civelek and his team wanted to unravel a longstanding mystery about the behaviour of smooth muscle cells during plaque formation. These cells, which line blood vessels, protect the body during plaque formation by building stabilising caps over the plaque that prevent the lesions from breaking loose and causing strokes.

But scientists believe that sometimes smooth muscle cells instead begin to accelerate plaque development and spur the progression of the disease.

Civelek’s new discovery helps explain why. Noah Perry, a doctoral student on Civelek’s team, analysed smooth muscle cells collected from 151 heart transplant donors and used a sophisticated approach to identify genes responsible for the smooth muscle cells’ behaviour.

After initially identifying 86 groups of genes, the researchers focused on 18 groups that could explain the mysterious behaviour. Their analysis suggested that the smooth muscle cells’ shift might stem from problems with how the cells use nitrogen and glycogen.

The researchers identified a particular sugar, mannose, that may be contributing to the problems, potentially even triggering them. But determining that, the scientists say, will require more research.

“The metabolic shift in the cells as they transition to a disease state can point to targets for intervention and therapy,” said Perry, of UVA’s Department of Biomedical Engineering, the lead author of the study.

By better understanding what triggers the smooth muscle cells to become harmful, Civelek says, doctors may be able to develop ways to prevent that from happening. That could open the door to new ways to treat and prevent coronary artery disease.

“Coronary artery disease is still the leading cause of death worldwide,” Civelek said. “Although cholesterol-lowering therapies and blood pressure control have been very effective tools to prevent deaths from heart attacks, we still need more targets to reduce the suffering of patients and their families from this devastating disease.”

Source: University of Virginia

Study Tries Roasting out the Toxicants in Coffee

Photo by Mike Kenneally on Unsplash

Coffee is one of the world’s most popular beverages, but it also carries potential health concerns, one of which is the production of foodborne toxicants such as acrylamide and furan during the roasting process. A study published in Beverage Plant Research investigated ways to mitigate both contaminants by changing roasting parameters, including special roasting procedures.

The study analysed a Vietnam Robusta grade 2 and an unwashed Brazil Arabica coffee across different roasting profiles (tangential, drum and hot air roasting) and roast degrees (light, medium and dark). The researchers measured the acrylamide and furan derivative content of the samples by GC-MS. They found that acrylamide contents were highest in light roasts, whereas furan and methylfuran contents were low in light roasts for both the Robusta and Arabica samples.

The study also explored the impact of special roasts, such as a double roast or roasting with a sudden temperature change, on acrylamide and furan content; these special roasts had no significant effect on either contaminant.

In conclusion, both the type of coffee and its roasting profile have a substantial impact on the levels of acrylamide and furan, highlighting the possibility of regulating these toxicants through controlled roasting. However, because acrylamide is highest in light roasts while furan and methylfurans are low there, simultaneous mitigation of both toxicants by adjusting roasting parameters alone appears impossible. This study holds significant implications for the future of coffee production, potentially paving the way for safer and healthier consumption practices.

Source: EurekAlert!

Optimum Heart Rate for Fat Burning can Vary Widely among Individuals

The study uncovered individual variations in fat burning during exercise. Graphs of two people’s fat burning curves highlight differences in fat burning rates at varying exercise intensities and demonstrate that FATmax falls outside the predicted ‘fat burning zone’. Credit: Hannah Kittrell, Mount Sinai Physiolab and AIMS Lab at Icahn Mount Sinai

The ‘fat burning zone’ on commercial exercise machines often does not line up with the best heart rate for burning fat, which differs for each individual, Icahn School of Medicine at Mount Sinai researchers report.

Instead, the researchers said, clinical exercise testing (a diagnostic procedure to measure a person’s physiological response to exercise) may be a more useful tool to help individuals achieve intended fat loss goals. The study, which used a machine learning-based modelling approach, was published online in Nutrition, Metabolism and Cardiovascular Disease.

“People with a goal of weight or fat loss may be interested in exercising at the intensity which allows for the maximal rate of fat burning. Most commercial exercise machines offer a ‘fat-burning zone’ option, depending upon age, sex, and heart rate,” says lead author Hannah Kittrell, MS, RD, CDN, a PhD candidate at Icahn Mount Sinai. “However, the typically recommended fat-burning zone has not been validated, thus individuals may be exercising at intensities that are not aligned with their personalised weight loss goals.”

The term FATmax is sometimes used to represent the exercise intensity and associated heart rate at which the body reaches its highest fat-burning rate during aerobic exercise. At this point, fat is a significant fuel source and therefore this intensity may be of interest to those seeking to optimize fat loss during workouts.

As part of the study, the researchers compared heart rate at FATmax, as measured during a clinical exercise test, to predicted heart rate at percentages of maximal effort within the typically recommended ‘fat-burning zone’. In a sample of 26 individuals, the researchers found that there was poor agreement between measured and predicted heart rate, with a mean difference of 23 beats per minute between the two measures. This suggests that general recommendations for a ‘fat-burning zone’ may not provide accurate guidance.
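The size of that gap is easy to appreciate with a back-of-envelope calculation. The sketch below is illustrative only, not the study’s method: it uses the common “220 minus age” rule of thumb for maximal heart rate and a generic 60-70%-of-maximum ‘fat-burning zone’, and the age and FATmax heart rate are hypothetical.

```python
# Illustrative sketch only (not the study's code). Assumes the common
# "220 minus age" rule of thumb for maximal heart rate and a generic
# 60-70%-of-maximum 'fat-burning zone'; real machines and protocols vary.

def predicted_fat_burning_zone(age, lower=0.60, upper=0.70):
    """Heart-rate bounds (bpm) of the generic 'fat-burning zone'."""
    max_hr = 220 - age  # widely used, but only a population-level estimate
    return (round(max_hr * lower), round(max_hr * upper))

def zone_error(measured_fatmax_hr, age):
    """Beats per minute by which a measured FATmax misses the zone."""
    low, high = predicted_fat_burning_zone(age)
    if measured_fatmax_hr < low:
        return low - measured_fatmax_hr
    if measured_fatmax_hr > high:
        return measured_fatmax_hr - high
    return 0

# Hypothetical 40-year-old whose clinical test put FATmax at 102 bpm:
print(predicted_fat_burning_zone(40))  # (108, 126)
print(zone_error(102, 40))             # 6 bpm below the zone's lower bound
```

A 23 bpm average discrepancy, as the study reports, would leave many individuals exercising well outside any such generic zone.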

Next, the researchers plan to study whether individuals who receive a more personalised exercise prescription demonstrate more weight and fat loss, as well as improvement of metabolic health markers that identify health risks like type 2 diabetes, obesity, and heart disease.

“We hope that this work will inspire more individuals and trainers to utilise clinical exercise testing to prescribe personalised exercise routines tailored to fat loss. It also emphasises the role that data-driven approaches can have toward precision exercise,” says senior author Girish Nadkarni, MD, MPH, Professor of Medicine at Icahn Mount Sinai.

Source: The Mount Sinai Hospital / Mount Sinai School of Medicine

In a First, Immunotherapy for Glioblastoma Successfully Tested in Mice

Photo by Kanashi ZD on Unsplash

Immunotherapy has dramatically improved survival against many cancers but efforts to use it against glioblastomas have to date proven fruitless. Now, Salk scientists have found the immunotherapy treatment anti-CTLA-4 leads to considerably greater survival of mice with glioblastoma. Furthermore, they discovered that this therapy was dependent on immune cells called CD4+ T cells infiltrating the brain and triggering the tumour-destructive activities of other immune cells called microglia, which permanently reside in the brain.

The findings, published in the journal Immunity, show the benefit of harnessing the body’s own immune cells to fight brain cancer and could lead to more effective immunotherapies for treating brain cancer in humans.

Glioblastoma, the most common and deadly form of brain cancer, grows rapidly to invade and destroy healthy brain tissue. The tumour sends out cancerous tendrils into the brain that make surgical tumour removal extremely difficult or impossible.

“There are currently no effective treatments for glioblastoma – a diagnosis today is basically a death sentence,” says Professor Susan Kaech, senior author and director of the NOMIS Center for Immunobiology and Microbial Pathogenesis. “We’re extremely excited to find an immunotherapy regimen that uses the mouse’s own immune cells to fight the brain cancer and leads to considerable shrinkage, and in some cases elimination, of the tumour.”

For some tumours, immunotherapy can be used, in which the body’s own immune cells are directed to seek and destroy cancer cells, leading to strong, lasting anti-cancer responses for many patients. Kaech sought new ways of harnessing the immune system to develop safer and more durable treatments for brain cancer.

Her team found three cancer-fighting tools that have been somewhat overlooked in brain cancer research that may cooperate and effectively attack glioblastoma: an immunotherapy drug called anti-CTLA-4 and specialized immune cells called CD4+ T cells and microglia.

Anti-CTLA-4 immunotherapy works by blocking the CTLA-4 protein, which, if not blocked, inhibits T cell activity. It was the first immunotherapy drug designed to stimulate our immune system to fight cancer, but it was quickly followed by another, anti-PD-1, that was less toxic and became more widely used. Whether anti-CTLA-4 is an effective treatment for glioblastoma has remained unknown, since anti-PD-1 took precedence in clinical trials. Unfortunately, anti-PD-1 proved ineffective in multiple clinical trials for glioblastoma – a failure that inspired Kaech to see whether anti-CTLA-4 would be any different.

As for the specialized immune cells, CD4+ T cells are often overlooked in cancer research in favour of a similar immune cell, the CD8+ T cell, because CD8+ T cells are known to directly kill cancer cells. Microglia live in the brain full time, where they patrol for invaders and respond to damage – whether they play any role in tumour death was not clear. When treated with anti-CTLA-4, mice with glioblastoma had longer lifespans than those receiving anti-PD-1.

Upon investigation, they found that after anti-CTLA-4 treatment, CD4+ T cells secreted a protein called interferon gamma that caused the tumour to throw up “stress flags” while simultaneously alerting microglia to start eating up those stressed tumour cells. As they gobbled up the tumour cells, the microglia would present scraps of tumour on their surface to keep the CD4+ T cells attentive and producing more interferon gamma, creating a cycle that lasts until the tumour is destroyed.

“Our study demonstrates the promise of anti-CTLA-4 and outlines a novel process where CD4+ T cells and other brain-resident immune cells team up to kill cancerous cells,” says co-first author Dan Chen, a postdoctoral researcher in Kaech’s lab.

To understand the role of microglia in this cycle, the researchers collaborated with co-author and Salk Professor Greg Lemke. For decades, Lemke has investigated critical molecules, called TAM receptors, used by microglia to send and receive crucial messages. The researchers found that TAM receptors told microglia to gobble up cancer cells in this novel cycle.

“We were stunned by this novel codependency between microglia and CD4+ T cells,” says co-first author Siva Karthik Varanasi, a postdoctoral researcher in Kaech’s lab. “We are already excited about so many new biological questions and therapeutic solutions that could radically change treatment for deadly cancers like glioblastoma.”

Connecting the pieces of this cancer-killing puzzle brings researchers closer than ever to understanding and treating glioblastoma.

“We can now reimagine glioblastoma treatment by trying to turn the local microglia that surround brain tumours into tumour killers,” says Kaech. “Developing a partnership between CD4+ T cells and microglia is creating a new type of productive immune response that we have not previously known about.”

Next, the researchers will examine whether this cancer-killing cell cycle is present in human glioblastoma cases. Additionally, they aim to look at other animal models with differing glioblastoma subtypes, expanding their understanding of the disease and optimal treatments.

Source: Salk Institute

Study Resolves Long-standing Question on Gating of Ion Channels

Source: CC0

Ion channels play a crucial role in many cellular processes, including neuronal communication, muscle contraction and cell proliferation. Most multi-subunit ion channels exist in two functional states, either closed or open. During gating, one would expect each subunit to undergo its own conformational change – yet no intermediate conduction levels are observed. To find out why, researchers from the University of Vienna and Washington University in St. Louis created a smart model system. The study is published in Nature Communications.

Ion channels are membrane proteins that regulate the electrical activity of cells. In this study the scientific team investigated the inwardly rectifying potassium channel Kir2. This channel is crucial for maintaining a negative membrane potential in many cells. These channels are promising drug targets for treatment of cardiovascular diseases. To foster drug development, a detailed understanding of the gating mechanism is important.

Intelligent model system & innovative methods

“We designed a model system that allowed us to visualise the gating of individual subunits and track conductance changes,” explains Grigory Maksaev from Washington University in St. Louis. “We introduced an acidic residue near the channel gate. This led to novel states, so-called sub-conductance states,” explains Eva Plessl from the Department of Pharmaceutical Sciences, University of Vienna. The lifetimes of these sub-states were long enough to resolve them experimentally, and each observed sub-state represents a distinct subunit conformation. Interestingly, the sub-state occupancy is titratable by pH. “This suggests that protonation or deprotonation of individual acidic residues causes this phenomenon,” explains Sun-Joo Lee from Washington University in St. Louis.

Sour is…less conductive

“Molecular dynamics simulations with different protonation states of the acidic residue support this finding,” explains Anna Weinzinger from the Department of Pharmaceutical Sciences, University of Vienna. The study reveals that each subunit gating transition leads to a change in conductance level, suggesting that for a fully open channel, all subunits must move together. “By designing a smart model system, we have answered a long-standing question about ion channel gating,” explains Colin Nichols from Washington University in St. Louis.
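A toy calculation illustrates why sub-state occupancy would be pH-titratable if each subunit of the tetrameric Kir2 channel carries one introduced acidic residue that protonates independently. This is a sketch under simplifying assumptions (independent subunits, a single illustrative pKa of 4.0), not the authors’ model:

```python
from math import comb

def frac_deprotonated(pH, pKa=4.0):
    """Henderson-Hasselbalch: fraction of acidic residues deprotonated."""
    return 1.0 / (1.0 + 10 ** (pKa - pH))

def substate_occupancy(pH, n_subunits=4, pKa=4.0):
    """Binomial probability that k of n subunits are deprotonated,
    for k = 0..n; each k would correspond to one sub-conductance state."""
    p = frac_deprotonated(pH, pKa)
    return [comb(n_subunits, k) * p**k * (1 - p)**(n_subunits - k)
            for k in range(n_subunits + 1)]

# At pH == pKa, half the residues are deprotonated, so the middle
# sub-state (2 of 4 subunits) is the most occupied:
print([round(x, 4) for x in substate_occupancy(pH=4.0)])
# [0.0625, 0.25, 0.375, 0.25, 0.0625]
```

Shifting the pH away from the pKa pushes the occupancy toward the fully protonated or fully deprotonated state, which matches the qualitative titration behaviour the study reports for its engineered sub-states.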

Source: University of Vienna

Texting While Walking Increases the Risk of Slipping and Falling

Photo by Azat Satlykov on Unsplash

People are increasingly glued to their smartphones, texting even as they walk, which has inspired a wide range of studies: some have shown that texters can multitask and navigate around obstacles, while others have shown that they are more likely to walk into traffic. But how likely are they to avoid a fall if they slip? University of New South Wales (UNSW) researchers investigated this by simulating an environment with random slipping threats, and reported in the journal Heliyon that texting increases the risk of falling in response to walkway hazards.

“On any day it seems as many as 80% of people, both younger and older, may be head down and texting. I wondered: is this safe?” says senior author Matthew A. Brodie, a neuroscientist and engineer at the UNSW Graduate School of Biomedical Engineering. “This has made me want to investigate the dangers of texting while walking. I wanted to know if these dangers are real or imagined and to measure the risk in a repeatable way.”

The team recruited 50 UNSW undergraduate students from Brodie’s “Mechanics of the Human Body” course for the experiment. Brodie and co-author Yoshiro Okubo built a tiled hazard walkway at Neuroscience Research Australia’s gait laboratory; halfway along, one tile could be adjusted to slide out of place and cause a person stepping on it to slip. Students wore a safety harness and sensors that collected their motion data, and were then asked to go along the walkway either without texting or while typing “The quick brown fox jumps over the lazy dog.”

To better simulate the uncertainty of real life, students were only told that they may or may not slip. This allowed the researchers to study how texting pedestrians might anticipate and try to prevent a potential slip, such as by leaning forward.

“What surprised me is how differently people responded to the threat of slipping,” says Brodie. “Some slowed down and took a more cautious approach. Others sped up in anticipation of slipping. Such different approaches reinforce how no two people are the same, and to better prevent accidents from texting while walking, multiple strategies may be needed.”

Although motion data showed that texting participants tried to be more cautious in response to a threat, this caution did not counteract their risk of falling. When participants went from leaning forwards (such as over a phone) to slipping backwards, their motion sensors showed an increase in the range of their ‘trunk angle’. The researchers used this measure to assess whether texting made students more likely to fall, and found that the average trunk angle range during a fall increased significantly when a student was texting.

Walking also reduced the texters’ accuracy. The highest texting accuracy occurred when participants were seated, and accuracy decreased when walking participants were merely cautioned about a potential slip that did not occur. The lowest accuracy, however, occurred when participants did slip.

The researchers note that young people may be more likely to take risks even if they are aware that texting and walking could increase their likelihood of falling. For that reason, the authors suggest that educational initiatives such as signs might be less effective in reaching this population. In addition to education, the researchers also suggest that phones could implement locking technology similar to what is used when users are driving. The technology could detect walking activity and activate a screen lock to prevent texting during that time. In future research, the team plans on looking into the effectiveness of this intervention.

Source: Science Daily

Genetic Mechanism Increases Resistance to the Antibiotic Albicidin by 1000-fold

Photo by Sangharsh Lohakare on Unsplash

A new analysis shows that infectious bacteria exposed to the promising antibiotic albicidin rapidly develop up to a 1000-fold increase in resistance via a gene amplification mechanism. Mareike Saathoff of Freie Universität Berlin, Germany, and colleagues presented these findings in the open access journal PLOS Biology.

Bacterial resistance to antibiotics is a growing problem associated with millions of deaths around the world every year. Understanding how bacteria evolve resistance is key to developing more effective antibiotics and strategies for using them.

In recent years, albicidin has emerged as a promising antibiotic capable of killing a wide range of bacterial species by disrupting their DNA replication. Researchers are working to develop new albicidin-based medications; yet, despite its promise, some bacteria are able to develop resistance to albicidin.

To further investigate albicidin resistance mechanisms, Saathoff and colleagues conducted a suite of experiments employing a broad set of tools, including RNA sequencing, protein analysis, X-ray crystallography, and molecular modelling. They found that two bacteria often associated with human infection, Salmonella typhimurium and Escherichia coli, develop resistance to albicidin when exposed to increasing concentrations of the compound. Their analysis traced this resistance to an increase in the number of copies of a gene known as STM3175 (YgiV) in the bacterial cells, which is amplified in each new generation of cells as they multiply. STM3175 encodes a protein that interacts with albicidin in a way that protects the bacteria from it.
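As a rough intuition for how copy-number amplification can reach a 1000-fold change quickly, consider a toy model in which resistance scales with gene copy number and each passage under drug selection multiplies the amplified region. Both the doubling-per-passage rate and the linear copies-to-resistance mapping are illustrative simplifications, not measurements from the paper:

```python
# Toy model only: assumes resistance scales linearly with copies of a
# protective gene (like STM3175) and that each passage under selection
# doubles the copy number. Real amplification dynamics are messier.

def passages_to_resist(target_fold, amplification_per_passage=2):
    """Number of selection passages needed to reach target_fold
    resistance, plus the copy number actually attained."""
    copies, passages = 1, 0
    while copies < target_fold:
        copies *= amplification_per_passage
        passages += 1
    return passages, copies

# Ten doublings already exceed 1000-fold:
print(passages_to_resist(1000))  # (10, 1024)
```

The point of the sketch is only that exponential amplification makes very large fold-changes reachable in few generations, consistent with the rapid resistance gains the study observed.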

Further experiments showed that the same albicidin-resistance mechanism is widespread among both pathogenic and harmless bacteria, including the microbes Vibrio vulnificus, which can infect wounds, and Pseudomonas aeruginosa, which can cause pneumonia and other infections. These findings could help inform the ongoing development of albicidin-based antibiotic strategies.

The authors add, “Our study reveals a gene duplication and amplification-based mechanism of a transcriptional regulator in Gram-negative bacteria, that mediates resistance to the peptide antibiotic albicidin.”

Source: Science Daily

Nose-picking Healthcare Workers Were More Likely to Get COVID

Photo by Ketut Subiyanto on Pexels

A study of healthcare workers (HCW) found that those who picked their nose were more likely to get COVID than the people who refrained from such explorations. The Dutch researchers published their probing results in the journal PLOS One.

In the early stages of the COVID pandemic, researchers noted a wide range of efforts to prevent the spread of SARS-CoV-2, such as the wearing of personal protective equipment and maintaining social distancing, especially in the hospital setting. Much research went into the impacts of, eg, wearing glasses on the effectiveness of masking, but little if any attention was paid to a widespread but secretive habit.

Sikkens and colleagues retrospectively surveyed healthcare workers at Amsterdam University Medical Centers in December 2021 about their behaviours during the first and second waves of the pandemic. These responses were matched against prospectively collected COVID test results at the hospitals from March to October 2020. The nose pickers were nearly three times more likely to catch COVID (17.3% vs 5.9%) than those who refrained at all costs. The survey also yielded surprising results about how many HCWs owned up to the habit.

Secret nose pickers can take some comfort in that 85% of the cohort admitted that they picked their nose either daily, weekly, or monthly, and nose pickers tended to be younger. More men picked their nose (90%) than women (83%), and doctors were the most likely to be among the nose-picking offenders: 100% of residents admitted to it, along with 91% of specialists.

Sikkens et al. noted that one limitation of the study was that nose pickers were not asked about “the depth of penetration and eating of boogers”.

Other behaviours, such as nail biting and having a beard, were not associated with COVID infection, nor was wearing glasses, though glasses showed a relevant trend.

Frequency of nose picking did not appear to be linked with any difference in COVID infection risk, with positive cases in 27% of those who reported monthly picking, 35% among weekly pickers, and 32% of daily pickers. No participants reported picking their nose every hour, thankfully.

One-third of the cohort reported nail biting, two-thirds wore glasses, and 31% of the men had beards.

A study strength was that SARS-CoV-2 positivity was determined by prospective longitudinal serological sampling, though this may not be generalisable to the current era of vaccines and circulating Omicron variants. The retrospective nature of the survey may have introduced recall bias.

Sikkens et al. noted that it is surprising that SARS-CoV-2 transmission routes had been so thoroughly researched, yet simple behaviours had been overlooked. “Possibly this sensitive subject is still taboo in the health care profession. It is commendable we assume HCWs to not portray bad habits, yet we too are only human after all, as illustrated by the pivotal proportion of nose pickers in our cohort (84.5%).”

Inflammation Impedes the Development of Malaria Parasites

Photo by Ekamelev on Unsplash

Researchers have found that inflammation can slow down the development of malaria parasites in the bloodstream, which may lead to a new strategy for preventing or limiting severe disease.

The malaria-causing Plasmodium parasites invade and multiply within red blood cells. Studies have shown that the parasites can rapidly sense and respond to conditions within the host, even syncing intimately with the host’s internal body clock. While it is known that the body’s nutrient levels and daily circadian rhythms affect the parasites’ development, little was known about the impact of host inflammation on the parasites.

This animal-model study, led by the Peter Doherty Institute for Infection and Immunity (Doherty Institute) and the Kirby Institute and published in the journal mBio, reveals that when the body’s immune system responds to inflammation it alters the plasma’s chemical composition, directly impeding the maturation of the Plasmodium parasites in the bloodstream.

University of Melbourne’s Associate Professor Ashraful Haque, a senior author of the paper, said this work highlights the captivating dynamic of the host-parasite relationship.

“First, we discovered that inflammation in the body prevented the early stage of the parasites from maturing. We also noticed that inflammation triggered significant changes in the composition of the plasma – we were actually quite surprised by the magnitude of these changes,” said Associate Professor Haque.

“As we dug deeper, we found substances in the altered plasma that, we believe, are what may inhibit parasite growth in the body. This work reveals a new mechanism that slows down the malaria parasite’s development in the bloodstream. Our research was done using animal models, so it would be really interesting to study if such inhibitory mechanisms occur in humans too.”

Dr David Khoury, co-senior author of the paper, said the scientists found a remarkable response by the parasites to the changes in their environment.

“Parasites residing in red blood cells rapidly sense and respond to their new environment, showing fascinating adaptability. Using cutting-edge genome sequencing technology, we observed that even after just four hours in this changed plasma, the parasites adjusted their genetic and protein activity, resulting in slower maturation within red blood cells. It’s almost like the parasites actively sense an inhospitable host environment, and as a result trigger a coping mechanism,” said Dr Khoury.

“We believe this is the first study to show that inflammation can change how individual parasites behave genetically in the body.”

Professor Miles Davenport, co-senior author of the paper, said this work on the interaction between systemic host inflammation and malaria parasite maturation offers several potential benefits.

“This study, while based on animal models, broadens our understanding of malaria. It provides a foundation for further investigations into the specific mechanisms involved in the modulation of parasite maturation by inflammation, and opens avenues for future studies to explore the identified inhibitory factors, genetic changes and their implications for malaria development,” said Professor Davenport.

Source: The Peter Doherty Institute for Infection and Immunity