Auditory hallucinations are likely the result of abnormalities in two brain processes: a “broken” corollary discharge that fails to suppress self-generated sounds, and a “noisy” efference copy that makes the brain hear these sounds more intensely than it should. That is the conclusion of a new study published October 3rd in the open-access journal PLOS Biology by Xing Tian, of New York University Shanghai, China, and colleagues.
Patients with certain mental disorders, including schizophrenia, often hear voices in the absence of sound.
Patients may fail to distinguish between their own thoughts and external voices, resulting in a reduced ability to recognise thoughts as self-generated.
In the new study, researchers carried out electroencephalogram (EEG) experiments measuring the brain waves of twenty patients diagnosed with schizophrenia with auditory hallucinations and twenty patients diagnosed with schizophrenia who had never experienced such hallucinations.
In general, when people are preparing to speak, their brains send a signal known as “corollary discharge” that suppresses the sound of their own voice.
However, the new study showed that when patients with auditory hallucinations were preparing to speak a syllable, their brains not only failed to suppress these internal sounds, but had an enhanced “efference copy” response to internal sounds other than the planned syllable.
The authors conclude that impairments in these two processes likely contribute to auditory hallucinations and that targeting them in the future could lead to new treatments for such hallucinations.
The authors add, “People who suffer from auditory hallucinations can ‘hear’ sounds without external stimuli. A new study suggests that impaired functional connections between motor and auditory systems in the brain mediate the loss of ability to distinguish fancy from reality.”
A small University at Buffalo clinical trial has found that at low doses, lithium aspartate is ineffective in treating the fatigue and brain fog that are often persistent features of long COVID; however, a supplemental dose-finding study found some evidence that higher doses may be effective.
Published in JAMA Network Open, the study was led by Thomas J. Guttuso, Jr., MD, professor of neurology in the Jacobs School of Medicine and Biomedical Sciences at UB and a physician with UBMD Neurology.
“It’s a negative study with a positive twist,” Guttuso concludes.
Because long COVID is believed to stem from chronic inflammation and lithium has known anti-inflammatory actions, Guttuso had recommended that a patient of his try low-dose lithium for persistent long COVID symptoms. He was surprised when this patient reported a near full resolution of fatigue and brain fog within a few days of initiating lithium aspartate at 5mg a day.
Relief from symptoms
Based on this single case, Guttuso became interested in lithium aspartate as a potential treatment for long COVID and recommended it to other such patients.
According to Guttuso, 9 of 10 long COVID patients he treated with lithium aspartate 5-15mg a day saw very good benefit in terms of improvements to their fatigue and brain fog symptoms.
“Based on those nine patients, I had high hopes that we would see an effect from this randomized controlled trial,” says Guttuso. “But that’s the nature of research. Sometimes you are unpleasantly surprised.”
The randomised controlled trial showed no benefit from 10-15mg a day of lithium aspartate compared to patients receiving a placebo.
After one patient from the study subsequently increased the lithium aspartate dosage to 40mg a day and experienced a marked reduction in fatigue and brain fog, Guttuso then conducted a dose-finding study to explore whether a higher dose of lithium aspartate might be effective.
The three participants who completed the dose-finding study reported greater declines in fatigue and brain fog at the higher dose of 40-45mg per day. Improvements were most pronounced in the two patients whose blood lithium concentrations reached 0.18 and 0.49mmol/L; a third patient, with a level of 0.10mmol/L, saw only partial improvements.
“This is a very small number of patients, so these findings can only be seen as preliminary,” says Guttuso. “Perhaps achieving higher blood levels of lithium may provide improvements to fatigue and brain fog in long COVID.”
Dosage may be too low
He notes that it is possible the randomised controlled trial showed no benefit because the dose of lithium aspartate used was too low.
“The take-home message is that very low dose lithium aspartate, 10-15 milligrams a day, is ineffective in treating the fatigue and brain fog of long COVID,” says Guttuso. “Perhaps we need to do another randomised controlled trial that uses higher lithium aspartate dosages that achieve blood lithium levels of 0.18-0.50mmol/L to determine if they could be effective.”
An estimated 17 million people have long COVID in the US, and worldwide the number is estimated at 65 million.
“There currently are no evidence-based therapies for long COVID,” says Guttuso. He hopes that the National Institutes of Health will view lithium as worth studying through a trial with higher dosages; the NIH is allocating an additional $500 million to study long COVID therapies that appear to be promising.
Guttuso adds that if a subsequent randomised controlled trial finds that higher dosages of lithium aspartate are effective, long COVID patients would still need to discuss taking it with their health care providers; in addition, he says, if they do begin taking it at higher dosages, blood lithium levels should be monitored.
Oestrogen, the major female ovarian hormone, can trigger nerve impulses within milliseconds to regulate a variety of physiological processes. At Baylor College of Medicine, Louisiana State University and collaborating institutions, researchers discovered that oestrogen’s fast actions are mediated by the coupling of the oestrogen receptor-alpha (ER-alpha) with an ion channel protein called Clic1.
Clic1 controls the fast flux of electrically charged chloride ions through the cell membrane, which neurons use for receiving, conducting and transmitting signals. The researchers propose that interacting with the ER-alpha-Clic1 complex enables oestrogen to trigger fast neuronal responses through Clic1 ion currents. The study appeared in Science Advances.
“Oestrogen can act in the brain to regulate a variety of physiological processes, including female fertility, sexual behaviours, mood, reward, stress response, cognition, cardiovascular activities and body weight balance. Many of these functions are mediated by oestrogen binding to one of its receptors, ER-alpha,” said co-corresponding author Dr Yong Xu, professor of pediatrics – nutrition and associate director for basic sciences at the USDA/ARS Children’s Nutrition Research Center at Baylor.
Fast and slow
It is well known that, upon stimulation by oestrogen, ER-alpha enters the cell nucleus where it mediates the transcription of genes. This classical mode of action as a nuclear receptor takes minutes to hours.
“Oestrogen also can change the firing activity of neurons in a manner of milliseconds, but it was not clear how this happens,” Xu said. “In this case, it did not make sense to us that the minutes-long nuclear receptor function of ER-alpha was involved in such a rapid action. We explored the possibility that ion channels, proteins in the cell membrane that regulate the fast flux of ions, mediated oestrogen’s quick actions.”
In the current study, working with cell lines and animal models, the team searched for cell membrane proteins that interact with ER-alpha. They found that the protein Clic1 (chloride intracellular channel protein 1) can physically interact with ER-alpha. Clic1 has been implicated in the regulation of neuronal excitability, so the researchers considered it a candidate to mediate oestrogen-triggered fast actions.
“We discovered that oestrogen enhances Clic1-mediated ion currents, and eliminating oestrogen reduced such currents,” Xu said. “In addition, Clic1 currents are required for oestrogen to induce rapid responses in neurons. Also, disrupting the Clic1 gene in animal models blunted oestrogen regulation of female body weight balance.”
The findings suggest that other nuclear receptors could also interact with ion channels, a possibility the researchers look forward to studying in the future.
“This study was conducted with female mice. However, Clic1 is also present in males. We are interested in investigating its role in male physiology,” Xu said.
Chloride channels are not as well studied as other ion channels, such as potassium, sodium or calcium channels. “We are among the first to study the role Clic1 plays in female physiology,” Xu said. “We hope that our findings will inspire other groups in the field to expand these promising investigations.”
The effects of sustained drug abuse can manifest in many ways. Loss of memory and reduced cognitive functions are some of the effects that can persist for years. Neurobiologists at the University of California San Diego have now identified a mechanism in the brain that generates drug-induced cognitive impairments.
The researchers investigated how methamphetamine and phencyclidine (PCP or “angel dust”), which take effect by activating different targets in the brain, induce a similar reduction in cognitive ability. How could the same difficulties in memory emerge in response to drugs that trigger different actions in the brain?
The results of this investigation, led by Assistant Project Scientist Marta Pratelli in Professor Nicholas Spitzer’s laboratory, appear in Nature Communications. They showed that meth and PCP caused neurons to change the way they communicate through a process known as neurotransmitter switching.
Neurotransmitter switching is a form of brain plasticity, an evolving area of research investigating how the brain changes function and structure in response to experience. In recent years, Spitzer and his colleagues have also identified roles for neurotransmitter switching in autism spectrum disorder, post-traumatic stress disorder and in exercise.
Examining the cerebral cortex of mice, the investigators found that meth and PCP each caused a switch from the excitatory neurotransmitter glutamate to the inhibitory neurotransmitter GABA (gamma-aminobutyric acid) in the same neurons in the prelimbic region, an area of the frontal cortex involved in executive functions. This switch was linked to a decrease in memory task performance since drug-treated mice performed well in the tasks when the expression of GABA was blocked.
Further experiments showed that even after repeated exposure to the drugs, the researchers were able to reverse this neurotransmitter switch using molecular tools to locally decrease the brain’s electrical activity or using clozapine, an antipsychotic drug. Each of these treatments reversed the memory loss, restoring the performance of mice in the cognitive tasks.
“These results suggest that targeted manipulation of neuronal activity may be used to ameliorate some of the negative effects of repeated drug abuse,” said Pratelli.
In this new study, the researchers found that a drug-induced increase in the release of dopamine, a neurotransmitter involved in reward, and an increase in the electrical activity of neurons in the cerebral cortex, were required to produce the neurotransmitter switch.
“This study reveals a shared and reversible mechanism that regulates the appearance of cognitive deficits upon exposure to different drugs,” said Spitzer.
The researchers note in their paper that a deeper understanding of brain mechanisms tied to loss of memory from drug use could boost prospects for new treatments, not only resulting in therapy for meth and PCP consumption, but for other disorders as well.
Ischaemic and haemorrhagic stroke. Credit: Scientific Animations CC4.0
People who have had a stroke may be more likely to sleep too much or too little compared to those without prior stroke, according to a study published in Neurology®, the medical journal of the American Academy of Neurology. The study does not prove that stroke causes abnormal sleep; it only shows an association.
“Sleeping the right amount is considered essential for ideal brain and heart health,” said study author Sara Hassani, MD, of Duke University School of Medicine in Durham, North Carolina, and member of the American Academy of Neurology. “We know that abnormally long or short sleep after stroke can affect recovery and deteriorate quality of life, so these results should prompt us to screen for these issues and look at how we can help people improve their sleep habits.”
The study involved 39 559 participants, of whom 1572 had a history of stroke and 37 987 did not. Every two years, participants were asked how much sleep they usually get at night on weekdays or workdays. Sleep duration was divided into three categories: short, less than six hours; normal, six to eight hours; and long, more than eight hours.
Researchers looked at how often participants had normal sleep, defined as six to eight hours. Normal sleep duration was less common among people who had had a stroke than among those with no prior stroke across all age groups: 32% vs 54% for people aged 18-44; 47% vs 55% for people aged 45-64; and 45% vs 54% for people over 65.
After adjusting for factors that could affect sleep such as age, weight and high blood pressure, researchers found people who had a stroke were 54% more likely to report more than eight hours of sleep per night compared to those without stroke. Those with stroke were 50% more likely to get less than six hours of sleep per night when compared to those without stroke.
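For illustration, the study's three sleep bins and the reported shares of normal sleepers can be sketched in a few lines of Python. This is a minimal sketch: the exact handling of the 6- and 8-hour boundaries is our assumption, as the article states the bins only loosely.

```python
def sleep_category(hours: float) -> str:
    """Bin self-reported nightly sleep using the study's cut-offs:
    short (<6 h), normal (6-8 h), long (>8 h).
    Boundary handling at exactly 6 and 8 hours is an assumption."""
    if hours < 6:
        return "short"
    if hours <= 8:
        return "normal"
    return "long"

# Shares of participants reporting normal (6-8 h) sleep, by age band,
# as given in the article (stroke vs no prior stroke).
normal_sleep_share = {
    "18-44": {"stroke": 0.32, "no_stroke": 0.54},
    "45-64": {"stroke": 0.47, "no_stroke": 0.55},
    "65+":   {"stroke": 0.45, "no_stroke": 0.54},
}
```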
“In previous research, stroke has been linked to abnormal sleep, in particular sleep apnea,” said Hassani. “Conditions like insomnia and excessive sleepiness are common in stroke patients and may occur as a direct or indirect consequence of stroke itself. Future research should explore the links between stroke and duration of sleep and determine the effect of sleep duration on outcomes after stroke.”
One limitation of the study was that hours of sleep were self-reported, so participants may not have remembered accurately how much they slept.
Cognionics, founded by bioengineering alumnus Mike Yu Chi, has developed a wearable EEG headset that’s comparable to state of the art laboratory equipment. Credit: UC San Diego
Since the first recording in July 1924, human electroencephalography (EEG) has been integral to our understanding of brain function and dysfunction: most significantly in the clinical diagnosis of epilepsy, where the analysis of the EEG signal meant that a condition previously seen as a personality disorder was quickly redefined as a disorder of brain activity.
Now, a century on, more than 500 experts from around the globe have been asked to reflect on the impact of this groundbreaking methodology, as well as on the challenges and priorities for the future.
A survey led by University of Leeds academics saw respondents – who between them had 6685 years of experience – presented with possible future developments for EEG, ranging from those deemed ‘critical to progress’ to the ‘highly improbable’, and asked to estimate how long it might be before each was achieved. The results are published in the journal Nature Human Behaviour.
Futuristic innovations
The list features an array of fascinating, futuristic innovations that experts believe could be achieved within a generation. This includes using EEG to enhance cognitive performance; early detection of learning disabilities; widespread use as a lie detector; and use as a primary communication tool for those with severe motor disabilities and locked-in syndrome.
Real-time, reliable diagnosis of brain abnormalities such as seizures or tumours is believed to be just 10-14 years away, while reading the content of dreams and long-term memories is judged by some experts to be more than 50 years away – and dismissed by many as closer to science fiction than reality.
It may be surprising to many that, according to the survey, within a generation we could all be carrying around our own, personal, portable EEG.
The paper’s co-author Dominik Welke, Research Fellow in Leeds’ School of Psychology, said: “They could really become something like a smartphone: where almost everybody has access to them and can use them daily – ideally improving their life by providing meaningful insight into physiological factors.”
He added: “One such positive, potential future use of EEG technology could be vigilance control for drivers or pilots. These work-safety systems could assist the user in identifying if they were falling asleep, then wake them up or tell the co-pilot they need to take over.”
The hardware involved in recording EEG is relatively basic, remaining unchanged – in principle – since it was first used by psychiatrist Hans Berger in Germany on July 6, 1924. What has drastically changed since then is the analysis of – and what we can do with – the now digitally-recorded data.
Consisting of just electrodes and an amplifier, EEG systems are becoming increasingly cheap to produce, as well as more portable and user-friendly. Coupled with its non-invasive nature, there is little to prevent it from becoming more accessible to a wider audience.
Reducing health inequalities
While the prospect of EEG technology being widely used in gaming and VR – predicted to be only around 20 years away – will thrill gamers, the truly exciting possibility for scientists and clinicians is that this increasing accessibility will allow them to engage with communities traditionally excluded from EEG research, crucially, in low-income countries that cannot afford more complex imaging technology.
Advances in AI-driven automation are also expected to improve and speed up analysis of complicated data.
Dr Welke said: “Looking ahead to the future: from the hardware side, it’s comparatively cheap and easy to produce, and from the analysis and software side, with these new computing technologies, all the puzzle pieces are there to really roll out EEG to a very large user base.
“As opposed to other methods out there – such as MRI, or implanted devices – EEG has the potential to make neuroimaging available to all the people in the world.”
Co-author Faisal Mushtaq, Professor of Cognitive Science and Director of the Centre for Immersive Technologies at the University of Leeds, said: “EEG stands out as the most cost-effective and logistically feasible neuroimaging tool for worldwide use across diverse settings. This would help build a neuroscience that is inclusive and representative of the global population.”
He added: “Our partners at the Global Brain Consortium are laying the foundations for increasing reach in this way and I expect this will unlock new opportunities for groundbreaking discoveries on the mechanisms of brain function.”
Ethical questions
Alongside the optimism that emerging technologies are opening exciting new possibilities for EEG, the experts consulted also sounded a note of caution, with concerns that ranged from a lack of adherence to agreed standards and protocols to ethical questions created by novel commercial applications and the lure of ‘neuroenhancement’.
Dr Welke said: “I’m sure some of the multi-national tech companies might be very interested in rolling out EEG or other neuroimaging technology, just to get more information on their users that hints at their preferences and emotions 24 hours a day. But should it be used in this way?
“There are obvious concerns around cognitive freedom and mental privacy. This feeds back into the importance of ‘responsibility’ – the fact that new ways of using a technology are also likely to raise new ethical questions.”
Another objective of the survey was to identify the priorities of the EEG community for guiding future efforts. Participants rated how important major developments and advancements in various domains of EEG research would be to their work.
Professor Mushtaq said: “I think that EEG, when combined with technologies such as AI and virtual reality, could radically transform the ways in which we interact with machines, and in doing so, play an extremely important role in science and society over the next 100 years.
“But to ensure this, the neuroscience community—from academic, clinical and industry settings—must commit to promoting robust, ethical, inclusive, and sustainable practices that will help realise its enormous potential.”
The work was conducted by more than 90 authors, ranging from early career researchers to eminent figures in the field, collectively known as the EEG100 consortium.
It started out as a partnership between #EEGManyLabs – an international network of researchers from more than 30 countries assessing the replicability of the results of some of the most important and influential EEG experiments of psychological phenomena – and the Global Brain Consortium, a diverse network of brain researchers, clinicians and institutions committed to achieving improved and more equitable health outcomes worldwide.
“There are hurdles to overcome to employ EEG at a global scale, but by doing so, we can hopefully improve millions more lives.”
Dr Sadhana Sharma, Head of Bioscience for Health Strategy at the Biotechnology and Biological Sciences Research Council (BBSRC) – which funded the paper’s lead authors – said: “EEG technology has the potential to transform our day-to-day activities and how we diagnose and treat neurological conditions in the future, ensuring that insights into brain health are accessible to diverse populations worldwide.
“As we embrace developments in bioscience, our focus remains on fostering interdisciplinary collaborations that drive ethical, equitable and impactful advancements in brain science on a global scale.”
For decades, one of the most fundamental and vexing questions in neuroscience has been: what is the physical basis of consciousness in the brain? Most researchers favour classical models, based on classical physics, while a minority have argued that consciousness must be quantum in nature, and that its brain basis is a collective quantum vibration of ‘microtubule’ proteins inside neurons.
New research from Wellesley College published in eNeuro has yielded important experimental results relevant to this debate, by examining how anaesthesia affects the brain of rat models. Volatile anaesthetics are currently believed to cause unconsciousness by acting on one or more molecular targets including neural ion channels, receptors, mitochondria, synaptic proteins, and cytoskeletal proteins.
Anaesthetic gases including isoflurane bind to cytoskeletal microtubules (MTs) and dampen their quantum optical effects, potentially contributing to causing unconsciousness. This idea is supported by the observation that taxane chemotherapy, consisting of MT-stabilising drugs, reduces anaesthesia effectiveness during surgery in human cancer patients.
Lead researcher Professor Mike Wiest and his research team found that when they gave rats the brain-penetrant MT-stabilising drug epothilone B (epoB), the rats took significantly longer – 69 seconds longer, on average – to fall unconscious under 4% isoflurane, as measured by loss of righting reflex (LORR).
The effect could not be accounted for by tolerance from repeated exposure to isoflurane.
Their results suggest that binding of the anaesthetic gas isoflurane to MTs contributes to unconsciousness and loss of purposeful behaviour in rats (and presumably in humans and other animals), supporting the idea that consciousness is a quantum state tied to MTs.
“Since we don’t know of another (ie, classical) way that anesthetic binding to microtubules would generally reduce brain activity and cause unconsciousness,” Wiest says, “this finding supports the quantum model of consciousness.”
It’s hard to overstate the significance of the classical/quantum debate about consciousness, says Wiest, an associate professor of neuroscience at Wellesley. “When it becomes accepted that the mind is a quantum phenomenon, we will have entered a new era in our understanding of what we are,” he says. The new approach “would lead to improved understanding of how anaesthesia works, and it would shape our thinking about a wide variety of related questions, such as whether coma patients or non-human animals are conscious, how mysterious drugs like lithium modulate conscious experience to stabilize mood, how diseases like Alzheimer’s or schizophrenia affect perception and memory, and so on.”
More broadly, a quantum understanding of consciousness “gives us a world picture in which we can be connected to the universe in a more natural and holistic way,” Wiest says. Wiest plans to pursue future research in this field, and hopes to explain and explore the quantum consciousness theory in a book for a general audience.
The impact of concussion while playing sport is different in those who don’t play professionally, says new research.
Sports-related concussions (SRC) may not be associated with long-term cognitive risks for non-professional athletes, a study led by a UNSW medical researcher suggests. In fact, study participants who had experienced an SRC had better cognitive performance in some areas than those who had never suffered a concussion, pointing to potential protective effects of sports participation.
Published in the Journal of Neurology, Neurosurgery and Psychiatry, the research reveals that individuals who reported experiencing any SRC during their lifetime had a marginally better cognitive performance than those who reported no concussions.
The study, a collaboration between researchers at UNSW Sydney, the University of Oxford, the University of Exeter and Harvard University, analysed data from more than 15 000 participants from the UK-based PROTECT study of 50- to 90-year-olds. This ongoing research aims to understand brain ageing and cognitive decline.
“Our findings suggest that there is something about playing sport, even though a person may experience concussion, that may be beneficial for long-term cognitive outcomes,” says lead author Dr Matt Lennon MD, PhD, at UNSW Medicine & Health.
“While it may be that those who play sports have had access to better education and more resources, we controlled for these factors in the analysis, so that doesn’t explain the result. We hypothesise that there may be physical, social and long-term behavioural effects of sport that may make for healthier adults in late-life,” said Dr Lennon.
Largest study of long-term effects of sports concussions
The study is the largest to date examining the long-term cognitive effects of SRC. Researchers collected lifetime concussion histories from 15 214 participants using the Brain Injury Screening Questionnaire. Among them, 6227 (39.5%) reported at least one concussion and 510 (3.2%) at least one moderate-severe concussion. On average, participants had suffered their last head injury 29 years before the study and their first head injury 39 years before.
Researchers then compared cognitive function among individuals with 0, 1, 2 and 3+ SRCs and 0, 1, 2 and 3+ non-sports-related concussions (nSRCs), i.e. those from falls, car accidents, assaults and other causes. The SRC group scored 4.5 percentile ranks higher on working memory than those who hadn’t experienced an SRC, and 7.9% higher on reasoning capacity than those without concussions.
Those with one SRC also had better verbal reasoning and attention compared to those with no SRC.
Conversely, participants with 3+ nSRCs – from causes such as accidents and assaults – had worse processing speed and attention, and a declining trajectory of verbal reasoning with age.
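The 0/1/2/3+ grouping used in the comparisons above can be sketched as follows; the participant records here are hypothetical, purely to show the banding, not study data.

```python
from collections import Counter

def concussion_band(count: int) -> str:
    """Collapse a lifetime concussion count into the study's 0/1/2/3+ bands."""
    return str(count) if count < 3 else "3+"

# Hypothetical participant records: (sports-related, non-sports-related)
# lifetime concussion counts. Illustrative only.
participants = [(0, 0), (1, 0), (2, 1), (4, 0), (0, 3)]

src_bands = Counter(concussion_band(src) for src, _ in participants)
nsrc_bands = Counter(concussion_band(nsrc) for _, nsrc in participants)
```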
“This study suggests that there could be long term benefits from sport which could outweigh any negative effects of concussions, which could have important implications for policy decisions around contact sport participation. It may also be that non-sports related head injuries lead to greater brain damage than sports-related concussions,” said senior author Professor Vanessa Raymont from the University of Oxford and Oxford Health NHS Foundation Trust.
The researchers say the study had some limitations.
“The retrospective design of the study, with elderly participants often recalling details of events over three decades in the past, may have affected the reporting of head injuries, even though we used a well-validated head injury screening tool,” said Prof. Raymont.
Study implications
The study looked at mid-to-late-life people who experienced SRC years earlier, whereas most other studies on SRC focus on younger athletes in the immediate period after their head injuries, where cognitive effects are most salient.
“While these results do not indicate the safety of any sport in particular, they do indicate that overall sports may have greater beneficial effects for long-term cognitive health than the damage it causes, even in those who have experienced concussion,” said Dr Lennon.
“This finding should not be overstated – the beneficial effects were small, and in people who had two or more sports-related concussions there was no longer any benefit. Additionally, this study does not apply to concussions in professional athletes, whose head injuries tend to be more frequent, debilitating and severe.”
Anne Corbett, Professor at Exeter University and the lead investigator of the PROTECT study, said: “What we see emerging is a completely different profile of brain health outcomes for people who have concussions as a result of sport compared to those that are not related to sport. Concussions that occur during sport do not lead to brain health concerns whereas other concussion types do, especially when people experience multiple concussions. In fact, people who take part in sport seem to have better brain health regardless of whether they have had a concussion whilst taking part or not.”
Types of tumour cells. Credit: Scientific Animations CC4.0
A Ludwig Cancer Research study has discovered that recurrent tumours of the aggressive brain cancer glioblastoma multiforme (GBM) grow out of the fibrous scars of malignant predecessors destroyed by interventions such as radiotherapy, surgery and immunotherapy.
Led by Ludwig Lausanne’s Johanna Joyce, Spencer Watson and alumnus Anoek Zomer and published in the current issue of Cancer Cell, the study describes how these scars enable the regrowth of tumours and identifies drug targets to sabotage their malignant support. It also demonstrates the efficacy of such combination therapies in preclinical trials using mouse models of GBM.
“We’ve identified fibrotic scarring as a key source of GBM resurgence following therapy, showing how it creates a protective niche for the regrowth of the tumor,” said Joyce. “Our findings suggest that blocking the process of scarring in the brain by adding anti-fibrosis agents to current treatment strategies could help prevent glioblastoma from recurring and improve the outcomes of therapy.”
There is a great need for such interventions. GBM is the most common and aggressive form of brain cancer in adults. Despite considerable effort to develop effective therapies for the cancer, the average life expectancy of patients remains around 14 months following diagnosis.
The origins of the current study date back to 2016, when the Joyce lab reported in the journal Science its examination in mouse models of strategies to overcome resistance to a promising immunotherapy for the treatment of GBM. That experimental therapy, which inhibits signalling by the colony stimulating factor-1 receptor (CSF-1R) and is currently in clinical trials, targets immune cells known as macrophages and their brain-resident counterparts, microglia, both of which are manipulated by GBM cells to support tumour growth and survival.
The Joyce lab has demonstrated that CSF-1R inhibition reprograms these immune cells into an anti-tumour state and so induces significant tumour regression. Yet, as the Science study showed, about half the mice relapsed following an initial response to the therapy. “What was most remarkable about that observation was that every single time a brain tumour recurred following immunotherapy, it regrew right next to a scar that had formed at the original site of a tumour,” said Joyce.
In the current study, Joyce, Watson, Zomer and their colleagues examined tumour samples obtained from patients undergoing GBM therapy and showed that fibrotic scarring occurs following therapy in humans as well – and that it is similarly associated with tumour recurrence. They also showed that fibrotic scarring arises not only in response to immunotherapy but also following the surgical removal and irradiation of tumours.
To explore how fibrosis contributes to relapse, the researchers applied an integrated suite of advanced technologies to analyse the cellular and molecular geography of the scars and the microenvironment of resurgent tumours.
These technologies include the analysis of global gene expression in individual cells, the comprehensive analysis of proteins in the tissues, as well as an AI-powered workflow and suite of analytical methods for the spatial analysis of tissues, named hyperplexed immunofluorescence imaging (HIFI). Recently developed by Watson and colleagues in the Joyce lab, HIFI permits the simultaneous visualisation of multiple molecular markers in and around cells across broad cross-sections of tissue, enabling the generation of granular maps of the tumour microenvironment.
“Applied together, these advanced methods allowed us to see exactly how fibrotic scars form,” said Watson. “They revealed that the fibrosis serves as a kind of protective cocoon for residual cancer cells and pushes them into a dormant state in which they are largely resistant to therapy. We found that it also shields them from surveillance and elimination by the immune system.”
Integrated analyses of the tissue microenvironment following therapy revealed that descendants of cells associated with tumour-feeding blood vessels become functionally altered to resemble fibroblasts, the fibre-producing cells commonly involved in wound healing. These perivascular-derived fibroblast-like (PDFL) cells fan out across the region previously occupied by the regressing tumour, where they mediate the generation of fibrotic scars. These cells, the researchers found, are especially activated by neuroinflammation and by immune factors known as cytokines, most notably transforming growth factor-β (TGF-β).
“To see if targeting fibrotic scarring could improve therapeutic outcomes for GBM, we devised a treatment regimen using existing drugs to block TGF-β signaling and suppress neuroinflammation in combination with CSF-1R inhibition and evaluated it in preclinical trials using mouse models of GBM,” said Joyce. “We also timed these additional treatments to coincide with the period of maximal PDFL activation identified by our studies. Our results show that the drug combination inhibited fibrotic scarring, diminished the numbers of surviving tumor cells and extended the survival of treated mice compared to controls.”
The researchers suggest that approaches to limit fibrotic scarring could significantly improve outcomes for GBM patients receiving surgical, radiation or macrophage-targeting therapies. Additional research, they note, will likely yield even better drug targets for such combination therapies.
View of the spinal cord. Credit: Scientific Animations CC4.0
In a recent study published in Nature, researchers prevented T cells from inflicting the autoimmune damage that normally accompanies spinal cord injury, sparing neurons and successfully aiding recovery in mouse models.
In spinal cord injury, the wound site attracts a host of peripheral immune cells, including T cells, which have both beneficial and deleterious effects. Notably, antigen-presenting cells activate CD4+ T cells to release cytokines, ultimately leading to neuroinflammation and tissue destruction. This neuroinflammation is most pronounced during the acute phase of spinal cord injury. The difficulty is that these same T cells are initially neuroprotective, only later developing autoimmunity and attacking the injury site.
Using single-cell RNA sequencing, the researchers found that CD4+ T cell clones in mice showed antigen specificity towards self-peptides of myelin and neuronal proteins. Such self-peptides have been implicated in a wide range of autoimmune conditions.
Using mRNA techniques, the researchers engineered the T cell receptors so that they shut off after a few days. In mouse models of spinal cord injury, these modified T cells showed notable neuroprotective efficacy, partly by modulating myeloid cells via interferon-γ.
Their findings provide insight into the mechanisms behind the neuroprotective function of injury-responsive T cells. This paves the way for the future development of T cell therapies for central nervous system injuries, and perhaps for neurodegenerative diseases such as Alzheimer’s.