Category: Neurology

In Sleep Deprivation, Attention Lapses Correspond with CSF Cleaning Flushes

New research shows attention lapses due to sleep deprivation coincide with a flushing of fluid from the brain — a process that normally occurs during sleep.

Anne Trafton | MIT News


Nearly everyone has experienced it: After a night of poor sleep, you don’t feel as alert as you should. Your brain might seem foggy, and your mind drifts off when you should be paying attention.

A new study from MIT reveals what happens inside the brain as these momentary failures of attention occur. The scientists found that during these lapses, a wave of cerebrospinal fluid (CSF) flows out of the brain – a process that typically occurs during sleep and helps to wash away waste products that have built up during the day. This flushing is believed to be necessary for maintaining a healthy, normally functioning brain.

When a person is sleep-deprived, it appears that their body attempts to catch up on this cleansing process by initiating pulses of CSF flow. However, this comes at a cost of dramatically impaired attention.

“If you don’t sleep, the CSF waves start to intrude into wakefulness where normally you wouldn’t see them. However, they come with an attentional tradeoff, where attention fails during the moments that you have this wave of fluid flow,” says Laura Lewis, the Athinoula A. Martinos Associate Professor of Electrical Engineering and Computer Science, a member of MIT’s Institute for Medical Engineering and Science and the Research Laboratory of Electronics, and an associate member of the Picower Institute for Learning and Memory.

Lewis is the senior author of the study, which appears today in Nature Neuroscience. MIT visiting graduate student Zinong Yang is the lead author of the paper.

Flushing the brain

Although sleep is a critical biological process, it’s not known exactly why it is so important. It appears to be essential for maintaining alertness, and it has been well-documented that sleep deprivation leads to impairments of attention and other cognitive functions.

During sleep, the cerebrospinal fluid that cushions the brain helps to remove waste that has built up during the day. In a 2019 study, Lewis and colleagues showed that CSF flow during sleep follows a rhythmic pattern in and out of the brain, and that these flows are linked to changes in brain waves during sleep.

That finding led Lewis to wonder what might happen to CSF flow after sleep deprivation. To explore that question, she and her colleagues recruited 26 volunteers who were tested twice — once following a night of sleep deprivation in the lab, and once when they were well-rested.

In the morning, the researchers monitored several different measures of brain and body function as the participants performed a task that is commonly used to evaluate the effects of sleep deprivation.

During the task, each participant wore an electroencephalogram (EEG) cap that could record brain waves while they were also in a functional magnetic resonance imaging (fMRI) scanner. The researchers used a modified version of fMRI that allowed them to measure not only blood oxygenation in the brain, but also the flow of CSF in and out of the brain. They also measured each subject’s heart rate, breathing rate, and pupil diameter.

The participants performed two attentional tasks while in the fMRI scanner, one visual and one auditory. For the visual task, they had to look at a screen that had a fixed cross. At random intervals, the cross would turn into a square, and the participants were told to press a button whenever they saw this happen. For the auditory task, they would hear a beep instead of seeing a visual transformation.

Sleep-deprived participants performed much worse than well-rested participants on these tasks, as expected. Their response times were slower, and for some of the stimuli, the participants never registered the change at all.

During these momentary lapses of attention, the researchers identified several physiological changes that occurred at the same time. Most significantly, they found a flux of CSF out of the brain just as those lapses occurred. After each lapse, CSF flowed back into the brain.

“The results are suggesting that at the moment that attention fails, this fluid is actually being expelled outward away from the brain. And when attention recovers, it’s drawn back in,” Lewis says.

The researchers hypothesise that when the brain is sleep-deprived, it begins to compensate for the loss of the cleansing that normally occurs during sleep, even though these pulses of CSF flow come with the cost of attention loss.

“One way to think about those events is because your brain is so in need of sleep, it tries its best to enter into a sleep-like state to restore some cognitive functions,” Yang says. “Your brain’s fluid system is trying to restore function by pushing the brain to iterate between high-attention and high-flow states.”

A unified circuit

The researchers also found several other physiological events linked to attentional lapses, including decreases in breathing and heart rate, along with constriction of the pupils. They found that pupil constriction began about 12 seconds before CSF flowed out of the brain, and pupils dilated again after the attentional lapse.
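The timing relationship described above, with one signal leading another by a fixed interval, is the kind of thing routinely estimated by cross-correlating two physiological time series and locating the peak. The sketch below is a generic illustration of that approach, not the study's actual analysis pipeline:

```python
import numpy as np

def lag_seconds(x: np.ndarray, y: np.ndarray, fs: float) -> float:
    """Estimate how many seconds signal y lags behind signal x.

    Both signals are z-scored and cross-correlated, and the offset of
    the correlation peak is converted from samples to seconds. A
    positive value means events in y occur after those in x.
    """
    x = (x - x.mean()) / x.std()
    y = (y - y.mean()) / y.std()
    corr = np.correlate(y, x, mode="full")              # full cross-correlation
    lag_samples = int(np.argmax(corr)) - (len(x) - 1)   # zero lag sits at index len(x)-1
    return lag_samples / fs
```

With pupil-diameter samples as `x` and a CSF-flow signal as `y`, a result of about +12 would mirror the 12-second lead reported above (a hypothetical use, since the study's data are not reproduced here).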

“What’s interesting is it seems like this isn’t just a phenomenon in the brain, it’s also a body-wide event. It suggests that there’s a tight coordination of these systems, where when your attention fails, you might feel it perceptually and psychologically, but it’s also reflecting an event that’s happening throughout the brain and body,” Lewis says.

This close linkage between disparate events may indicate that there is a single circuit that controls both attention and bodily functions such as fluid flow, heart rate, and arousal, according to the researchers.

“These results suggest to us that there’s a unified circuit that’s governing both what we think of as very high-level functions of the brain — our attention, our ability to perceive and respond to the world — and then also really basic fundamental physiological processes like fluid dynamics of the brain, brain-wide blood flow, and blood vessel constriction,” Lewis says.

In this study, the researchers did not explore what circuit might be controlling this switching, but one good candidate, they say, is the noradrenergic system. Recent research has shown that this system, which regulates many cognitive and bodily functions through the neurotransmitter norepinephrine, oscillates during normal sleep.

The research was funded by the National Institutes of Health, a National Defense Science and Engineering Graduate Research Fellowship, a NAWA Fellowship, a McKnight Scholar Award, a Sloan Fellowship, a Pew Biomedical Scholar Award, a One Mind Rising Star Award, and the Simons Collaboration on Plasticity in the Aging Brain.

This story is republished courtesy of MIT News (web.mit.edu/newsoffice/), a popular site that covers news about MIT research, innovation and teaching.

Source: MIT News

New Brain Imaging Technique Can Detect Early Frontotemporal Dementia


A new international study led by researchers at Karolinska Institutet demonstrates that it is possible to detect subtle changes in the brain and identify early signs of hereditary frontotemporal dementia using advanced brain imaging techniques. The study has recently been published in Molecular Psychiatry.

Frontotemporal dementia, or FTD, is a neurodegenerative disease that often affects people in middle age and is a common cause of dementia before the age of 65. The disease is particularly difficult to diagnose in its early stages: the earliest symptoms are behavioural changes that may resemble primary psychiatric disease, and later symptoms can resemble conditions such as Alzheimer’s disease and Parkinson’s disease. In about a third of cases, frontotemporal dementia is hereditary, making families with known mutations an important resource for research.

New type of MRI technique

In the current study, researchers from Karolinska Institutet, together with an international research network, examined the brain’s microstructure in over 700 individuals – both carriers of FTD mutations and control subjects. The researchers used a new type of MRI technique that measures how water molecules spread within the grey matter of the brain, where greater diffusion indicates microstructural damage to brain tissue. In this way, the technique can reveal early damage in the cerebral cortex before the brain begins to shrink (a process known as brain atrophy) or cognitive problems arise.
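To make the measurement concrete: water displacement in diffusion MRI is commonly summarised by the mean diffusivity of a 3×3 diffusion tensor, the average of its three eigenvalues (equivalently, one third of its trace). This is the generic textbook quantity, not necessarily the exact metric the study reported:

```python
import numpy as np

def mean_diffusivity(tensor: np.ndarray) -> float:
    """Mean diffusivity (MD) of a 3x3 diffusion tensor.

    MD is the average of the tensor's three eigenvalues, which equals
    one third of its trace (units: mm^2/s). Elevated MD in grey matter
    indicates freer water diffusion, a sign of microstructural damage.
    """
    assert tensor.shape == (3, 3)
    return float(np.trace(tensor)) / 3.0
```

For a tensor with eigenvalues 1, 2 and 3 (in units of 10⁻³ mm²/s), MD is their average, 2×10⁻³ mm²/s.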

The results revealed that the new method is more sensitive than the established imaging technique that measures the thickness of the cerebral cortex. Among individuals with a mutation in the C9orf72 gene, the researchers could detect changes in the brain even before any clinical symptoms appeared. For mutations in the MAPT gene, changes were observed at mild symptom stages, whereas for carriers of GRN mutations, alterations emerged only at more advanced stages.

Identifying individuals at risk

“Our findings show that changes in the brain’s microstructure can be detected before visible brain atrophy, and these changes are closely linked to how the disease develops,” explains corresponding author Elena Rodriguez-Vieitez, researcher at the Department of Neurobiology, Care Sciences and Society, Karolinska Institutet.

“This could be valuable for identifying individuals at risk and for evaluating new therapies in clinical trials.”

The researchers also followed the participants over time and showed that a greater spread of water molecules in brain tissue at the start of the study was linked to a faster decline in behaviour and cognitive ability. This was true for all three mutation types.

“Our results suggest that measurements of the brain’s microstructure could become an important tool for identifying individuals at risk of frontotemporal dementia and for monitoring disease progression in clinical trials,” says Caroline Graff, professor at the same department and last author of the study.

Source: Karolinska Institutet

New Laser System Measures Scalp and Brain Blood Flow

This optical measurement could offer an affordable and scalable way to diagnose stroke, brain injury and other conditions

Experimental arrangement of the SCOS system for measuring cerebral blood dynamics during superficial temporal artery (STA) occlusion. (a) 3D visualization of the SCOS device positioned over the temple region and the occlusion site near the ear bone. (b) Top and lateral views of the device, illustrating different detecting channels for sensing the scalp, skull, and brain layers. Credit: Liu et al., APL Bioengineering, 2025

Measuring blood flow in the brain is critical for responding to a range of neurological problems, including stroke, traumatic brain injury (TBI) and vascular dementia. But existing techniques, including magnetic resonance imaging and computed tomography, are expensive and therefore not widely available.

Researchers from the USC Neurorestoration Center and the California Institute of Technology (Caltech) have built a simple, noninvasive alternative. The device takes a technique currently used in animal studies known as speckle contrast optical spectroscopy (SCOS) and adapts it for potential clinical use in humans. It works by capturing images of scattered laser light with an affordable, high-resolution camera.

“It’s really that simple. Tiny blood cells pass through a laser beam, and the way the light scatters allows us to measure blood flow and volume in the brain,” said Charles Liu, MD, PhD, professor of clinical neurological surgery, urology and surgery at the Keck School of Medicine of USC, director of the USC Neurorestoration Center and co-senior author of the new research.
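The scattering principle Liu describes has a standard quantitative form: within small windows of the camera image, the speckle contrast K (standard deviation of intensity divided by its mean) drops as moving blood cells blur the speckle pattern during the exposure, so 1/K² serves as a simple blood-flow index. The following is an illustrative sketch of that textbook calculation for a single 2-D intensity frame, not the team's actual processing code:

```python
import numpy as np

def speckle_contrast(frame: np.ndarray, window: int = 8) -> np.ndarray:
    """Per-tile spatial speckle contrast K = std / mean.

    The frame is split into non-overlapping window x window tiles.
    Static scatterers give fully developed speckle (K near 1); motion
    during the exposure blurs the pattern and lowers K.
    """
    frame = frame.astype(float)
    th, tw = frame.shape[0] // window, frame.shape[1] // window
    tiles = frame[: th * window, : tw * window].reshape(th, window, tw, window)
    return tiles.std(axis=(1, 3)) / tiles.mean(axis=(1, 3))

def blood_flow_index(frame: np.ndarray, window: int = 8) -> np.ndarray:
    """Simple flow index: faster flow -> lower K -> higher 1 / K^2."""
    k = speckle_contrast(frame, window)
    return 1.0 / np.maximum(k, 1e-12) ** 2
```

Simulated fully developed speckle (exponentially distributed intensities) gives K close to 1, while blurring from motion pushes K down and the flow index up.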

The device has already been tested with humans in small proof-of-concept studies demonstrating the tool’s utility for assessing stroke risk and detecting brain injury. In the current study, published in APL Bioengineering, Liu and his team sought to confirm that SCOS is truly measuring blood flow in the brain, rather than in the scalp, which also contains many blood vessels. The question has long plagued researchers who use light-based technology to visualize the brain.

Liu’s team took an innovative approach: by temporarily blocking blood flow to the scalp, they confirmed that SCOS readings were indeed measuring signals from blood vessels in the brain. Readings from 20 participants showed that positioning the detector at least 2.3cm away from the laser source provided the clearest measurement of brain blood flow. The study was funded in part by the National Institutes of Health, the Alfred Mann Foundation and the USC Neurorestoration Center.

“For the first time in humans, this experimental evidence shows that a laser speckle optical device can probe beyond the scalp layers to access cerebral signals,” said Simon Mahler, PhD, who is now an assistant professor in the Department of Biomedical Engineering at the Stevens Institute of Technology and one of the paper’s coauthors. “This is an important step toward using SCOS to non-invasively measure blood flow in the brain.”

Tracking brain blood flow

For years, researchers measuring brain signals with light-based technology, such as lasers and fibre optics, have used statistical simulations to estimate which signals originate in the brain versus the scalp. The USC Neurorestoration Center team found a direct way to test the difference, thanks to a collaboration between surgeons, engineers and neurologists.

“I perform surgeries to increase blood flow in the brain, and many of these involve temporarily stopping blood flow in the scalp,” said Jonathan Russin, MD, now professor and chief of neurosurgery at the University of Vermont, who continues to collaborate with the USC Neurorestoration Center. “That gave us a simple way to test the technology – by creating a change that affected only the scalp’s circulation while leaving the brain’s blood flow untouched.”

In 20 participants, the researchers temporarily stopped blood flow to the scalp, then collected a series of SCOS readings. By gradually moving the detector further from the head, they captured signals reaching progressively deeper towards the brain. They found that positioning the detector 2.3cm from the head allowed them to measure brain blood flow while minimising interference from the scalp.

The findings confirm the utility of SCOS for non-invasively detecting brain blood flow and provide important guidance for other researchers working with light-based technology, Liu said.

Bringing SCOS to patients

Beyond advancing research, the study helps confirm the clinical potential of SCOS for detecting and responding to stroke, brain injury and dementia. Because all of the team’s research has been done with humans, the tool is poised for rapid translation from the lab to the clinic.

“We look directly at humans in essentially the same way the tool will be applied, so there’s nothing lost in translation,” Liu said. “We are never more than one step away from the problem we’re trying to solve.”

The technique is already being used by some of the team’s collaborators to help diagnose stroke and TBI. Next, the researchers will continue to refine the technology and software, working to improve the resolution of images and the quality of data extracted from readings.

“With the knowledge that we’re now measuring exactly what we intend to measure, we’re also going to expand our testing of this technique with patients in clinical settings,” Liu said.

Source: Keck School of Medicine of USC

Fibroblasts Have Hidden Powers That Could Heal Brain Injuries

A mouse brain cortex seven days after a stroke that caused injury. Fibroblasts (green) have created collagen (pink) to form a protective scar layer around the injury. All images by Molofsky Lab, UCSF

Healing from any injury involves a delicate balance between scarring and inflammation – two processes that can wreak havoc as well as make repairs.

When the injury is to the brain, the balance is that much more important, yet scientists know almost nothing of how this process works.

Now, a study from UC San Francisco spotlights how a cell type called the fibroblast, which plays a healing role in other parts of the body, also performs a similar function in the brain. The discovery is a step toward finding new ways to treat brain injuries, which are the nation’s leading cause of death and disability and for which there are no drugs that can intervene.

Fibroblasts were only identified in the brain in the last decade. They reside mostly in the meninges, a set of protective membranes that surround the brain and spinal cord. Until now, scientists thought they mostly served to maintain the structure of the meninges and its network of blood vessels.

Ari Molofsky, MD, PhD, a professor of laboratory medicine, suspected the fibroblasts might be doing much more than that. He and Tom Arnold, MD, a professor of paediatrics, discovered that when the brain is injured – whether from a blow or a stroke – fibroblasts navigate from the meninges and surround the injured tissue where they create a protective barrier, or scar.

The same injury 14 days after the stroke. The scar now surrounds the whole injury, which is less swollen. Some fibroblasts have returned to their usual location in the meninges. Those that remain have switched roles and are now recruiting immune cells to moderate inflammation.

Then, about a week later, after the scar has formed, the fibroblasts adopt new roles. Some recruit immune cells that are required for healing; others ensure that the immune response doesn’t cause too much inflammation; and still others return to the meninges. Understanding these distinct stages could spur new interventions to help people with serious injuries.

Various views of a mouse brain cortex seven days after a stroke that caused injury. Green dots show fibroblast cells; pink areas show collagen produced by the fibroblasts to create a protective scar layer; and blue shows blood vessels with fibroblasts.

“Our study reveals opportunities to enhance the natural repair process,” said Molofsky, the senior author of the study, which appeared in Nature. “The goal is to give someone who’s experienced a traumatic brain injury or stroke the best outcome possible, based on the stage of healing they’re in.”

Therapies currently in clinical trials for lung and liver fibrosis target a molecule that prompts fibroblasts to create scarring. This suggests that other similar drugs could enhance healing in the early stages of a brain injury.

Molofsky’s study also offers an ideal venue for scientists to learn how fibroblasts are doing their work elsewhere in the body. Being largely devoid of immune cells, the brain offers a much clearer view than other organs like the lungs or liver, where immune cells may be too crowded around fibroblasts to see what they are doing.

“There’s a lot of potential here,” Molofsky said. “These overlooked cells seem adept at solving the common challenge of balancing healing and inflammation.”

Source: University of California – San Francisco

Faster MRI Scans Offer New Hope for Dementia Diagnosis


The time to carry out diagnostic MRI scans for dementia can be cut to one third of their standard length, according to a new study led by UCL researchers.

The findings, published in Alzheimer’s & Dementia, have been described as a step towards ending ‘the postcode lottery in dementia diagnosis’. Shorter scans would be easier and more comfortable for patients and also enable more people to be scanned at a lower cost. The team behind the study say this could at least double the number of dementia scans able to be done in one day.

Senior author Professor Nick Fox, Director of the UCL Dementia Research Centre at the UCL Queen Square Institute of Neurology, said: “As more treatments that can slow or change the course of dementia are being developed, it’s important to make sure MRI scans are available to everyone. This is because people living with dementia often need an MRI scan as part of their diagnosis before they can access these treatments.

“To help make this possible, our team carried out the first study looking at how new imaging techniques – called parallel imaging – could speed up MRI scans in clinics. Our goal is to move closer to a future where every person with dementia can get a diagnosis through a scan.”

MRI scans often play a key role in an accurate dementia diagnosis, including ruling out other causes of symptoms and assisting in diagnosing the type of dementia. Emerging disease-modifying treatments such as lecanemab and donanemab also require an MRI scan before starting treatment and for safety monitoring during the course of treatment. Reducing the cost of scanning would contribute to lowering the total cost of delivering such treatments.

The ADMIRA study (Accelerated Magnetic Resonance Imaging for Alzheimer’s disease), part-funded by Alzheimer’s Society’s Heather Corrie Impact Fund, aimed to understand the reliability of fast MRI scans compared to standard-of-care clinical scans. The neurologists on the study were joined by co-authors from the UCL Hawkes Institute and the UCL Advanced Research Computing Centre in the Faculty of Engineering.

The research team scanned 92 people in an outpatient setting where an MRI brain scan was planned as part of their routine clinical assessment. The accelerated scans were carried out and enhanced to increase the quality of the image using new scanning methods. Three neuroradiologists examined these scans without knowing whether they were looking at fast or standard-of-care scans.

Co-author Professor Geoff Parker (UCL Hawkes Institute and UCL Medical Physics and Biomedical Engineering) said: “Our research has taken advantage of recent breakthroughs in scanner technology. Our task was to work out just how fast we could scan while maintaining image quality good enough for diagnosis.”

The team found that the quicker scans reduced time in the scanner by 63% and they were as reliable as the standard-of-care scans for diagnosis and visual ratings.

First author Dr Miguel Rosa-Grilo (UCL Queen Square Institute of Neurology) said: “We were confident that the new scan would prove non-inferior to the standard scan, given the high image quality – but it was remarkable how well it performed.”

Richard Oakley, Associate Director of Research and Innovation at Alzheimer’s Society, said: “Dementia is the UK’s biggest killer, but one in three people living with the condition haven’t had a diagnosis. An early and accurate diagnosis isn’t just a label, it’s the first step to getting vital care, support and treatment.

“While MRIs aren’t the only way to diagnose dementia, very few people with concerns about their cognitive health are offered one as part of the diagnosis process, mainly because they are expensive and not widely available. These faster MRIs, which take less than half the time of standard scans, could help end this postcode lottery in dementia diagnosis, cut costs and potentially give more people access to them.

“MRI scans can be an uncomfortable and daunting experience for patients, so anything we can do to make it an easier process is really positive.

“So far, this shortened MRI scan has been tested at one specialist centre with one type of MRI scanner, so more research is needed to make sure this works across different types of scanners and a diverse range of people. We’re hugely encouraged by this progress and eager to see how it continues.”

The team will now build on their early results by making sure the approach works across different types of MRI machines, so it can benefit as many hospitals and clinics as possible.

Source: University College London

Quitting Smoking Late in Life May Still Slow Cognitive Decline


Quitting smoking in middle age or later is linked to slower age-related cognitive decline over the long term, according to a new study by UCL researchers.

The study, published in The Lancet Healthy Longevity, looked at data from 9436 people aged 40 or over (average age: 58) in 12 countries, comparing cognitive test results among people who quit smoking with those of a matched control group who kept smoking.

The research team found that the cognitive scores of those who had quit smoking declined significantly less than those of their smoking counterparts in the six years after they quit. For verbal fluency, the rate of decline roughly halved, while for memory it slowed by 20%.

Since slower cognitive decline is related to reduced dementia risk, their findings add to a growing body of evidence suggesting quitting smoking might be a preventative strategy for the disease. Still, more research is needed to confirm this.

Lead author Dr Mikaela Bloomberg (UCL Institute of Epidemiology & Health Care) said: “Our study suggests that quitting smoking may help people to maintain better cognitive health over the long term even when we are in our 50s or older when we quit.

“We already know that quitting smoking, even later in life, is often followed by improvements in physical health and well-being. It seems that, for our cognitive health too, it is never too late to quit.

“This finding is especially important because middle-aged and older smokers are less likely to try to quit than younger groups, yet they disproportionately experience the harms of smoking. Evidence that quitting may support cognitive health could offer compelling new motivation for this group to try and quit smoking.

“Also, as policymakers wrestle with the challenges of an ageing population, these findings provide another reason to invest in tobacco control.”

Smoking is thought to harm brain health in part because it affects cardiovascular health – smoking causes damage to blood vessels that supply oxygen to the brain. Smoking is also thought to affect cognitive health by causing chronic inflammation and directly damaging brain cells through oxidative stress (due to the creation of unstable molecules called free radicals).

Co-author Professor Andrew Steptoe (UCL Institute of Epidemiology & Health Care) said: “Slower cognitive decline is linked to lower dementia risk. These findings add to evidence suggesting that quitting smoking might be a preventative strategy for the disease. However, further research will be needed that specifically examines dementia to confirm this.”

Previous studies, the researchers noted, had found a short-term improvement in cognitive function after people stopped smoking. But whether this improvement was sustained over the longer term – particularly when people quit smoking later in life – was not known.

To answer this question the research team looked at data from three ongoing studies* where a nationally representative group of participants answered survey questions every two years. The studies covered England, the US, and 10 other European countries.

More than 4,700 participants who quit smoking were compared with an equal number of people who carried on smoking. The two groups were matched in terms of their initial cognitive scores and other factors such as age, sex, education level, and country of birth.
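The matched-control design described above can be sketched with a greedy 1:1 nearest-neighbour pairing on a single baseline covariate, such as the initial cognitive score. This is purely illustrative; the study matched on several factors at once:

```python
def greedy_match(quitters: list[float], smokers: list[float]) -> list[tuple[int, int]]:
    """Pair each quitter with the closest still-unmatched smoker.

    Inputs are baseline covariate values; the result is a list of
    (quitter_index, smoker_index) pairs. Each smoker is used at most once,
    so the two groups end up the same size, as in the study.
    """
    pool = list(enumerate(smokers))  # (original index, value); shrinks as we match
    pairs = []
    for qi, q in enumerate(quitters):
        j = min(range(len(pool)), key=lambda k: abs(pool[k][1] - q))
        si, _ = pool.pop(j)
        pairs.append((qi, si))
    return pairs
```

For instance, `greedy_match([1.0, 5.0], [4.9, 1.2, 9.0])` pairs the first quitter with the smoker scoring 1.2 and the second with the one scoring 4.9, leaving the outlier unmatched.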

The research team found that the two groups’ scores in memory and verbal fluency tests declined at a similar rate in the six years before one group quit smoking. These trajectories then diverged in the six years following smoking cessation.

For the smokers who quit, the rate of decline was about 20% slower for memory and 50% slower for verbal fluency. In practical terms, this meant that with each year of ageing, people who quit experienced three to four months less memory decline and six months less fluency decline than those who continued smoking.

This was an observational analysis, so unmeasured differences between smokers who quit and continuing smokers could remain; while the trends before quitting were similar, the study cannot prove cause and effect.

However, the research team noted their findings were consistent with earlier studies showing that adults aged over 65 who quit smoking during early- or mid-life have comparable cognitive scores to never smokers, and that former and never smokers have a similar risk of dementia a decade or longer after quitting.

*The longitudinal studies were the English Longitudinal Study of Ageing (ELSA), the Survey of Health, Ageing and Retirement in Europe (SHARE), and the Health and Retirement Study (HRS).

Source: University College London

Link Between Calcium Supplements and Dementia Debunked by New Research

New research has found no evidence that calcium monotherapy increases the long-term risk for dementia.


New research from Edith Cowan University (ECU), Curtin University and the University of Western Australia has found no evidence that calcium monotherapy increases the long-term risk for dementia, helping to dispel previous concerns about its potential negative effects on brain health in older women.

This study, which leveraged outcomes from prior research that provided calcium supplements or a placebo to 1460 older women over a five-year period, found that the supplement did not increase the long-term risk of dementia.

“Calcium supplements are often recommended to prevent or manage osteoporosis,” said ECU PhD student Ms Negar Ghasemifard.

Around 20 per cent of women over the age of 70 are affected by osteoporosis and calcium supplementation is widely recommended as a preventative measure against fracture.

“Previous research has raised concerns around the impacts that calcium supplements could have on cognitive health, particularly dementia. Results from our study provide reassurance to patients and clinicians regarding the safety of calcium supplements in the context of dementia risk for older women,” Ms Ghasemifard said.

ECU Senior Research Fellow Dr Marc Sim noted that when the analysis was adjusted for supplement compliance, a range of lifestyle factors including dietary calcium intake, and genetic risk, the results remained unchanged.

“Previous research suggesting potential links between calcium supplement use and the risk for dementia was purely observational in nature. Our research, in comparison, consisted of a post-hoc analysis from a five-year double-blind, placebo-controlled randomised clinical trial on calcium supplements to prevent fracture. Whilst our study is still epidemiology, its design does reduce the likelihood of unmeasured confounding.”

“Some 730 older women were given calcium supplements over five years, and a further 730 were given placebo. This study design offers more accurate data on dosage and duration, and we had a long follow-up period of 14.5 years, which strengthens our results,” Dr Sim said.

While these findings may alleviate concerns regarding calcium supplementation and all-cause dementia risk in older women, particularly after the age of 80 years, Professor Simon Laws, Director of ECU’s Centre for Precision Health, said further research was required.

“Whether this extrapolates to other demographics, such as men or even women commencing supplementation earlier in life, remains unknown. To confirm the current findings, particularly regarding brain health, and to address these population gaps, future clinical trials of calcium supplements, with or without vitamin D, would need to be undertaken. These should include specific and robust assessments of brain health as the primary outcome measures.”

Curtin University’s Professor Blossom Stephan, Director of the Dementia Centre of Excellence and a Dementia Australia Honorary Medical Advisor said the research highlighted a very important finding that provides reassurance to clinicians and patients about the long-term safety of calcium supplementation.

“Given calcium’s critical role in multiple physiological functions, including bone health, these results provide reassurance that long-term calcium supplementation did not increase dementia risk in older women,” she said.

Source: Edith Cowan University

New Research Shows that Cancer Can Damage the Myelin Sheath

Myelin sheath damage. Credit: Scientific Animations CC4.0

A new study, published in Nature, shows that cancer cells can damage the myelin sheath that protects nerve fibres, triggering chronic inflammation, and underscores the importance of investigating interactions between cancer and the nervous system – a field known as cancer neuroscience. The results suggest that targeting the signalling pathways involved can reverse this inflammation and improve treatment responses.

“These findings uncover novel mechanisms by which the immune system and nerves within the tumour microenvironment interact, revealing actionable targets that could transform the way we approach resistance to immunotherapy in patients with cancer,” said co-corresponding author Moran Amit, MD, PhD, professor of Head and Neck Surgery. “This marks a significant advance in our understanding of tumour-neuro-immune dynamics, highlighting the importance of investigating the interplay of cancer and neuroscience in meaningful ways that can directly impact clinical practice.”

Tumours can sometimes infiltrate the space around nerves and nervous system fibres that are in close proximity, a process known as perineural invasion, which leads to poor prognosis and treatment escalation in various cancer types. Yet little is known about how this invasion affects or interacts with the immune system.

The study, co-led by Amit, Neil Gross, MD, professor of Head and Neck Surgery, and Jing Wang, PhD, professor of Bioinformatics and Computational Biology, examined the role of perineural invasion and cancer-associated nerve injury in relation to the development of immunotherapy resistance commonly seen in patients with squamous cell carcinoma, melanoma and stomach cancer.

Collaborating with the immunotherapy platform, part of the James P. Allison Institute, the team analysed trial samples using advanced genetic, bioinformatic and spatial techniques. The researchers revealed that cancer cells break down the protective myelin sheaths that cover nerve fibres, and that the injured nerves promote their own healing and regeneration through an inflammatory response.

Unfortunately, this inflammatory response gets caught in a chronic feedback loop as tumours continue to grow, repeatedly damaging nerves, which then recruit and exhaust the immune system, ushering in an immunosuppressive tumour microenvironment that leads to treatment resistance. The study showed that targeting the cancer-induced nerve injury pathway at different points can reverse this resistance and improve treatment response.

Importantly, the authors point out that this reduced neuronal health is directly associated with perineural invasion and cancer-induced nerve injury, rather than a general cancer-induced effect, highlighting the importance of studying cancer-nerve interactions that can potentially contribute to cancer progression.

As part of MD Anderson’s Cancer Neuroscience Program, researchers are investigating scientific themes – such as neurobiology, tumours of the brain and spine, neurotoxicities and neurobehavioural health – to understand how the nervous system and cancer interact and how this affects patients throughout their cancer journey.

Source: University of Texas MD Anderson Cancer Center

Disconnected Hemisphere in Epilepsy Patients Lingers in a Sleep-like State

Surgically isolated, seizure-causing neural tissue shows evidence suggestive of absent or reduced awareness

This metaphorical illustration reimagines a hidden view of the brain of a patient with hemispherotomy. In the absence of subcortical activating inputs, the disconnected cortex defaults to a sleep-like state, marked by slow-wave EEG activity and evocatively represented here as night. By contrast, the intact hemisphere, still integrated with subcortical structures, sustains wakefulness and a bright inner world capable of environmental interaction, reflected in faster EEG rhythms. The image underscores the divergence of cortical states after hemispheric isolation.
Image credit: Michele A. Colombo (CC-BY 4.0)

Sleep-like slow-wave patterns persist for years in surgically disconnected neural tissue of awake epilepsy patients, according to a study published October 16th in the open-access journal PLOS Biology by Marcello Massimini from Università degli Studi di Milano, Italy, and colleagues. The presence of slow waves in the isolated hemisphere impairs consciousness; however, whether they serve any functional or plastic role remains unclear.

Hemispherotomy is a surgical procedure used to treat severe cases of epilepsy in children. The goal of this procedure is to achieve maximal disconnection of the diseased neural tissue, potentially encompassing an entire hemisphere, from the rest of the brain to prevent the spread of seizures. The disconnected cortex – the outer layer of neural tissue in the brain – is not surgically removed and has a preserved vascular supply. Because it is isolated from sensory and motor pathways, it cannot be evaluated behaviourally, leaving open the question of whether it retains internal states consistent with some form of awareness. More broadly, the activity patterns that large portions of the disconnected cortex can sustain in awake humans remain poorly understood.

To address these questions, Massimini and colleagues used electroencephalography (EEG) to measure activity in the isolated cortex during wakefulness before and up to three years after surgery in 10 paediatric patients, focusing on non-epileptic background activity. Following surgery, prominent slow waves appeared over the disconnected cortex, providing novel evidence that this pattern can last for months and years after complete cortical disconnection. The persistence of slow waves raises the question of whether they play any functional role or merely reflect a regression to a default mode of cortical activity.

The pronounced broad-band EEG slowing resembled patterns observed in conditions such as deep non-rapid eye movement (NREM) sleep, general anaesthesia, and the vegetative state. The findings indicate an absent or reduced likelihood of dream-like experiences in the isolated cortex. Overall, the EEG evidence is compatible with a state of absent or reduced awareness.

According to the authors, any inference about the presence or absence of consciousness, based solely on the brain’s physical properties such as prominent EEG slow waves, should be approached with caution, particularly in neural structures that are not behaviourally accessible. The slowing observed at the scalp level should be further characterised with intracranial recordings in cases in which clinical outcomes require postoperative invasive monitoring.

Michele A. Colombo says, “This is only the beginning of shedding light on the problem of consciousness in inaccessible systems. During the revision process, we were confronted with different perspectives, revealing the complexity of this problem.”

Marcello Massimini adds, “This pattern may provide clues to why sleep-like brain activity emerges in patients with brain lesions, and how it relates to their level of awareness.”

Anil K. Seth adds, “This has been an exciting and deeply satisfying scientific journey. It started years ago with philosophical discussions about the possibility of ‘islands of awareness’ in completely isolated neural systems, to, now, this wonderful collaboration which has shed important experimental light on this clinically important issue.”

Tim Bayne finally states, “The study of consciousness involves many puzzling cases in which it is unclear what to say about the possibility of subjective experience. As a philosopher, it’s been deeply rewarding to explore a new frontier in consciousness science with this wonderful team of scientists and clinicians.” 

Provided by PLOS

The Power of Touch: Skin-to-skin Contact Linked to Preemie Brain Growth

Photo by Hush Naidoo on Unsplash

Preterm infants born before 32 weeks who received more skin-to-skin contact while in the hospital showed stronger brain development in areas tied to emotion and stress regulation than babies who received less skin-to-skin care, according to a study published in Neurology®, the medical journal of the American Academy of Neurology. The study can only show an association and cannot establish causation.

“Skin-to-skin contact in preterm infants has been shown to have many benefits, with previous studies linking it to improved bonding, sleep, heart and lung function and growth, as well as reduced pain and stress,” said study author Katherine E. Travis, PhD, of Burke Neurological Institute in White Plains, New York. “Our findings in infants born very preterm suggest skin-to-skin care may also play a role in shaping early brain development, highlighting the potential importance of caregiving experiences during the earliest weeks of a preemie’s life.”

The study included 88 preterm infants with an average gestational age of 29 weeks who weighed an average of 2.65 pounds. The average stay in the hospital was two months. The goal was to find out whether skin-to-skin holding, also called kangaroo care, was linked to brain development in areas that help regulate emotions and stress. Researchers tracked skin-to-skin care with family members throughout each infant’s hospitalisation, including how long each session lasted and the total minutes per day. Families visited an average of once per day. When they provided skin-to-skin care, the average session was around 70 minutes, with 73% of sessions provided by mothers. For the entire hospital stay, the average amount of skin-to-skin care per day was 24 minutes.

Each infant received a brain scan before going home from the hospital – around the time they would have reached full-term age of around 40 weeks. The brain scans measured how water moves through brain tissue. This movement helps reveal how white matter – the brain’s communication network – is developing. Researchers then compared the markers of white matter with the amount of time the preemies received skin-to-skin care per session and per day.

For skin-to-skin duration per session, researchers found longer sessions were linked to higher mean diffusivity – how freely water moves through the brain – in two key white matter tracts: the cingulum, which supports attention and emotion regulation; and the anterior thalamic radiations, which connect areas involved in emotional processing and memory.

Longer sessions were also linked to lower fractional anisotropy – how water movement is influenced by developing cellular tissues – in the anterior thalamic radiations. For daily total minutes of skin-to-skin care, researchers found higher amounts were linked to higher mean diffusivity in the anterior thalamic radiations. They were also linked to lower fractional anisotropy in the anterior thalamic radiations. These associations remained significant even after researchers accounted for factors that could influence brain development, including gestational age at birth, age at time of scan, socioeconomic status and how often family visited.

“Our findings add to growing evidence that white matter development is sensitive to a preterm infant’s experience while in the hospital,” said Travis. “Skin-to-skin care not only provides preterm infants with family connections through bonding, it may also be encouraging new connections within the brain itself, improving a baby’s brain health overall.”

A limitation of the study is that it was conducted at a single hospital and researchers reviewed existing medical records. The authors note that future research should explore how early caregiving experiences – like skin-to-skin care – might shape brain development and support later behavioural outcomes as preterm infants grow.

Source: American Academy of Neurology