Category: Neurology

Babies are Born with a Sense of Rhythm, Study Suggests

Newborns listening to Bach music predicted rhythm, but not melody, according to their brain waves  

Human newborns can predict rhythmic structure from music, while they are not as good at expecting melodic changes. Image credit: Diego Perez-Lopez, PLOS, CC-BY 4.0

Babies are born with the ability to predict rhythm, according to a study published February 5th in the open-access journal PLOS Biology by Roberta Bianco from the Italian Institute of Technology, and colleagues.

Imagine anticipating a beat drop, key change or chorus in a song you’ve never heard. Across all cultures, humans inherently anticipate rhythm and melody. But are babies born with these abilities, or are they learned? Research shows that by approximately 35 weeks of gestation, foetuses begin to respond to music with changes in heart rate and body movements. However, newborns’ ability to anticipate rhythm and melody is not fully understood.

To understand babies’ musical aptitudes, researchers played J.S. Bach’s piano compositions for an audience of 49 sleeping newborns. Musical stylings included 10 original melodies and four shuffled songs with scrambled melodies and pitches. While the babies listened, the researchers used electroencephalography – electrodes placed on the babies’ heads – to measure their brainwaves. When the babies’ brain waves showed signs of surprise, it meant they expected the song to go one way, but it went another.

The newborns tended to show neural signs of surprise when the rhythm unexpectedly changed; in other words, the miniature maestros had generated musical expectations based on rhythm. Previously, this result had been observed in non-human primates. The researchers found no evidence that the newborns tracked melody or were surprised by unexpected melodic changes, a skill that presumably emerges at some point later in development.

According to the authors, understanding how humans become aware of rhythm can help biologists understand how our auditory systems develop. Future studies can investigate how exposure to music during gestation affects acquisition of rhythm and melody.

The authors add, “Are newborns ready for Bach? Newborns come into the world already tuned in to rhythm. Our latest research shows that even our tiniest two-day-old listeners can anticipate rhythmic patterns, revealing that some key elements of musical perception are wired from birth. But there’s a twist: melodic expectations – our ability to predict the flow of a tune – don’t seem to be present yet. This suggests that melody isn’t innate but gradually learned through exposure. In other words, rhythm may be part of our biological toolkit, while melody is something we grow into.”

Provided by PLOS

Physical Pressure on the Brain Triggers Neurons’ Self-destruction Programming

Glioblastoma (astrocytoma) WHO grade IV – MRI sagittal view, post contrast. 15-year-old boy. Credit: Christaras A.

The brain and spinal cord are made up of billions of neurons connected by synapses and managed and modified by glial cells. When neurons die, this communication network is disrupted, and since the loss is irreversible, neuron death causes sensory loss, motor impairment and cognitive decline.

An interdisciplinary team of researchers from the University of Notre Dame is investigating the mechanisms of neuron death caused by chronic compression – such as the pressure exerted by a brain tumour – to better understand how to prevent neuron loss.

Published in the Proceedings of the National Academy of Sciences, their study found that chronic compression triggers neuron death by a variety of mechanisms, both directly and indirectly. The research is helping lay the groundwork for identifying therapies to prevent indirect neuron death.

“The impetus for this project was to figure out those underlying mechanisms. In cancer research, most researchers are focused on the tumour itself, but in the meantime, while the tumour is sitting there and growing, it’s damaging the organ that it’s living in,” said Meenal Datta, the Jane Scoelch DeFlorio Collegiate Professor of Aerospace and Mechanical Engineering at Notre Dame and co-lead author of the study. “We fully believe that these growth-induced mechanical forces of the tumor as it expands is part of the reason we see damage in the brain.”

As an engineer who leads the TIME Lab, Datta studies the mechanics of tumours and the tumour microenvironment, specifically for glioblastoma, an incurable brain cancer. She had found in prior work that tumours damage the surrounding brain. But to understand the mechanisms by which tumours kill neurons from compression alone, Datta needed a “hardcore neuroscientist.”

Imaging of neurons from an experiment, with the control-group neurons on the left and the neurons impacted by chronic compression on the right. (Provided by the Patzke lab.)

That neuroscientist is Christopher Patzke, the John M. and Mary Jo Boler Assistant Professor in the Department of Biological Sciences at Notre Dame and co-lead author of the study. Patzke utilises induced pluripotent stem cells (iPSCs), which are either obtained from external sources or generated directly in his lab. These cells function like embryonic stem cells and can be differentiated or changed in the lab into any cell type in the body, including neurons.

For this study, iPSCs were used to create neural cells and develop a model system of neurons and glial cells that behave as a neuronal network would in the brain. Researchers grew the cells and then applied pressure to the system to mimic the chronic compression of a glioblastoma tumour.

After compressing the cells, graduate students Maksym Zarodniuk and Anna Wenninger, from Datta and Patzke’s labs respectively, compared how many neurons and glial cells died versus lived.

“For the neurons that are still alive, many of them have this programmed self-destruction signaling activated,” Patzke said. “We wanted to understand which molecular pathway was responsible for this; is there a way to save neurons from going down the drain to this cell death mechanism?”

By sequencing and analysing all messenger RNA from the surviving neuronal and glial cells, the researchers found an increase in HIF-1, a molecule that signals stress-adaptive genes to improve cell survival but also leads to inflammation in the brain. The compression also triggered AP-1 gene expression, a type of neuroinflammatory response.

Both neurological reactions are indicators that neuronal damage and death are underway.

An analysis of data from the Ivy Glioblastoma Atlas Project shows that glioblastoma patients also reflect these compressive stress patterns and gene expression changes, as well as synaptic dysfunction in line with the experiment’s results. The researchers confirmed these results by mimicking the force with a live compression system applied to preclinical brain models.

Overall, the findings may help explain why glioblastoma patients experience cognitive impairments, motor deficits and elevated seizure risk. Additionally, the signalling pathways offer opportunities for researchers to explore as drug targets to reduce neuronal death.

“Our approach to this study was disease agnostic, so our research could potentially extend to other brain pathologies that affect mechanical forces in the brain such as traumatic brain injury,” Datta said. “I’m all in on mechanics. Whatever it is that you’re interested in when it comes to cancer, above your question of interest, mechanics is sitting there and many don’t even know they should be considering it.”

The mechanics of compression and their effect on neuron loss are a key question for future research.

“Understanding why neurons are so vulnerable and die upon compression is critical to prevent excessive sensory loss, motor impairment and cognitive decline,” Patzke said. “This is how we will help patients.”

Source: University of Notre Dame

Antidepressants not Linked to Serious Complications from TBI

Photo by Cottonbro on Pexels

Taking certain antidepressants at the time of a traumatic brain injury (TBI) is not associated with an increased risk of death, brain surgery or longer hospital stays, according to a study published on January 28, 2026, in Neurology®, the medical journal of the American Academy of Neurology.

For the study, researchers looked at serotonergic antidepressants, which treat anxiety and depression by increasing serotonin activity in the brain. These included selective serotonin reuptake inhibitors (SSRIs), serotonin and norepinephrine reuptake inhibitors (SNRIs) and tricyclic antidepressants (TCAs).

“Concerns have previously been raised that serotonergic antidepressants might increase the risk of bleeding in the brain or complicate early recovery after traumatic brain injury,” said study author Jussi P. Posti, MD, PhD, of the University of Turku in Finland. “However, our study found no evidence to support those concerns.” The study included 54 876 people in Finland who were 16 or older when hospitalised with a TBI. A total of 14% used serotonergic antidepressants at the time of the TBI.

Researchers reviewed national prescription records for preinjury antidepressant use and medical records to determine how many people died within a month, whether they needed emergency brain surgery, and how long they stayed in the hospital. A total of 4105 people died within a month. This included 7.6% of those taking antidepressants and 7.5% of people who did not. After adjusting for factors such as age, sex and other health conditions, researchers found people taking antidepressants before injury were no more likely to die within a month than those not taking them.

Antidepressant users were slightly less likely to require emergency brain surgery to relieve pressure or bleeding in the brain and prevent further damage. Of the total participants, 6.8% of the antidepressant users and 8.6% of those who did not use antidepressants needed emergency brain surgery. After adjustments, antidepressant users had an 11% lower risk. The amount of time in the hospital was the same for both groups.

“These findings provide reassurance for people who take antidepressants that antidepressant use does not appear to worsen early recovery after traumatic brain injury,” said Posti. “Future studies should examine whether these results hold true for long-term recovery and across different health care settings.”

A limitation of the study was that it was conducted only at hospitals and health care centres in Finland, so results may vary in other areas. The study was supported by the Finnish government, the Paulo Foundation, Paavo Nurmi Foundation, Research Council of Finland, Sigrid Jusélius Foundation and Finnish Foundation for Cardiovascular Research.

Source: American Academy of Neurology

The Brain Can Boost Vaccine Effectiveness, New Study Suggests

Photo by National Cancer Institute

New research reveals that activating the brain’s reward system through positive anticipation strengthens the immune response and increases antibody production

Can positive anticipation that activates the brain’s reward system strengthen the body’s immune defences? A new study by Tel Aviv University, the Technion, and Tel Aviv Medical Center (Ichilov), published in the prestigious journal Nature Medicine, provides the first evidence in humans that brain activity associated with the expectation of reward has a measurable effect on the body’s response to a specific vaccine.

Training the Brain’s Reward System

The study was conducted through a collaboration between two research groups: the laboratory of Prof Talma Hendler, from the School of Psychological Sciences and the Gray Faculty of Medical and Health Sciences; and the laboratory of Prof Asya Rolls from The George S. Wise Faculty of Life Sciences.

Eighty-five healthy volunteers participated in the experiment. Some underwent special brain training using fMRI neurofeedback technology – a method that enables individuals to learn, in real time, to regulate activity in specific brain regions through reinforcement learning. The aim of the brain training was to increase activity in the ventral tegmental area (VTA), a key region of the brain’s reward system responsible for dopamine release during mental activity related to the expectation of positive outcomes and the motivation to obtain rewards. Participants were instructed to modulate their brain activity using various mental strategies (eg, thoughts, feelings, memories) while receiving positive feedback about the strategies that succeeded in regulating their brain activity.

From Brain Activation to Antibodies

Immediately after completing the brain training, all participants received a hepatitis B vaccine. The researchers then tracked the immune response through a series of blood tests, measuring levels of specific antibodies produced following the vaccination.

The results showed that participants who succeeded in significantly increasing activity in the brain’s reward region also demonstrated a greater increase in antibody levels after vaccination. The association was specific to the VTA and was not observed in other brain regions used for control purposes (such as the hippocampus), nor in other reward-system areas linked to different reward-related experiences such as pleasure and satisfaction. In other words, the effect was both anatomically and mentally specific.

The Role of Positive Anticipation

Furthermore, an in-depth analysis of the mental strategies participants used during training of the VTA (and not other regions) revealed that those who focused on positive anticipation – such as belief in a good outcome, or the expectation that something positive was about to happen – were able to maintain higher VTA activity over time, which in turn was associated with a better immune response. In other words, the researchers identified a link between reward-system brain activity, a mental state of positive anticipation, and the body’s response to an immune challenge.

According to the research team, this is not “positive thinking” in the popular sense or a New Age slogan, but a measurable neurobiological mechanism – related, among other things, to the well-known placebo effect in medicine (a therapeutic response beyond a specific medical intervention). “We show that mental states have a clear brain signature, and that this signature can influence physiological systems such as the immune system,” explain the researchers.

While the study does not propose a substitute for vaccines or medical treatment, it opens the door to new, noninvasive approaches that may one day strengthen immune responses, improve the effectiveness of medical treatments, and even contribute to fields such as immunotherapy and the treatment of chronic immune pathologies. The researchers note that the study’s findings underscore a broader message: the mind–body connection is not merely a theoretical concept, but a real biological process that can be measured, trained, and potentially harnessed to promote better health.

Implications for Medicine and Health

The research team adds that the findings highlight the potential inherent in integrating neuroscience, psychology, and medicine. “Our study shows that the brain is not only a system that responds to the body’s state of health, but also an active player that influences it,” say the researchers. “The ability to consciously activate brain mechanisms associated with positive anticipation opens a new avenue for research and future treatments – as a complement to existing medicine, not as a replacement. In the future, it may be possible to develop simple, noninvasive tools to help strengthen immune responses and enhance the effectiveness of medical treatments by relying on the brain’s natural capacity to influence the body. However, it is important to emphasise that activation of the reward system and its effect on immune response vary between individuals. Therefore, this approach cannot replace existing medical treatments, but may well serve as an additional supportive component.”

Source: Tel Aviv University

New Neurosurgical Classification Reveals Pivotal Role of Glioma Volume Reduction

International team develops system for a standardised assessment of operative success in treating certain brain tumours

Photo by cottonbro studio

Low-grade brain tumours known as IDH-mutant gliomas CNS WHO grade 2 are life-threatening despite their slow growth. Neurosurgeons across the globe face the question of how to strike the correct balance between a “radical” tumour resection and avoiding further neurological damage. An international research team from the RANO working group involving Friedrich-Alexander-Universität Erlangen-Nürnberg (FAU) and Uniklinikum Erlangen has developed a new classification that records the extent to which any residual tumour tissue influences the progression of the disease. The results were published in The Lancet Oncology.

As a rule, the initial treatment for an IDH-mutated glioma CNS WHO grade 2 is surgery. The aim is to remove as much of the tumour as possible without jeopardising important neurological functions. As the results of the operation only become apparent many years later, there has been a lack of clear data, which has led to a number of different approaches. “On the one hand, this is due to the fact that we must be very careful to weigh up the chances of potentially boosting a patient’s chance of survival against avoiding neurological deficits. On the other hand, there has been a lack of clear criteria for assessing the risk of surgery until now, meaning that recommendations for treatment range from taking as little tissue as possible for diagnostic purposes to removing as much tumour tissue as possible,” explains Prof Dr Oliver Schnell from Uniklinikum Erlangen.

New basis for assessing success of surgery

In order to standardise therapeutic decisions, the RANO working group has conducted a large international study and assessed the data of 1391 patients from 16 neuro-oncological specialist units.

Based on the comprehensive data collected, the new RANO classification categorises the extent of the surgery based on the volume of the tumour that remains visible in a special MRI sequence (T2-FLAIR) after the operation. “Until now, there was no common language available for describing surgical outcomes,” explains PD Dr Philipp Karschnia from Uniklinikum Erlangen. “The new classification provides clarity, as it is guided exclusively by the residual tumour tissue.”

Less residual tumour means longer survival

The analysis of the RANO working group shows: a low volume of residual tumour after the initial operation is one of the most important factors for the further progression of the disease. A positive effect was also demonstrated for removing as much of the tumour as possible in the case of oligodendrogliomas, which tend to have a more favourable progression and are highly sensitive to chemotherapy and radiation therapy. “We were surprised to discover that even follow-up treatments such as chemotherapy or radiation therapy were not able to replace the influence of the operation,” admits PD Dr Karschnia.

Internationally verified and useful in a wide range of scenarios

The results were confirmed in an independent patient group at the University of California in San Francisco. The new classification supports surgeons in making more accurate decisions and paves the way for future studies: “The new RANO classification is a milestone that will make a significant impact on neuro-oncological research and care in the long term,” according to Prof Schnell.

The Response Assessment in Neuro-Oncology (RANO) Working Group is an international, multidisciplinary collaboration between experts from various disciplines who have been working together to develop standardised criteria for assessing brain tumours for more than a decade now. Experts involved in the study from Erlangen were Prof Dr Oliver Schnell and PD Dr Philipp Karschnia, who has been leading the surgical focus group of the RANO Working Group since 2024, Dr Nico Teske and Alfred Gramelt from the Department of Neurosurgery at Uniklinikum Erlangen.

Source: Friedrich-Alexander-Universität Erlangen-Nürnberg

Electrotherapy may be a Promising New Glioblastoma Treatment

Photo by Anna Shvets

Electrotherapy using injectable nanoparticles delivered directly into the tumour could pave the way for new treatment options for glioblastoma, according to a new study from Lund University in Sweden.

Glioblastoma is the most common and most aggressive form of brain tumour among adults. Even with intensive treatment, the average survival period is 15 months. The tumour has a high genetic variation with multiple mutations, which often makes it resistant to radiation therapy, chemotherapy and many targeted drugs. The prognosis for glioblastoma has not improved over the past few decades despite extensive research.

Electrotherapy – a new treatment method

Electrotherapy offers another strategy to combat solid tumours. Short, strong electric pulses (irreversible electroporation) create permanent pores in the cancer cells, leading to their death, while the body’s immune system is simultaneously stimulated. The problem is that surgery is required to place the stiff metal electrodes needed for the treatment. In sensitive tissue such as the brain, this often entails a very difficult procedure, which has led to strict criteria regarding which patients can be treated. Johan Bengzon is a glioblastoma researcher and adjunct professor at Lund University, and a consultant in neurosurgery at Skåne University Hospital. He regularly treats patients with glioblastoma and is frustrated by the limited treatment options.

“The short distance between the hospital and the University in Lund facilitates cooperation and that’s why I contacted research colleagues to find out if injectable electrodes could be an alternative solution in electrotherapy,” says Johan Bengzon.

No sooner said than done. The research team, with Amit Singh Yadav, Martin Hjort, and Roger Olsson at the helm, had previously used nanoparticles to form injectable, electrically conductive hydrogels to control brain signalling and heart contractions. It is a minimally invasive method in which the particles are injected through a thin syringe directly into the body. The particles break down after the treatment and thus do not need to be surgically removed. Perhaps the same technology could be used to destroy tumour cells in glioblastoma.

“After surgical treatment, unfortunately the glioblastoma tumour often returns on the outer edge of the area operated on. By drop casting the nanoparticles into the tumour cavity after an operation, we could electrify the edges while the immune system is also activated. In animal models the procedure, due to this irreversible electroporation, led to tumours being wiped out within three days,” says Roger Olsson, professor of chemical biology and drug development at Lund University, who led the study. 

Promising results – but a long way to the patient

The prospects are good and the researchers are very hopeful for the future, even though there is a long way to go before it becomes a clinical reality. The challenge is now to test the method on larger tumours. 

“We have seen that the electrode is well received in the brain. We have not noted any problems relating to side effects and after 12 weeks the electrode disappeared by itself as it’s biodegradable. The technology combines direct tumour destruction with activation of the immune system and can be an important step towards more effective treatment of glioblastoma,” concludes Amit Singh Yadav, researcher at Lund University and first author of the study. 

Source: Lund University

New Neural Maps Challenge Traditional Descriptions of the Brain

AI image of neurons created by Gencraft

For more than a century, maps of the brain have been based on how brain tissue looks under the microscope. These anatomical maps divide the brain into regions according to structural variations in the tissue. But do these divisions really reflect how the brain works? A new study on mice from Karolinska Institutet, published in Nature Neuroscience, suggests that this is often not the case.

By describing the brain in terms of electrical activity of its neurons, the researchers have found a new way to understand the functional organisation of the prefrontal cortex, the brain region responsible for planning, decision-making, and other advanced cognitive functions. 

“Considering that deviations in prefrontal cortex function have been linked to virtually all psychiatric disorders, it is surprising how little is known about how this region actually works,” says Marie Carlén, Professor at the Department of Neuroscience at Karolinska Institutet.

Did not align with previous maps

Her research group recorded and analysed the activity of more than 24 000 neurons in awake mice and created the first activity-based maps of the prefrontal cortex. The maps of spontaneous and cognition-related neuron activity did not match the traditional, tissue-based maps.

“Our findings challenge the traditional way of defining brain regions and have major implications for understanding brain organisation overall,” says Marie Carlén.

The researchers found that the activity patterns of neurons reflected the hierarchy of information flow in the brain rather than the structure of the tissue. Neurons with slow, regular activity turned out to be characteristic of the prefrontal cortex, which sits at the top of this hierarchy. The same activity pattern also marked regions at the top of the prefrontal cortex’s own internal hierarchy. Slow, regular activity is thought to characterise the integration of information flows, a process that is central to cognitive functions such as planning and reasoning. 

Different neuronal activity patterns work together

Carlén and her colleagues discovered that neurons involved in decision-making were concentrated in regions high up in the prefrontal hierarchy. Surprisingly, these neurons were characterised by very fast activity patterns. 

“This suggests that cognitive processes rely on local collaboration between neurons whose activity patterns complement one another. Some neurons appear to specialise in integrating information streams, while others have high spontaneous activity that supports quick and flexible encoding of information, for instance, information needed to make a specific decision,” says Marie Carlén.

Source: Karolinska Institutet

Full-fat Cheese Linked to a Reduced Dementia Risk

Photo by David Foodphototasty on Unsplash

Eating cheese and cream with a high fat content may be linked to a lower risk of developing dementia. This is shown by a new large-scale study from Lund University. The researchers analysed the dietary habits of more than 27 000 people and linked these to the occurrence of dementia over a follow-up period of up to 25 years.

The debate about low-fat diets has long shaped our health advice and influenced how we view food and health. For several decades, fear of saturated fat and its link to cardiovascular disease has dominated. The MIND diet1 is a diet developed with the aim of reducing the risk of dementia. The diet includes protective foods such as vegetables, nuts, fruits, berries, whole grains, and fish, while cheese is one of the foods that should be limited.

Emily Sonestedt, researcher in nutritional epidemiology at Lund University in Sweden, and her colleagues therefore wanted to investigate whether there was any link between dairy products and dementia. They collected dietary data from 27 670 people in the Malmö Diet and Cancer population study, in which participants reported on their dietary and cooking habits. The average age at the start of the study was 58, and the participants were followed for an average of 25 years, during which time 3208 people developed dementia. The dementia diagnoses were obtained from the Swedish patient registry. For cases diagnosed up to 2014, additional validation studies were conducted in which dementia specialists reviewed medical records, brain scans, and cognitive test results.

After adjusting for lifestyle factors such as physical activity, diet, smoking, and alcohol consumption, the researchers found that people who ate 50 grams of cheese (with more than 20 percent fat) daily had a 13 percent lower risk of developing dementia than those who ate less than 15 grams daily. Fifty grams is equivalent to about five regular slices of cheese. In total, about a quarter of the participants ate 50 grams or more daily.

“When we went on to look at specific types of dementia, we found that there was a 29 percent lower risk of vascular dementia in people who ate more full-fat cheese. We also saw a lower risk of Alzheimer’s disease, but only among those who did not carry the APOE e4 gene variant – a genetic risk factor for Alzheimer’s disease.”

The researchers also investigated the link between high-fat cream (30-40 percent fat) and dementia. People who consumed 20 grams or more daily had a 16 percent lower risk of dementia than those who did not consume any at all. 

The results of the cheese studies support the link between vascular health and brain health.

“The updated dietary guidelines in Sweden from this year say that we can eat dairy products every day, preferably fermented varieties such as yogurt or kefir. Both we and other researchers have found in observational studies that fermented dairy products in particular may be linked to a slightly reduced risk of cardiovascular disease 2,” says Emily Sonestedt.

In previous studies3, the research team has seen links to vascular health, with cheese and fermented dairy products in particular protecting against cardiovascular disease. 

“Although higher-fat cheese and cream were associated with a reduced risk of dementia, other dairy products and low-fat alternatives did not show the same effect. Therefore, not all dairy products are equal when it comes to brain health. The few studies that have investigated this have found a correlation with cheese, so more research is needed to confirm our results and investigate whether certain high-fat dairy products really do provide some protection for the brain.”

Source: Lund University


  1. The MIND diet stands for Mediterranean–DASH Intervention for Neurodegenerative Delay – a combination of the Mediterranean diet and the DASH diet. DASH (Dietary Approaches to Stop Hypertension) is a diet developed primarily to lower high blood pressure and improve cardiovascular health.
  2. Milk and dairy products – a scoping review for Nordic Nutrition Recommendations 2023
  3. Previous publications: 
    High-fat and low-fat fermented milk and cheese intake, proteomic signatures, and risk of all-cause and cause-specific mortality
    High consumption of dairy products and risk of major adverse coronary events and stroke in a Swedish population
    Dairy products and its association with incidence of cardiovascular disease: the Malmö diet and cancer cohort
    Dairy Consumption, Lactase Persistence, and Mortality Risk in a Cohort From Southern Sweden

Switching Memories On and Off with Epigenetics

Photo by Laura Louise Grimsley on Unsplash

Our experiences leave traces in the brain, stored in small groups of cells called “engrams”. Engrams are thought to hold the information of a memory and to be reactivated when we remember, which makes them of great interest for research on memory and on age- or trauma-related memory loss.

At the same time, scientists know that the biology of learning is accompanied by epigenetic changes, which refers to the ways the cell regulates genes by adding chemical “post-it notes” on DNA.

But whether the epigenetic state of a single gene can in turn cause a memory to change has thus far remained unanswered.

A team led by Professor Johannes Gräff at EPFL’s Laboratory of Neuroepigenetics combined CRISPR-based gene control with a technique that tags engram cells in mice. They focused on Arc, a gene that helps neurons adjust their connections to other neurons. By targeting the control region of Arc, the team asked whether flipping its epigenetic “switch” could directly change memory. They published their findings in Nature Genetics.

An “epigenetic switch”

The researchers developed specialised, CRISPR-based tools that could either dial down or boost Arc activity in memory neurons. Some, like the KRAB-MeCP2 tool, were designed to switch off gene activity by adding repressive marks that make the DNA less accessible, while others opened the DNA and turned the gene on. These tools were essentially an “epigenetic switch” for the Arc gene.

They then used harmless viruses to deliver these tools directly into the hippocampus of mice, a brain region central to storing and retrieving memories. The mice were then trained to link a specific place with a mild foot shock. By changing the epigenetic state of Arc in the neurons, the scientists could see whether the animals remembered the shock or not. They also added a “safety switch” that could undo the editing and reset the memory state.

The study showed that epigenetically silencing Arc in engram cells prevented the mice from learning, while boosting it strengthened their memory. These changes could be reversed in the same animal, showing that this epigenetic “switch” can dial memory expression up or down. Even memories that were already several days old, which are usually hard to change, could be modified. On the molecular level, the editing caused changes in gene activity and DNA packaging that matched the behavioural effects.

Controlling memory expression

The study is the first direct demonstration that changing the epigenetic state in memory cells is necessary and sufficient to control memory expression. It points to new ways of exploring how memories are stored and altered, which could eventually also be relevant in humans.

In the future, similar approaches could help researchers better understand conditions where memory processing goes awry, such as traumatic memories in PTSD, drug-related memories in addiction, or the memory problems that appear in neurodegenerative diseases.

Source: EPFL

Microbes May Hold the Key to the Brain’s Evolution

First-of-its-kind study offers evidence that microbes from different primate species influence physiology in ways linked to brain size and function

Source: Pixabay

Humans have the largest relative brain size of any primate, but little is known about how mammals with larger brains evolved to meet the intense energy demands required to support brain growth and maintenance.

A new study from Northwestern University provides the first empirical data showing the direct role the gut microbiome plays in shaping differences in the way the brain functions across different primate species.

“Our study shows that microbes are acting on traits that are relevant to our understanding of evolution, and particularly the evolution of human brains,” said Katie Amato, associate professor of biological anthropology and principal investigator of the study, which was published in PNAS.

The study builds upon previous findings from Amato’s lab that showed the microbes of larger-brained primates, when introduced into host mice, produced more metabolic energy in the microbiome of the host – a prerequisite for larger brains, which are energetically costly to develop and maintain. This time, the researchers wanted to look at the brain itself to see if the microbes from different primates with different relative brain sizes would change how the brains of host mice functioned.

What they found

In a controlled lab experiment, the researchers implanted gut microbes from two large-brain primate species (human and squirrel monkey) and one small-brain primate species (macaque) into microbe-free mice.  

Within eight weeks of altering the hosts’ microbiomes, the researchers observed that the brains of mice with microbes from small-brain primates were indeed working differently from the brains of mice with microbes from large-brain primates.

In the mice with large-brain primate microbes, the researchers found increased expression of genes associated with energy production and synaptic plasticity, the physical process of learning in the brain. In the mice with smaller-brain primate microbes, these genes were expressed less.

“What was super interesting is we were able to compare data we had from the brains of the host mice with data from actual macaque and human brains, and to our surprise, many of the patterns we saw in brain gene expression of the mice were the same patterns seen in the actual primates themselves,” Amato said. “In other words, we were able to make the brains of mice look like the brains of the actual primates the microbes came from.”

Another surprising discovery was a pattern of gene expression associated with ADHD, schizophrenia, bipolar disorder and autism in the mice carrying microbes from smaller-brained primates.

While existing evidence shows correlations between conditions like autism and the composition of the gut microbiome, there is little data showing that gut microbes actually contribute to these conditions.

“This study provides more evidence that microbes may causally contribute to these disorders —specifically, the gut microbiome is shaping brain function during development,” Amato said. “Based on our findings, we can speculate that if the human brain is exposed to the actions of the ‘wrong’ microbes, its development will change, and we will see symptoms of these disorders, i.e., if you don’t get exposed to the ‘right’ human microbes in early life, your brain will work differently, and this may lead to symptoms of these conditions.” 

Implications and next steps

Amato sees clinical implications for further exploration of the origins of some psychological disorders and for taking an evolutionary perspective on the way microbes affect brain physiology.

“It’s interesting to think about brain development in species and individuals and investigating whether we can look at cross-sectional, cross-species differences in patterns and discover rules for the way microbes are interacting with the brain, and whether the rules can be translated into development as well.”

“Primate gut microbiota induce evolutionarily salient changes in mouse neurodevelopment” was published by the Proceedings of the National Academy of Sciences on Jan. 5.

Source: Northwestern University