New research published in Neurology may explain why migraine attacks are more common during menstruation. The researchers found that, in female migraine sufferers, levels of calcitonin gene-related peptide (CGRP), a protein that plays a key role in starting the migraine process, fluctuate along with oestrogen levels.
“This elevated level of CGRP following hormonal fluctuations could help to explain why migraine attacks are more likely during menstruation and why migraine attacks gradually decline after menopause,” said study author Bianca Raffaelli, MD, of Charité – Universitätsmedizin Berlin. “These results need to be confirmed with larger studies, but we’re hopeful that they will help us better understand the migraine process.”
The matched cohort study involved three groups of female participants with episodic migraine, all with at least three days with migraine in the month before the study: those with a regular menstrual cycle, those taking oral contraceptives, and those who had gone through menopause. Each group had 30 participants, each age-matched to a woman without a history of migraine, for a total of 180 participants.
Researchers collected blood and tear fluid to determine CGRP levels. In those with regular menstrual cycles, the samples were taken during menstruation when oestrogen levels are low and around the time of ovulation, when levels are the highest. In those taking oral contraceptives, samples were taken during the hormone-free time and the hormone-intake time. Samples were taken once from postmenopausal participants at a random time.
The study found that female participants with migraine and a regular menstrual cycle had higher CGRP concentrations during menstruation than those without migraine. Those with migraine had blood levels of 5.95 picograms per millilitre (pg/ml) compared with 4.61 pg/ml for those without migraine. For tear fluid, those with migraine had 1.20 nanograms per millilitre (ng/ml) compared with 0.40 ng/ml for those without migraine.
In contrast, among those taking oral contraceptives or those who were postmenopausal, CGRP levels were similar in the migraine and non-migraine groups.
“The study also suggests that measuring CGRP levels through tear fluid is feasible and warrants further investigation, as accurate measurement in the blood is challenging due to its very short half-life,” Raffaelli said. “This method is still exploratory, but it is non-invasive.”
Raffaelli noted that while hormone levels were taken around the time of ovulation, they may not have been taken exactly on the day of ovulation, so the fluctuations in oestrogen levels may not be fully reflected.
In patients with Huntington’s disease, neurons in a part of the brain called the striatum are some of the worst affected. Degeneration of these neurons contributes to patients’ loss of motor control, which is one of the major hallmarks of the disease.
Neuroscientists at MIT have now shown that two distinct cell populations in the striatum are affected differently by Huntington’s disease. Reporting their results in Nature Communications, they believe that neurodegeneration of one of these populations leads to motor impairments, while damage to the other population, located in structures called striosomes, may explain the mood disorders that are often seen in the early stages of the disease.
“As many as 10 years ahead of the motor diagnosis, Huntington’s patients can experience mood disorders, and one possibility is that the striosomes might be involved in these,” says Ann Graybiel, an MIT Institute Professor and one of the senior authors of the study.
Using single-cell RNA sequencing to analyse the genes expressed in mouse models of Huntington’s disease and postmortem brain samples from Huntington’s patients, the researchers found that cells of the striosomes and another structure, the matrix, begin to lose their distinguishing features as the disease progresses. The researchers hope that their mapping of the striatum and how it is affected by Huntington’s could help lead to new treatments that target specific cells within the brain.
This kind of analysis could also shed light on other brain disorders that affect the striatum, such as Parkinson’s disease and autism spectrum disorder, the researchers say.
Neuron vulnerability
Huntington’s disease leads to degeneration of brain structures called the basal ganglia, which are responsible for control of movement and also play roles in other behaviors, as well as emotions. For many years, Graybiel has been studying the striatum, a part of the basal ganglia that is involved in making decisions that require evaluating the outcomes of a particular action.
Many years ago, Graybiel discovered that the striatum is divided into striosomes, which are clusters of neurons, and the matrix, which surrounds the striosomes. She has also shown that striosomes are necessary for making decisions that require an anxiety-provoking cost-benefit analysis.
In a 2007 study, Richard Faull of the University of Auckland discovered that in postmortem brain tissue from Huntington’s patients, the striosomes showed a great deal of degeneration. Faull also found that while those patients were alive, many of them had shown signs of mood disorders such as depression before their motor symptoms developed.
To further explore the connections between the striatum and the mood and motor effects of Huntington’s, Graybiel teamed up with Kellis and Heiman to study the gene expression patterns of striosomal and matrix cells. To do that, the researchers used single-cell RNA sequencing to analyze human brain samples and brain tissue from two mouse models of Huntington’s disease.
Within the striatum, neurons can be classified as either D1 or D2 neurons. D1 neurons are involved in the “go” pathway, which initiates an action, and D2 neurons are part of the “no-go” pathway, which suppresses an action. D1 and D2 neurons can both be found within either the striosomes or the matrix.
The analysis of RNA expression in each of these types of cells revealed that striosomal neurons are harder hit by Huntington’s than matrix neurons. Furthermore, within the striosomes, D2 neurons are more vulnerable than D1.
The researchers also found that these four major cell types begin to lose their distinct molecular identities and become more difficult to distinguish from one another as Huntington’s disease progresses. “Overall, the distinction between striosomes and matrix becomes really blurry,” Graybiel says.
Striosomal disorders
The findings suggest that damage to the striosomes, which are known to be involved in regulating mood, may be responsible for the mood disorders that strike Huntington’s patients in the early stages of the disease. Later on, degeneration of the matrix neurons likely contributes to the decline of motor function, the researchers say.
In future work, the researchers hope to explore how degeneration or abnormal gene expression in the striosomes may contribute to other brain disorders.
Previous research has shown that overactivity of striosomes can lead to the development of repetitive behaviors such as those seen in autism, obsessive compulsive disorder, and Tourette’s syndrome. In this study, at least one of the genes the researchers found to be overexpressed in the striosomes of Huntington’s brains is also linked to autism.
Additionally, many striosome neurons project to the part of the brain that is most affected by Parkinson’s disease (the substantia nigra, which produces most of the brain’s dopamine).
“There are many, many disorders that probably involve the striatum, and now, partly through transcriptomics, we’re working to understand how all of this could fit together,” Graybiel says.
A study published in PLoS ONE has confirmed the role of the corpus callosum in language lateralisation, ie the distribution of language processing functions between the brain’s hemispheres. To reach this finding, the researchers applied advanced neuroimaging methods to subjects performing an innovative language task.
Functional asymmetry between the two cerebral hemispheres in performing higher-level cognitive functions is a major characteristic of the human brain. For example, the left hemisphere plays a leading role in language processing in most people. However, between 10% and 15% of people also use the right hemisphere to varying degrees for the same task.
Traditionally, language lateralisation to the right hemisphere was explained by handedness, as it is mainly found in left-handed and ambidextrous (using both hands equally well) individuals. But recent research has demonstrated a genetic difference in the way language is processed by left-handed and ambidextrous people. In addition to this, some right-handed people also involve their right hemisphere in language functions.
These findings prompted the scientists to consider alternative explanations, especially by looking at brain anatomy to find out why language functions can shift to the right hemisphere. Researchers at the HSE Centre for Language and Brain hypothesised that language lateralisation may have something to do with the anatomy of the corpus callosum, the largest commissural tract in the human brain connecting the two cerebral hemispheres.
Image: tractography of the corpus callosum (Do Tromp et al. / brainimaging.waisman.wisc.edu)
The researchers asked 50 study participants to perform a sentence completion task. The subjects were instructed to read aloud a visually presented Russian sentence and to complete it with an appropriate final word (eg ‘Teper’ ministr podpisyvaet vazhnoe…‘ – ‘Now the minister is signing an important …’). At the same time, the participants’ brain activity was recorded using functional magnetic resonance imaging (fMRI). Additionally, the volume of the corpus callosum was measured in each subject.
A comparison between the fMRI data and the corpus callosum measurements revealed that the larger the latter’s volume, the less lateralisation of the language function to the right hemisphere was observed.
When processing language, the brain tends to use the left hemisphere’s resources efficiently and the corpus callosum suppresses any additional involvement of the right hemisphere. The larger a person’s corpus callosum, the less involved their right hemisphere is in language processing (and vice versa). This finding is consistent with the inhibitory model suggesting that the corpus callosum inhibits the action of one hemisphere while the other is engaged in cognitive tasks.
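Lateralisation of this kind is often summarised with a laterality index, LI = (L − R) / (L + R), computed over left- and right-hemisphere activation (for example, counts of activated voxels). The snippet below is a minimal sketch with invented numbers; the study’s own index and measurements are not specified here.

```python
def laterality_index(left_activation, right_activation):
    """+1 = fully left-lateralised language activity, -1 = fully right."""
    return (left_activation - right_activation) / (left_activation + right_activation)

# Illustrative activation counts (not the study's data): under the inhibitory
# model, a larger corpus callosum should push the index towards +1.
print(laterality_index(70, 30))  # strongly left-lateralised
print(laterality_index(55, 45))  # more bilateral involvement
```

A value near zero indicates roughly symmetric involvement of the two hemispheres, which is what the study links to smaller corpus callosum volume.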
The study’s innovative design and use of advanced neuroimaging have made this conclusion possible. Brain lateralisation in language processing is usually hard to measure accurately, as typical speech tasks used in earlier studies (eg image naming, selecting words that begin with a certain letter or listening to speech) tend to cause activation only in some parts of the brain responsible for language functions but not in others. Instead, the researchers developed a unique speech task for fMRI: sentence completion, which reliably activates all language areas of the brain.
The researchers reconstructed the volume and properties of the corpus callosum from MRI data using an advanced tractography technique: constrained spherical deconvolution (CSD). This is more suitable than traditional diffusion tensor imaging for modelling crossing fibres in the smallest unit of volume, the voxel (3D pixel), and is therefore more reliable.
Researchers have found that compounds in the edible Lion’s Mane mushroom (Hericium erinaceus), already used in herbal medicine for stomach complaints, can promote nerve growth and boost memory. Their findings are published in The Journal of Neurochemistry.
The compounds are neurotrophins – a family of proteins associated with the growth, functioning and maintenance of neurons. In mammals, brain-derived neurotrophic factor (BDNF) is highly expressed in central nervous system neurons. Because BDNF pathway impairment is associated with several diseases, including schizophrenia, Alzheimer’s disease, Rett syndrome, and Huntington’s disease, experimental treatments for neurological and neurodegenerative disorders often target neurotrophins.
So far, such treatments have run into problems with crossing the blood-brain barrier and with off-target effects. H. erinaceus is rich in neurotrophins; it has traditionally been used in herbal medicine to treat stomach complaints and cancer, and is also known for promoting peripheral nerve regeneration.
Two of the biologically active types of compounds, hericenones and erinacines, can successfully cross the blood-brain barrier and confer neuroprotective effects.
The researchers tested crude and purified H. erinaceus extracts and found that they exhibited BDNF-like neurotrophic activity in both in vitro cultured hippocampal neurons and in mouse models of hippocampal memory. The extracts also promoted neurite outgrowth and improved memory.
They concluded that Hericene A acts through a novel signalling pathway, giving rise to improved cognitive performance. These findings however will need to be validated by future research.
When psychedelic drugs are used to treat mental illness, location is key to how they rapidly rebuild connections between nerve cells. In their paper published in Science, scientists show that engaging serotonin 2A receptors inside neurons promotes growth of new connections – but engaging the same receptor on the surface of nerve cells does not.
The findings will help guide efforts to discover new drugs for depression, PTSD and other disorders, according to senior author David E. Olson, associate professor at the University of California, Davis.
Drugs such as LSD, MDMA and psilocybin show great promise for treating a wide range of mental disorders that are characterised by a loss of neural connections. In laboratory studies, a single dose of these drugs can cause rapid growth of new dendrites from nerve cells, and formation of new spines on those dendrites.
Olson calls this group of drugs “psychoplastogens” because of their ability to regrow and remodel connections in the brain.
Earlier work from Olson’s and other labs showed that psychedelic drugs work by engaging the serotonin 2A receptor (5-HT2AR). But other drugs that engage the same receptor, including serotonin, do not have the same growth effects.
Maxemiliano Vargas, a graduate student in Olson’s lab, together with Olson and colleagues, experimented with chemically tweaking drugs and using transporters to make it easier or harder for compounds to slip across cell membranes. Serotonin itself is polar, meaning it dissolves well in water but does not easily cross the lipid membranes that surround cells. The psychedelics, on the other hand, are much less polar and can easily enter the interior of a cell.
They found that the growth-promoting ability of compounds was correlated with the ability to cross cell membranes.
Drug receptors are usually thought of as being located on the cell membrane, facing out. But the researchers found that in nerve cells, serotonin 2A receptors were concentrated inside cells, mostly around a structure called the Golgi body, with some receptors on the cell surface. Other types of signalling receptors in the same class were on the surface.
The results show that there is a location bias in how these drugs work, Olson said. Engaging the serotonin 2A receptor when it is inside a cell produces a different effect from triggering it when it is on the outside.
“It gives us deeper mechanistic insight into how the receptor promotes plasticity, and allows us to design better drugs,” Olson said.
Most people remember emotional events, like their wedding day, very clearly, but researchers are not sure how the human brain prioritises emotional events in memory. In a study published in Nature Human Behaviour, Joshua Jacobs, associate professor of biomedical engineering at Columbia Engineering, and his team identified a specific neural mechanism in the human brain that tags information with emotional associations for enhanced memory.
The team demonstrated that high-frequency brain waves in the amygdala, a hub for emotional processes, and the hippocampus, a hub for memory processes, are critical to enhancing memory for emotional stimuli. Disruptions to this neural mechanism, brought on either by electrical brain stimulation or depression, impair memory specifically for emotional stimuli.
Rising prevalence of memory disorders
The rising prevalence of memory disorders such as dementia has highlighted the damaging effects that memory loss has on individuals and society. Disorders such as depression, anxiety, and post-traumatic stress disorder (PTSD) can also feature imbalanced memory processes, a problem made more pressing by the COVID pandemic. Understanding how the brain naturally regulates what information gets prioritised for storage and what fades away could provide critical insight for developing new therapeutic approaches to strengthening memory for those at risk of memory loss, or for normalising memory processes in those at risk of dysregulation.
“It’s easier to remember emotional events, like the birth of your child, than other events from around the same time,” says Salman E. Qasim, lead author of the study, who started this project during his PhD in Jacobs’ lab at Columbia Engineering. “The brain clearly has a natural mechanism for strengthening certain memories, and we wanted to identify it.”
The difficulty of studying neural mechanisms in humans
Most investigations into neural mechanisms take place in animals such as rats, because such studies require direct access to the brain to record brain activity and perform experiments that demonstrate causality, such as careful disruption of neural circuits. But it is difficult to observe or characterise a complex cognitive phenomenon like emotional memory enhancement in animal studies.
To study this process directly in humans, Qasim and Jacobs analysed data from memory experiments conducted with epilepsy patients undergoing direct, intracranial brain recording for seizure localisation and treatment. During these recordings, epilepsy patients memorised lists of words while the electrodes placed in their hippocampus and amygdala recorded the brain’s electrical activity.
Studying brain-wave patterns of emotional words
Qasim found that participants remembered emotionally rated words, such as “dog” or “knife,” better than more neutral words, such as “chair.” Whenever participants successfully remembered emotional words, high-frequency neural activity (30-128 Hz) would become more prevalent in the amygdala-hippocampal circuit, a pattern which was absent when participants remembered more neutral words, or failed to remember a word altogether. Analysing data from 147 participants, they found a clear link between participants’ enhanced memory for emotional words and the prevalence in their brains of high-frequency brain waves across the amygdala-hippocampal circuit.
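“Prevalence of high-frequency activity” amounts to measuring signal power in a frequency band. As a rough, hedged sketch of the idea (the study’s actual analysis pipeline is not described here), the snippet below estimates band power on a synthetic trace with a naive discrete Fourier transform; the sampling rate and signal are invented for illustration.

```python
import math

def band_power(signal, fs, f_lo, f_hi):
    """Summed DFT power over [f_lo, f_hi] Hz; naive O(n^2) DFT for clarity."""
    n = len(signal)
    total = 0.0
    for k in range(1, n // 2):
        freq = k * fs / n
        if f_lo <= freq <= f_hi:
            re = sum(signal[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
            im = sum(signal[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
            total += (re * re + im * im) / n
    return total

fs = 512  # assumed sampling rate in Hz, one second of data below
# Synthetic trace: a 6 Hz theta rhythm plus a weaker 80 Hz high-frequency burst.
trace = [math.sin(2 * math.pi * 6 * i / fs) + 0.3 * math.sin(2 * math.pi * 80 * i / fs)
         for i in range(fs)]
hf = band_power(trace, fs, 30, 128)     # the 30-128 Hz band from the study
theta = band_power(trace, fs, 4, 8)     # the dominant low-frequency rhythm
print(hf < theta)  # the weaker 80 Hz burst carries less power here
```

In practice spectral estimation for intracranial recordings would use optimised methods (FFT-based periodograms or wavelets) rather than a direct DFT, but the quantity being compared across trials is the same.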
“Finding this pattern of brain activity linking emotions and memory was very exciting to us, because prior research has shown how important high-frequency activity in the hippocampus is to non-emotional memory,” said Jacobs. “It immediately cued us to think about the more general, causal implications – if we elicit high-frequency activity in this circuit, using therapeutic interventions, will we be able to strengthen memories at will?”
Electrical stimulation disrupts memory for emotional words
In order to establish whether this high-frequency activity actually reflected a causal mechanism, Jacobs and his team formulated a unique approach to replicate the kind of experimental disruptions typically reserved for animal research. First, they analysed a subset of these patients who had performed the memory task while direct electrical stimulation was applied to the hippocampus for half of the words that participants had to memorise. They found that electrical stimulation, which has a mixed history of either benefiting or diminishing memory depending on its usage, clearly and consistently impaired memory specifically for emotional words.
Uma Mohan, another PhD student in Jacobs’ lab at the time and co-author on the paper, noted that this stimulation also diminished high-frequency activity in the hippocampus. This provided causal evidence that, by knocking out the brain activity pattern correlating with emotional memory, stimulation was also selectively diminishing emotional memory.
Depression acts similarly to brain stimulation
Qasim further hypothesised that depression, which can involve dysregulated emotional memory, might act similarly to brain stimulation. He analysed patients’ emotional memory in parallel with mood assessments the patients took to characterise their psychiatric state. And, in fact, in the subset of patients with depression, the team observed a concurrent decrease in emotion-mediated memory and high-frequency activity in the hippocampus and amygdala.
“By combining stimulation, recording, and psychometric assessment, they were able to demonstrate causality to a degree that you don’t always see in studies with human brain recordings,” said Bradley Lega, a neurosurgeon and scientist at the University of Texas Southwestern Medical Center who was not an author on the paper. “We know high-frequency activity is associated with neuronal firing, so these findings open new avenues of research in humans and animals about how certain stimuli engage neurons in memory circuits.”
Next steps
Qasim is now investigating how individual neurons in the human brain fire during emotional memory processes. Qasim and Jacobs hope that their work might also inspire animal research exploring how this high-frequency activity is linked to norepinephrine, a neurotransmitter linked to attentional processes that they theorise might be behind the enhanced memory for emotional stimuli. They also hope that future research will target high-frequency activity in the amygdala-hippocampal circuit to protect memory.
“Our emotional memories are one of the most critical aspects of the human experience, informing everything from our decisions to our entire personality,” Qasim added. “Any steps we can take to mitigate their loss in memory disorders or prevent their hijacking in psychiatric disorders is hugely exciting.”
The vital role of the ‘love hormone’ oxytocin for social attachments is being called into question. More than 40 years of pharmacological and behavioural research has pointed to oxytocin receptor signalling as an essential pathway for the development of social behaviours in prairie voles, humans, and other species, but a genetic study published in the journal Neuron shows that voles can form enduring attachments with mates and provide parental care without oxytocin receptor signalling.
Prairie voles are one of only a few monogamous mammalian species. After mating, they form lifelong partnerships known as “pair-bonds.” Pair-bonded voles share parental responsibilities, prefer the company of their partner over unknown members of the opposite sex, and actively reject potential new partners. Previous studies that used drugs to block oxytocin from binding to its receptor found that voles were unable to pair-bond when oxytocin signalling was blocked.
Neuroscientists Devanand Manoli of UCSF and Nirao Shah of Stanford University wanted to know whether pair-bonding was really controlled by oxytocin receptor signalling. To test this, they used CRISPR to generate prairie voles that lack functional oxytocin receptors. Then, they tested these mutant oxytocin-receptor-less voles to see whether they could form enduring partnerships with other voles. To their surprise, the mutant voles formed pair-bonds just as readily as normal voles.
“We were all shocked that no matter how many different ways we tried to test this, the voles demonstrated a very robust social attachment with their sexual partner, as strong as their normal counterparts,” says Manoli.
Next, the researchers wondered whether oxytocin receptor signalling is similarly dispensable for its other functions – parturition, parenting (which, in prairie voles, is a shared responsibility between the two parents), and milk release during lactation.
“We found that mutant voles are not only able to give birth, but actually nurse,” says Shah. Both male and female mutants engaged in the usual parental behaviours of huddling, licking, and grooming, and were able to rear pups to weaning age.
However, the mutant prairie voles did have limited milk release compared to normal voles. As a result, fewer of their pups survived to weaning age, and those that did survive were smaller compared to the pups of normal prairie voles. The fact that the voles could nurse at all is in contrast to equivalent studies in oxytocin receptor-deficient mice, who completely failed to lactate or nurse, and whose pups consequently died within a day or so of being born. The authors hypothesise that this species difference could be due to the inbred nature of laboratory mouse strains in contrast to the genetically heterogeneous voles. “It could be that inbreeding in mice has selected for a large dependence on oxytocin signalling, or this may represent a species-specific role of oxytocin receptor signalling,” says Shah.
When asked why their results differ from previously published studies that used drugs to block oxytocin receptor signalling, the authors point to the key difference between genetic and pharmacological studies: precision. “Drugs can be dirty,” says Manoli, “in the sense that they can bind to multiple receptors, and you don’t know which binding action is causing the effect. From a genetics perspective, we now know that the precision of deleting this one receptor, and subsequently eliminating its signalling pathways, does not interfere with these behaviours.”
“For at least the last ten years people have been hoping for the possibility of oxytocin as a powerful therapeutic for helping people with social cognitive impairments due to conditions ranging from autism to schizophrenia,” Manoli says. “This research shows that there likely isn’t a magic bullet for something as complex and nuanced as social behaviour.”
Another key difference is that, whereas most pharmacological studies suppress oxytocin receptor signalling in adult animals, this study switched it off when the voles were embryos. “We’ve made a mutation that starts from before birth,” says Shah. “It could be that there are compensatory or redundant pathways that kick in in these mutant animals and mask the deficits in attachment, parental behaviours, and milk let-down.”
Working with prairie voles presented an obstacle, but one worth overcoming. Because prairie voles are not commonly used in genetic studies like laboratory mice, the team needed to develop all of their molecular tools and protocols from scratch. Now that they have these vole-specific pipelines and tools, the authors are excited about the doorways this opens, both for them and for other researchers.
“We’re very happy to be part of a community and to have this technology that we can share,” says Manoli. “Now we have this trove that we can start to mine. There are so many other questions that prairie voles could be interesting and useful for answering, both in terms of potential clinical implications for models of anxiety or attachment and also for basic comparative biology.”
Observations of the brain when the body is immersed in cold water reveal changes in connectivity between areas which process emotion, which could explain why people often feel more upbeat and alert after swimming outside or taking cold baths.
During a research trial, published in the journal Biology, 33 healthy volunteers were given a functional MRI (fMRI) scan immediately after bathing in cold water. The team included imaging experts from Bournemouth University and University Hospitals Dorset (UHD), and extreme environments researcher, Dr Heather Massey, from the University of Portsmouth.
Dr Massey, from the School of Sport, Health and Exercise Science, said: “It has been a really pleasing experience to work with this interdisciplinary team to develop a method and publish this piece of research that could only be completed by a group with such a diverse skill set.
“With the growing popularity of outdoor swimming and cold water immersion, which many now use to support improved mood, it is long overdue that we study how it may affect us. We know so much about the impact cold water immersion can have on the body, but the brain has had little focus, primarily as it has been more challenging to study. It is only now that technology is developing, can we start to get some insight.”
Dr Ala Yankouskaya, Senior Lecturer in Psychology at Bournemouth University, led the study. She said: “The benefits of cold-water immersion are widely known from previous studies where participants were questioned on how they feel afterwards, but we wanted to see how the shock of going into the cold water actually affects the brain.”
Each participant was given an initial fMRI scan and then immersed in a pool of water at 20°C for five minutes whilst an ECG and respiratory equipment measured their bodies’ physiological responses. After being quickly dried they were given a second fMRI scan so the team could look for any changes in their brains’ activity.
“All tiny parts of the brain are connected to each other in a certain pattern when we carry out activities in our day-to-day lives, so the brain works as a whole,” said Dr Yankouskaya. “After our participants went in the cold water, we saw the physiological effects – such as shivering and heavy breathing. The MRI scans then showed us how the brain rewires its connectivity to help the person cope with the shock.”
Comparing the scans showed that changes had occurred in the connectivity between specific parts of the brain, in particular, the medial prefrontal cortex and the parietal cortex.
“These are the parts of the brain that control our emotions, and help us stay attentive and make decisions,” Dr Yankouskaya said. “So when the participants told us that they felt more alert, excited and generally better after their cold bath, we expected to see changes to the connectivity between those parts. And that is exactly what we found.”
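In fMRI work of this kind, “connectivity” between two regions is commonly summarised as the correlation between their activity time series. The sketch below illustrates that idea with invented numbers for the two regions named above; it is not the study’s analysis or data.

```python
def connectivity(ts_a, ts_b):
    """Pearson correlation between two ROI time series (stdlib only)."""
    n = len(ts_a)
    ma, mb = sum(ts_a) / n, sum(ts_b) / n
    cov = sum((a - ma) * (b - mb) for a, b in zip(ts_a, ts_b))
    sa = sum((a - ma) ** 2 for a in ts_a) ** 0.5
    sb = sum((b - mb) ** 2 for b in ts_b) ** 0.5
    return cov / (sa * sb)

# Hypothetical BOLD signal values for the two regions, before and after
# immersion (illustrative numbers only, not the study's data).
mpfc_pre  = [1.0, 2.1, 0.9, 1.8, 1.1, 2.0]
ppc_pre   = [0.8, 1.5, 1.6, 0.7, 1.4, 0.9]
mpfc_post = [1.0, 2.1, 0.9, 1.8, 1.1, 2.0]
ppc_post  = [1.1, 2.2, 1.0, 1.9, 1.2, 2.1]  # closely tracks mpfc_post
print(connectivity(mpfc_pre, ppc_pre) < connectivity(mpfc_post, ppc_post))
```

Comparing such correlations between the pre- and post-immersion scans is one simple way a change in coupling between the two regions could be quantified.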
The team are now planning to use their findings to understand more about the wiring and interactions between parts of the brain for people with mental health conditions.
“The medial prefrontal cortex and parietal cortex have different wiring when people have conditions such as depression and anxiety,” Dr Yankouskaya explained.
“Learning how cold water can rewire these parts of the brain could help us understand why the connectivity is so different for people with these conditions, and hopefully, in the long-term, lead to alternative treatments,” she concluded.
Regularly eating a high fat/calorie diet could reduce the brain’s ability to regulate calorie intake, according to a study published in The Journal of Physiology. Rat studies revealed a signalling pathway which causes a quick response to high fat/high calorie intake, reducing food and calorie intake. But continuously eating a high fat/calorie diet seems to disrupt this signalling pathway, sabotaging this short-term protection.
Senior author Dr Kirsteen Browning said, “Calorie intake seems to be regulated in the short-term by astrocytes. We found that a brief exposure (three to five days) to a high fat/calorie diet has the greatest effect on astrocytes, triggering the normal signalling pathway to control the stomach. Over time, astrocytes seem to desensitise to the high fat food. After around 10–14 days of eating a high fat/calorie diet, astrocytes seem to fail to react and the brain’s ability to regulate calorie intake seems to be lost. This disrupts the signalling to the stomach and delays how it empties.”
Astrocytes initially react when high fat/calorie food is ingested, triggering the release of gliotransmitters, chemicals (including glutamate and ATP) that excite nerve cells and enable normal signalling pathways to stimulate neurons that control stomach function. This ensures the stomach contracts correctly to fill and empty in response to food passing through the digestive system. When astrocytes are inhibited, the cascade is disrupted. The decrease in signalling chemicals leads to a delay in digestion because the stomach doesn’t fill and empty appropriately.
The rigorous investigation used behavioural observation to monitor food intake in rats fed a control or high fat/calorie diet for one, three, five or 14 days. This was combined with pharmacological and specialist genetic approaches (both in vivo and in vitro) to target distinct neural circuits, enabling the researchers to specifically inhibit astrocytes in a particular region of the brainstem. In this way, they assessed the response of individual neurons.
Human studies will need to be carried out to confirm whether the same mechanism occurs in humans. If it does, further testing will be required to assess whether the mechanism could be safely targeted without disrupting other neural pathways.
The researchers plan to explore the mechanism further. Dr Browning said, “We have yet to find out whether the loss of astrocyte activity and the signalling mechanism is the cause of overeating or whether it occurs in response to overeating. We are eager to find out whether it is possible to reactivate the brain’s apparent lost ability to regulate calorie intake. If this is the case, it could lead to interventions to help restore calorie regulation in humans.”
In the open-access journal PLOS Biology, researchers present the first evidence of 12-hour cycles of gene activity in the human brain. Led by Madeline R. Scott, the study also reveals that some of those 12-hour rhythms are missing or altered in the postmortem brains of patients with schizophrenia.
Schizophrenia patients are known to have disturbances in several types of 24-hour bodily rhythms, including sleep/wake cycles, hormone levels, and gene activity in the prefrontal cortex of the brain. However, virtually nothing is known about gene activity in the brain for cycles that are shorter than the usual 24-hour circadian rhythm. A few years ago, researchers discovered that certain genes in the body were associated with 12-hour bodily rhythms, which may have an origin in the 12-hour cycle of ocean tides.
As it is not possible to measure gene transcript levels in living brains, the new study instead used a time-of-death analysis to search for 12-hour rhythms in gene activity within postmortem brains. They focused on the dorsolateral prefrontal cortex as it is associated with cognitive symptoms and other abnormalities in gene expression rhythms that have been observed in schizophrenia.
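In broad terms, a time-of-death analysis of this kind looks for a sinusoidal relationship between each gene's measured expression level and the donor's time of death. The minimal sketch below illustrates the general idea with simulated data, fitting a 12-hour sinusoid by least squares; the study's actual statistical pipeline is not described here, so the specifics (cosinor-style regression, the amplitude and phase read-out) are assumptions for illustration only.

```python
import numpy as np

# Simulated data standing in for postmortem samples: each donor has a
# time of death (hours, 0-24) and one gene's expression level that
# secretly follows a 12-hour rhythm plus noise. (Illustrative only.)
rng = np.random.default_rng(0)
tod = rng.uniform(0, 24, 200)                       # time of death in hours
expr = 1.5 * np.cos(2 * np.pi * tod / 12 - 1.0) + rng.normal(0, 0.5, 200)

# Cosinor-style fit: expression = a*cos(wt) + b*sin(wt) + c, w = 2*pi/12.
# Least squares recovers a, b, c; amplitude and phase follow from a and b.
X = np.column_stack([
    np.cos(2 * np.pi * tod / 12),
    np.sin(2 * np.pi * tod / 12),
    np.ones_like(tod),
])
coef, *_ = np.linalg.lstsq(X, expr, rcond=None)

amplitude = np.hypot(coef[0], coef[1])   # strength of the 12-hour rhythm
phase = np.arctan2(coef[1], coef[0])     # phase in radians; peak hour = phase * 12 / (2*pi)

print(round(float(amplitude), 2))        # close to the simulated amplitude of 1.5
```

A gene with no 12-hour rhythm would yield an amplitude near zero, so comparing fitted amplitudes (with an appropriate significance test) across many genes is one way such rhythmic transcripts can be screened for.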
Numerous genes in the normal dorsolateral prefrontal cortex were found to have 12-hour rhythms in activity. Among them, gene activity levels related to building connections between neurons peaked in the afternoon/night, while those related to mitochondrial function (and therefore cellular energy supply) peaked in the morning/evening.
In contrast, postmortem brains from patients with schizophrenia contained fewer genes with 12-hour activity cycles, and those related to neural connections were missing entirely. Additionally, although the mitochondria-related genes did maintain a 12-hour rhythm, their activity did not peak at the normal times. Whether these abnormal rhythms underlie the behavioural abnormalities in schizophrenia, or whether they result from medications, nicotine use, or sleep disturbances should be examined in future studies.
Co-author Colleen A. McClung adds: “We find that the human brain has not only circadian (24 hour) rhythms in gene expression but also 12-hour rhythms in a number of genes that are important for cellular function and neuronal maintenance. Many of these gene expression rhythms are lost in people with schizophrenia, and there is a dramatic shift in the timing of rhythms in mitochondrial-related transcripts which could lead to suboptimal mitochondrial function at the times of day when cellular energy is needed the most.”