Category: Neurology

How Spinal Cords can ‘Learn’ without Brain Involvement

In this study, spinal cords that associated limb position with an unpleasant experience learned to reposition the limb after only 10 minutes, and retained a memory the next day. Spinal cords that received random unpleasantness did not learn. Credit: RIKEN

Researchers in Japan have discovered the neural circuitry in the spinal cord that allows brain-independent motor learning. This study by Aya Takeoka at the RIKEN Center for Brain Science and colleagues found two critical groups of spinal cord neurons, one necessary for new adaptive learning, and another for recalling adaptations once they have been learned. The findings, published in Science, could help scientists develop ways to assist motor recovery after spinal cord injury.

It has long been known that motor output from the spinal cord can be adjusted through practice, even without a brain. This has been shown most dramatically in headless insects, whose legs can still be trained to avoid external cues. Until now, no one had figured out exactly how this is possible, and without this understanding, the phenomenon is not much more than a quirky fact. As Takeoka explains, “Gaining insights into the underlying mechanism is essential if we want to understand the foundations of movement automaticity in healthy people and use this knowledge to improve recovery after spinal cord injury.”

Before jumping into the neural circuitry, the researchers first developed an experimental setup that allowed them to study mouse spinal cord adaptation, both learning and recall, without input from the brain. Each test paired an experimental mouse with a control mouse, both with hindlegs dangling freely. If the experimental mouse’s hindleg drooped down too much, it was electrically stimulated, emulating something a mouse would want to avoid. The control mouse received the same stimulation at the same time, but not linked to its own hindleg position.
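
In conditioning terms this is a classic master-yoke design: only the experimental animal’s stimulation is contingent on its own behaviour. A minimal sketch of that contingency logic, with a hypothetical threshold and variable names (the paper’s actual apparatus and parameters are not reproduced here):

```python
# Hypothetical sketch of the master-yoke stimulation contingency.
DROOP_THRESHOLD_DEG = 30.0  # assumed leg-drop angle that triggers stimulation

def stimulation_schedule(experimental_leg_angles):
    """Stimulate whenever the experimental mouse's own leg droops too far.

    The yoked control receives exactly the same schedule, so its shocks are
    uncorrelated with its own leg position; only the experimental animal
    can learn the position-shock contingency.
    """
    return [t for t, angle in enumerate(experimental_leg_angles)
            if angle > DROOP_THRESHOLD_DEG]

# Example: stimulation fires at the time steps where the leg drooped past threshold
print(stimulation_schedule([10.0, 25.0, 35.0, 40.0, 15.0]))  # -> [2, 3]
```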

After just 10 minutes, they observed motor learning only in the experimental mice; their legs remained high up, avoiding any electrical stimulation. This result showed that the spinal cord can associate an unpleasant feeling with leg position and adapt its motor output so that the leg avoids the unpleasant feeling, all without any need for a brain. Twenty-four hours later, they repeated the 10-minute test but reversed the experimental and control mice. The original experimental mice still kept their legs up, indicating that the spinal cord retained a memory of the past experience, which interfered with new learning.

Having thus established both immediate learning and memory in the spinal cord, the team then set out to examine the neural circuitry that makes both possible. They used six types of transgenic mice, each with a different set of spinal neurons disabled, and tested them for motor learning and learning reversal. They found that the mice’s hindlimbs did not adapt to avoid the electrical shocks when neurons toward the top of the spinal cord were disabled, particularly those that express the gene Ptf1a.

When they examined the mice during learning reversal, they found that silencing the Ptf1a-expressing neurons had no effect. Instead, a group of neurons in the ventral part of the spinal cord that express the En1 gene was critical. When these neurons were silenced the day after learning avoidance, the spinal cords acted as if they had never learned anything. The researchers also assessed memory recall on the second day by repeating the initial learning conditions. They found that in wildtype mice, hindlimbs stabilised to reach the avoidance position faster than they did on the first day, indicating recall. Exciting the En1 neurons during recall increased this speed by 80%, indicating enhanced motor recall.

“Not only do these results challenge the prevailing notion that motor learning and memory are solely confined to brain circuits,” says Takeoka, “but we showed that we could manipulate spinal cord motor recall, which has implications for therapies designed to improve recovery after spinal cord damage.”

Source: RIKEN

Spinal Surgeons can Now Monitor their Procedure’s Effects Mid-surgery

Photo by Natanael Melchor on Unsplash

With technology developed at UC Riverside, scientists can, for the first time, make high-resolution images of the human spinal cord during surgery. The advancement could help bring real relief to millions suffering chronic back pain.

The technology, known as fUSI or functional ultrasound imaging, not only enables clinicians to see the spinal cord, but also enables them to map the cord’s response to various treatments in real time. A paper published today in the journal Neuron details how fUSI worked for six people undergoing electrical stimulation for chronic back pain treatment.

“The fUSI scanner is freely mobile across various settings and eliminates the requirement for the extensive infrastructure associated with classical neuroimaging techniques, such as functional magnetic resonance imaging (fMRI),” said Vasileios Christopoulos, assistant professor of bioengineering at UCR who helped develop the technology. “Additionally, it offers ten times the sensitivity for detecting neuroactivation compared to fMRI.”

Until now, it has been difficult to evaluate whether a back pain treatment is working, since patients are under general anaesthesia and unable to provide verbal feedback on their pain levels during treatment. “With ultrasound, we can monitor blood flow changes in the spinal cord induced by the electrical stimulation. This can be an indication that the treatment is working,” Christopoulos said.

The spinal cord is an “unfriendly area” for traditional imaging techniques due to significant motion artifacts, such as heart pulsation and breathing. “These movements introduce unwanted noise into the signal, making the spinal cord an unfavorable target for traditional neuroimaging techniques,” Christopoulos said.

By contrast, fUSI is less sensitive to motion artifacts, using echoes from red blood cells in the area of interest to generate a clear image. “It’s like submarine sonar, which uses sound to navigate and detect objects underwater,” Christopoulos said. “Based on the strength and speed of the echo, they can learn a lot about the objects nearby.”
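
Published fUSI pipelines typically acquire thousands of ultrafast frames and then separate the weak echoes of moving blood from the much stronger, slowly varying tissue signal, often with a singular-value-decomposition (SVD) clutter filter, before computing a power Doppler image. The sketch below illustrates only that general idea, not the Neuron paper’s actual code; the array shape and the `n_clutter` cutoff are assumptions:

```python
import numpy as np

def power_doppler(frames, n_clutter=30):
    """Estimate a power Doppler image from beamformed ultrafast frames.

    frames: array of shape (n_frames, height, width), real or complex.
    The largest singular components capture coherent tissue motion
    (heartbeat, breathing); zeroing them leaves mostly blood echoes.
    n_clutter is an assumed cutoff and would be tuned in practice.
    """
    n_frames, h, w = frames.shape
    casorati = frames.reshape(n_frames, h * w)        # space-time Casorati matrix
    u, s, vh = np.linalg.svd(casorati, full_matrices=False)
    s[:n_clutter] = 0.0                               # discard tissue clutter
    blood = (u * s) @ vh
    return (np.abs(blood) ** 2).mean(axis=0).reshape(h, w)
```

Because the tissue signal is removed in software rather than by assuming a motionless subject, processing of this style may help explain why fUSI tolerates the cardiac and respiratory motion described above.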

Christopoulos partnered with the USC Neurorestoration Center at Keck Hospital to test the technology on six patients with chronic low back pain. These patients were already scheduled for the last-ditch pain surgery, as no other treatments, including drugs, had helped to ease their suffering. For this surgery, clinicians stimulated the spinal cord with electrodes, in the hopes that the voltage would alleviate the patient’s discomfort and improve their quality of life.

“If you bump your hand, instinctively, you rub it. Rubbing increases blood flow, stimulates sensory nerves, and sends a signal to your brain that masks the pain,” Christopoulos said. “We believe spinal cord stimulation may work the same way, but we needed a way to view the activation of the spinal cord induced by the stimulation.”

The Neuron paper details how fUSI can detect blood flow changes at unprecedented levels of less than 1 mm/s. For comparison, fMRI is only able to detect changes of 2 cm/s.

“We have big arteries and smaller branches, the capillaries. They are extremely thin, penetrating your brain and spinal cord and bringing oxygen to places so they can survive,” Christopoulos said. “With fUSI, we can measure these tiny but critical changes in blood flow.”

Generally, this type of surgery has a 50% success rate, which Christopoulos hopes will be dramatically increased with improved monitoring of the blood flow changes. “We needed to know how fast the blood is flowing, how strong, and how long it takes for blood flow to get back to baseline after spinal stimulation. Now, we will have these answers,” Christopoulos said.

Moving forward, the researchers are also hoping to show that fUSI can help optimise treatments for patients who have lost bladder control due to spinal cord injury or age. “We may be able to modulate the spinal cord neurons to improve bladder control,” Christopoulos said.

“With less risk of damage than older methods, fUSI will enable more effective pain treatments that are optimised for individual patients,” Christopoulos said. “It is a very exciting development.”

Source: University of California Riverside

Essential Tremor Increases Cognitive Impairment Risks over Time

Photo by Matthias Zomer on Pexels

Essential tremor, a nervous system disorder that causes rhythmic shaking, is one of the most common movement disorders. A new study published in the Annals of Neurology reveals details on the increased risk of mild cognitive impairment (MCI) and dementia that individuals with essential tremor may face.

The research represents the longest available longitudinal prospective study of rates of MCI and dementia in people with essential tremor. The study enrolled 222 patients, 177 of whom participated in periodic evaluations over an average follow-up of 5 years.

Investigators observed a cumulative prevalence of 26.6% and 18.5% for MCI and dementia, respectively. They also noted a cumulative incidence of 18.2% and 11.2% for MCI and dementia, respectively. Each year, 3.9% of patients with normal cognition “converted” to having MCI, and 12.2% of those with MCI “converted” to having dementia.
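
As a rough consistency check, compounding the reported annual conversion rate over the roughly five-year average follow-up lands close to the reported cumulative incidence. The calculation below assumes a constant annual hazard, which the study itself does not claim:

```python
annual_conversion = 0.039   # 3.9% of cognitively normal patients convert to MCI per year
years = 5                   # average follow-up in the study

cumulative = 1 - (1 - annual_conversion) ** years
print(f"{cumulative:.1%}")  # ~18.0%, in line with the reported 18.2% cumulative incidence
```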

“We know from related research that the presence of cognitive impairment in patients with essential tremor has meaningful clinical consequences. For example, patients with essential tremor who are diagnosed with dementia are more likely to need to use a walker or wheelchair, to employ a home health aide, and to reside in non-independent living arrangements than are patients with essential tremor without dementia,” said corresponding author Elan D. Louis, MD, MS, of the University of Texas Southwestern Medical Center. “With this in mind, the findings of the present study highlight the importance of cognitive screening and monitoring in patients with essential tremor. Early detection of impairment may provide opportunities for interventions that may slow further cognitive decline and improve the quality of life of patients and their families.”

Source: Wiley

A Single Gene Variant that Gave Rise to Humans’ Unique Skull Base

Source: CC0

One of the unique features that Homo sapiens have compared with other closely related hominin species and primates is the shape of the base of the skull, which enabled larger brains to evolve. Now, in a study recently published in the American Journal of Human Genetics, a team from Tokyo Medical and Dental University (TMDU), the University of Helsinki, and the University of Barcelona has analysed a genomic variant responsible for this unique human skull base morphology.

Most of the genomic changes that occurred during human evolution did not occur directly to genes themselves, but in regions responsible for controlling and regulating the expression of genes. Variants in these same regions are often involved in genetic conditions, causing aberrant gene expression throughout development. Identifying and characterising such genomic changes is therefore crucial for understanding human development and disease.

The development of the basicranial region, the base of the skull where it joins the vertebral column, was key in the evolution of Homo sapiens, as we developed a highly flexed skull base that accommodated our increased brain size. Therefore, variants that affect the development of this region are likely to have been highly significant in our evolution.

First, the team searched for variants in just a single letter of the DNA code, called single nucleotide polymorphisms (SNPs), that caused different regulation of genes in the basicranial region in Homo sapiens compared with other extinct hominins. One of these SNPs stood out, located in a gene called TBX1.

They then used cell lines to show that the SNP, called “rs41298798,” is located in a region that regulates the expression levels of the TBX1 gene, and that the “ancestral” form of the SNP, found in extinct hominins, is associated with lower TBX1 expression, while the form found in Homo sapiens gives us higher levels of TBX1.

“We then employed a mouse model with lower TBX1 expression,” explains lead author Noriko Funato, “which resulted in distinct alterations to the morphology at the base of the skull and premature hardening of a cartilage joint where the bones fuse together, restricting the growth ability of the skull.” The changes in the Tbx1-knockout mice were reminiscent of the known basicranial morphology of Neanderthals.

These morphological changes are also reflected in human genetic conditions associated with lower TBX1 gene dosage, such as DiGeorge syndrome and velocardiofacial syndrome, further indicating the significance of this genetic variant in the evolution of our unique skull base morphology.

The identification of this genomic variant sheds light on human evolution, as well as providing insight into common genetic conditions associated with lower expression of the TBX1 gene, paving the way for greater understanding and management of these conditions.

Source: Tokyo Medical and Dental University

How the Brain’s Working Memory… Works

Photo by Alex Green on Unsplash

Cedars-Sinai investigators have discovered how brain cells responsible for working memory – which holds onto things like phone numbers while we use them – coordinate intentional focus and short-term storage of information. Their discovery, which confirms the involvement of the hippocampus, is published in the journal Nature.

“We have identified for the first time a group of neurons, influenced by two types of brain waves, that coordinate cognitive control and the storage of sensory information in working memory,” said Jonathan Daume, PhD, a postdoctoral scholar in the Rutishauser Lab at Cedars-Sinai and first author of the study. “These neurons don’t contain or store information, but are crucial to the storage of short-term memories.”

Working memory, which requires the brain to store information for only seconds, is fragile and requires continued focus to be maintained, explained senior study author Ueli Rutishauser, PhD, director of the Center for Neural Science and Medicine at Cedars-Sinai. It can be affected by different diseases and conditions.

“In disorders such as Alzheimer’s disease or attention-deficit hyperactivity disorder, it is often not memory storage, but rather the ability to focus on and retain a memory once it is formed that is the problem,” said Rutishauser, who is a professor of Neurosurgery, Neurology and Biomedical Sciences at Cedars-Sinai. “We believe that understanding the control aspect of working memory will be fundamental for developing new treatments for these and other neurological conditions.”

To explore how working memory functions, investigators recorded the brain activity of 36 hospitalised patients who had electrodes surgically implanted in their brains as part of an epilepsy diagnosis procedure. The team recorded the activity of individual brain cells and brain waves while the patients performed a task that required use of working memory.

On a computer screen, patients were shown either a single photo or a series of three photos of various people, animals, objects or landscapes. Next, the screen went blank for just under three seconds, requiring patients to remember the photos they just saw. They were then shown another photo and asked to decide whether it was the one (or one of the three) they had seen before.

When patients performing the working memory task were able to respond quickly and accurately, investigators noted the firing of two groups of neurons: “category” neurons that fire in response to one of the categories shown in the photos, such as animals, and “phase-amplitude coupling,” or PAC, neurons.

PAC neurons, newly identified in this study, don’t hold any content, but use a process called phase-amplitude coupling to ensure the category neurons focus on and store the content they have acquired. PAC neurons fire in time with the brain’s theta waves, which are associated with focus and control, and with gamma waves, which are linked to information processing. This allows them to coordinate their activity with category neurons, which also fire in time with the brain’s gamma waves, enhancing patients’ ability to recall information stored in working memory.
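
Phase-amplitude coupling is itself a measurable quantity. One standard estimator is the mean vector length, which asks whether gamma amplitude is systematically larger at particular theta phases. The sketch below illustrates the general technique on a single voltage trace using SciPy; it is not the study’s analysis code, and the band edges are conventional assumptions:

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def pac_mean_vector_length(signal, fs, theta=(4, 8), gamma=(30, 100)):
    """Quantify theta-gamma phase-amplitude coupling in one signal.

    Returns |mean(A_gamma * exp(i * phi_theta))|: near zero if gamma
    amplitude ignores theta phase, larger if gamma bursts lock to a
    preferred theta phase.
    """
    def bandpass(low, high):
        b, a = butter(4, [low / (fs / 2), high / (fs / 2)], btype="band")
        return filtfilt(b, a, signal)

    theta_phase = np.angle(hilbert(bandpass(*theta)))   # instantaneous theta phase
    gamma_amp = np.abs(hilbert(bandpass(*gamma)))       # instantaneous gamma amplitude
    return np.abs(np.mean(gamma_amp * np.exp(1j * theta_phase)))
```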

“Imagine when the patient sees a photo of a dog, their category neurons start firing ‘dog, dog, dog’ while the PAC neurons are firing ‘focus/remember,'” Rutishauser said. “Through phase-amplitude coupling, the two groups of neurons create a harmony superimposing their messages, resulting in ‘remember dog.’ It is a situation where the whole is greater than the sum of its parts, like hearing the musicians in an orchestra play together. The conductor, much like the PAC neurons, coordinates the various players to act in harmony.”

PAC neurons do this work in the hippocampus, a part of the brain that has long been known to be important for long-term memory. This study offers the first confirmation that the hippocampus also plays a role in controlling working memory, Rutishauser said.

Source: Cedars-Sinai Medical Center

Do More Mentally Challenging Jobs Protect against Cognitive Decline?

Source: Unsplash CC0

The harder your brain works at your job, the less likely you may be to have memory and thinking problems later in life, according to a new study published in Neurology®, the medical journal of the American Academy of Neurology. This study does not prove that stimulating work prevents mild cognitive impairment. It only shows an association.

“We examined the demands of various jobs and found that cognitive stimulation at work during different stages in life – during your 30s, 40s, 50s and 60s – was linked to a reduced risk of mild cognitive impairment after the age of 70,” said study author Trine Holt Edwin, MD, PhD, of Oslo University Hospital in Norway.

“Our findings highlight the value of having a job that requires more complex thinking as a way to possibly maintain memory and thinking in old age.”

The study looked at 7000 people and 305 occupations in Norway. Researchers measured the degree of cognitive stimulation that participants experienced while on the job. They measured the degree of routine manual, routine cognitive, non-routine analytical, and non-routine interpersonal tasks, which are skill sets that different jobs demand.

Routine manual tasks demand speed, control over equipment, and often involve repetitive motions, typical of factory work. Routine cognitive tasks demand precision and accuracy of repetitive tasks, such as in bookkeeping and filing.

Non-routine analytical tasks involve analysing information, engaging in creative thinking and interpreting information for others. Non-routine interpersonal tasks include establishing and maintaining personal relationships, motivating others and coaching. Non-routine cognitive jobs include public relations and computer programming.

Researchers divided participants into four groups based on the degree of cognitive stimulation that they experienced in their jobs. The most common job for the group with the highest cognitive demands was teaching. The most common jobs for the group with the lowest cognitive demands were mail carriers and custodians.

After age 70, participants completed memory and thinking tests to assess whether they had mild cognitive impairment. Of those with the lowest cognitive demands, 42% were diagnosed with mild cognitive impairment, compared to 27% for those with the highest cognitive demands.

After adjustment for age, sex, education, income and lifestyle factors, the group with the lowest cognitive demands at work had a 66% higher risk of mild cognitive impairment compared to the group with the highest cognitive demands at work.
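
To see how that adjusted figure relates to the raw percentages above: dividing the raw rates gives a somewhat smaller unadjusted relative risk, and the 66% figure comes from the model that adjusts for the listed covariates. A quick illustrative check on the reported numbers (not a reproduction of the study’s model):

```python
risk_lowest_demand = 0.42   # MCI rate in the lowest cognitive-demand group
risk_highest_demand = 0.27  # MCI rate in the highest cognitive-demand group

unadjusted_rr = risk_lowest_demand / risk_highest_demand
print(f"{unadjusted_rr:.2f}")  # ~1.56, i.e. ~56% higher risk before adjustment;
                               # the covariate-adjusted estimate is 66% higher
```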

“These results indicate that both education and doing work that challenges your brain during your career play a crucial role in lowering the risk of cognitive impairment later in life,” Edwin said. “Further research is required to pinpoint the specific cognitively challenging occupational tasks that are most beneficial for maintaining thinking and memory skills.”

A limitation of the study was that even within identical job titles, individuals might perform different tasks and experience different cognitive demands.

Source: American Academy of Neurology

New Drug Shows Promise for Treating Rare and Aggressive Gliomas

MRI scan showing brain cancer. Credit: Michelle Monje, MD, PhD, Stanford University

An experimental drug may provide a new treatment option for some patients with rare incurable brain tumours, according to an analysis published in the Journal of Clinical Oncology.

Diffuse midline gliomas are diagnosed in about 800 people per year in the U.S., according to the Centers for Disease Control and Prevention.

A subset of particularly aggressive diffuse midline gliomas is caused by an H3 K27M mutation, and the only effective treatment is radiation, as the location of the tumour in the brain makes surgery difficult. Even with radiation, relapse is virtually inevitable and more than 70% of patients with this subtype of brain tumour die from the cancer, according to the National Institutes of Health.

In the study, investigators analysed the results of five previous clinical trials testing the effectiveness of dordaviprone, an experimental drug which works by blocking a certain protein in tumours with the mutation.

The study included results from 50 patients (including four children) with H3 K27M–mutant diffuse midline gliomas and found that 30% of patients responded well to the drug. The most common side effect reported was fatigue, according to the study.

Now, the researchers are launching a trial at Northwestern Medicine hospitals to investigate the drug’s effectiveness in newly diagnosed patients.

Source: Northwestern University

Epigenetic Changes Drive this Rare Malignant Paediatric Brain Tumour

A healthy neuron. Credit: NIH

A new study published in Life Science Alliance revealed how aberrant epigenetic regulation contributes to the development of atypical teratoid/rhabdoid (AT/RT) tumours, which mainly affect young children. There is an urgent need for more research in this area as current treatment options are ineffective against these highly malignant tumours.

Most tumours take a long time to develop as harmful mutations gradually accumulate in cells’ DNA over time. AT/RT tumours are a rare exception, because the inactivation of one gene gives rise to this highly aggressive form of brain cancer.

AT/RT tumours are rare central nervous system embryonic tumours that predominantly affect infants and young children.

On average, 73 people are diagnosed with AT/RT in the USA each year. However, AT/RT is the most common central nervous system tumour in children under one year old and accounts for 40-50% of diagnoses in this age group. The prognosis for AT/RT patients is grim, with a postoperative median survival of only 11-24 months.

The collaborative study conducted by Tampere University and Tampere University Hospital examined how aberrant DNA methylation distorts cellular developmental trajectories and thereby contributes to the formation of AT/RT. DNA methylation is a normal process of controlling gene expression whereby methyl groups are added to the DNA strand, adding epigenetic information.

The new study showed that DNA methylation interferes with the activity of multiple regulators, which usually regulate the differentiation and maturation of central nervous system cells during brain development. Disrupted cell differentiation promotes the abnormal, uncontrolled proliferation of cells that eventually form a tumour.

The study also found several genes that regulate cell differentiation or inhibit tumour development and that are silenced in AT/RT in conjunction with increased DNA methylation.

“These results will provide deeper insights into the development of AT/RTs and their malignancy. In the future, the results will help to accelerate the discovery of new treatments for this aggressive brain tumour,” says senior author Docent Kirsi Rautajoki from Tampere University.

Source: Tampere University

Scientists Evaluate Old Epilepsy Drug for Glioma Prevention

Photo by Anna Shvets on Pexels

A drug used to treat children with epilepsy prevents brain tumour formation and growth in two mouse models of neurofibromatosis type 1 (NF1), according to a study by researchers at Washington University School of Medicine in St. Louis. NF1 is a genetic condition that causes tumours to grow on nerves throughout the body.

The findings lay the groundwork for a clinical trial to assess whether the drug, lamotrigine, can prevent or delay brain tumours in children with NF1. The study is published online in the journal Neuro-Oncology.

“Based on these data, the Neurofibromatosis Clinical Trials Consortium is considering launching a first-of-its-kind prevention trial,” said senior author David H. Gutmann, MD, PhD, professor of neurology. “The plan is to enrol kids without symptoms, treat them for a limited time, and then see whether the number of children who develop tumours that require treatment goes down.

“This is a novel idea, so we took it to an NF1 patient focus group,” Gutmann continued. “They said, ‘This is exactly what we’re looking for.’ A short-term treatment with a drug that has been used safely for 30 years was acceptable to them if it reduced the chance their children would develop tumours and need chemotherapy that might have all kinds of side effects.”

Optic gliomas, tumours on the optic nerve, are the most serious type that people with NF1 develop. Such tumours typically appear between ages 3 and 7. Though rarely fatal, they cause vision loss in up to a third of patients as well as other symptoms, including early puberty. Standard chemotherapy for optic gliomas is only moderately effective at preventing further vision loss and can affect children’s developing brains, resulting in cognitive and behavioural problems.

In a previous study, Gutmann and Corina Anastasaki, PhD, an assistant professor of neurology and the first author on the new paper, showed that lamotrigine stopped optic glioma growth in NF1 mice by suppressing neuronal hyperactivity. Intrigued, the Neurofibromatosis Clinical Trials Consortium asked Gutmann and Anastasaki to clarify the connection between NF1 mutation, neuronal excitability and optic gliomas; assess whether lamotrigine was effective at the doses already proven safe in children with epilepsy; and conduct these studies in more than one strain of NF1 mice.

In humans, NF1 could be caused by any one of thousands of different mutations in the NF1 gene, with different mutations causing different medical problems. Repeating experiments in multiple strains of mice was a way of gauging whether lamotrigine was likely to work in people regardless of the underlying mutation.

Anastasaki and Gutmann not only showed that lamotrigine worked in two strains of NF1 mice, they also showed that the drug worked at lower doses than those used for epilepsy, meaning that it was probably safe. Even better, they found that a short course of the drug had lasting effects, both as a preventive and as a treatment. Mice with tumours that were treated for four weeks starting at 12 weeks of age saw their tumours stop growing and showed no further damage to their retinas. Mice that received a four-week course of the drug starting at 4 weeks of age, before tumours typically emerge, showed no tumour growth even four months after treatment had ended.

These findings have led Gutmann to suggest that a one-year course of treatment for young children with NF1, perhaps between the ages of 2 and 4, might be enough to reduce their risk of brain tumours.

“The idea that we might be able to change the prognosis for these kids by intervening within a short time window is so exciting,” Gutmann said. “If we could just get them past the age when these tumours typically form, past age 7, they may never need treatment. I’d love it if I never again had to discuss chemotherapy for kids who aren’t even in first grade yet.”

Source: Washington University School of Medicine

Approval for First-in-Class Glioma Drug set to Change Practice

Photo by Anna Shvets on Pexels

A new drug for the treatment of a type of brain tumour that strikes young people could soon receive approval from the U.S. Food and Drug Administration. The drug, vorasidenib, could greatly extend the time before further therapy is needed and resistance eventually develops.

In an editorial in the New England Journal of Medicine, David Schiff, MD, the co-director of UVA Cancer Center’s Neuro-Oncology Center, outlines the potential significance of the drug vorasidenib for patients with most low-grade gliomas. The drug was fast-tracked by the FDA in August 2023 based on the strength of the findings, and filings for regulatory approval were made in February 2024. FDA approval is anticipated in the second half of 2024, and its approval in Europe will likely soon follow.

Adult-type diffuse gliomas represent approximately 81% of primary malignant brain tumours. Of those, approximately 20% harbour an isocitrate dehydrogenase (IDH) mutation, including 100% of grade 2 and grade 3 adult-type diffuse gliomas. Approximately 2500 Americans with a median age of only 40 are diagnosed with grade 2 IDH-mutant gliomas each year. The tumours cause steadily increasing disability, eventually become resistant to treatment options and typically prove fatal.

Because of the limited treatment options available, doctors usually take a “watch and wait” approach to managing the brain tumours, holding off on treatment until after the tumour progresses.

In the randomised controlled INDIGO trial, 331 patients received either vorasidenib or placebo. The trial showed that the drug slowed tumour growth significantly and extended the average time until the tumour started growing from 11.1 months to more than 27 months. Vorasidenib also increased the time to next intervention (TTNI), the timeframe before patients need additional treatment such as radio- or chemotherapy. 

Schiff, in his editorial, describes the results as “striking.” Vorasidenib’s success could “put a nail in the coffin” of the watch-and-wait approach for such brain tumours, Schiff believes. 

“It used to be that we thought of all gliomas as being on a spectrum,” Schiff said. “We now understand that those with the IDH gene mutation have a markedly different biology, outcome and, as this study shows, vulnerabilities that new therapies can exploit.”

If the drug receives approval from the federal Food and Drug Administration, it would become the first targeted therapy for low-grade gliomas. But Schiff notes that there are also other recent advances that are improving our understanding of such gliomas.

“There are still many unanswered questions about how we can best utilise this new medication if and when it receives FDA approval,” Schiff said. “Nonetheless, considering that existing standard therapies for these tumours [radiation and chemotherapy] are tough on patients, with short- and long-term side effects, it will be wonderful to have a useful and very well-tolerated treatment option.”