Category: Neurology

Neuroscientists Regenerate Neurons in Mice with Spinal Cord Injury


In a new study using mice, neuroscientists have uncovered a crucial component for restoring functional activity after spinal cord injury. In the study, published in Science, the researchers showed that re-growing specific neurons back to their natural target regions led to recovery, while random regrowth was not effective.

In a 2018 study in Nature, the team identified a treatment approach that triggers axons to regrow after spinal cord injury in rodents. But although that approach successfully led to the regeneration of axons across severe spinal cord lesions, achieving functional recovery remained a significant challenge.

For the new study, the team of researchers from UCLA, the Swiss Federal Institute of Technology, and Harvard University aimed to determine whether directing the regeneration of axons from specific neuronal subpopulations to their natural target regions could lead to meaningful functional restoration after spinal cord injury in mice. They first used advanced genetic analysis to identify nerve cell groups that enable walking improvement after a partial spinal cord injury.

The researchers then found that merely regenerating axons from these nerve cells across the spinal cord lesion without specific guidance had no impact on functional recovery. However, when the strategy was refined to include using chemical signals to attract and guide the regeneration of these axons to their natural target region in the lumbar spinal cord, significant improvements in walking ability were observed in a mouse model of complete spinal cord injury.

“Our study provides crucial insights into the intricacies of axon regeneration and requirements for functional recovery after spinal cord injuries,” said Michael Sofroniew, MD, PhD, professor of neurobiology at the David Geffen School of Medicine at UCLA and a senior author of the new study. “It highlights the necessity of not only regenerating axons across lesions but also of actively guiding them to reach their natural target regions to achieve meaningful neurological restoration.”

The authors say understanding that re-establishing the projections of specific neuronal subpopulations to their natural target regions holds significant promise for the development of therapies aimed at restoring neurological functions in larger animals and humans. However, the researchers also acknowledge the complexity of promoting regeneration over longer distances in non-rodents, necessitating strategies with intricate spatial and temporal features. Still, they conclude that applying the principles laid out in their work “will unlock the framework to achieve meaningful repair of the injured spinal cord and may expedite repair after other forms of central nervous system injury and disease.”

Source: University of California – Los Angeles Health Sciences

New Evidence of Patients Recalling Death Experiences after Cardiac Arrest

Up to an hour after cardiac arrest, some patients revived by cardiopulmonary resuscitation (CPR) had clear memories afterward of experiencing death and had brain patterns while unconscious linked to thought and memory, report investigators in the journal Resuscitation.

In a study led by researchers at NYU Grossman School of Medicine, some survivors of cardiac arrest described lucid death experiences that occurred while they were seemingly unconscious. Despite immediate treatment, fewer than 10% of the 567 patients studied, who received CPR in the hospital, recovered sufficiently to be discharged. Of the survivors, four in 10 recalled some degree of consciousness during CPR not captured by standard measures.

The study also found that in a subset of these patients who received brain monitoring, nearly 40% had brain activity that returned to normal, or nearly normal, from a “flatline” state, at points even an hour into CPR. As captured by EEG, these patients showed spikes in the gamma, delta, theta, alpha, and beta waves associated with higher mental function.
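The brain waves named above are conventionally defined as frequency bands of the EEG signal. A minimal sketch of how power in each band can be estimated from a single-channel trace — the band edges and the synthetic test signal below are illustrative assumptions, not taken from the study:

```python
import numpy as np

# Conventional EEG frequency bands (Hz); exact edges vary slightly
# across the literature -- these limits are illustrative assumptions.
BANDS = {
    "delta": (0.5, 4.0),
    "theta": (4.0, 8.0),
    "alpha": (8.0, 13.0),
    "beta": (13.0, 30.0),
    "gamma": (30.0, 80.0),
}

def band_powers(signal, fs):
    """Estimate power in each band from a single-channel EEG trace
    via the periodogram (squared FFT magnitude)."""
    freqs = np.fft.rfftfreq(signal.size, d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / (fs * signal.size)
    df = freqs[1]  # frequency resolution of the spectrum
    return {
        name: float(psd[(freqs >= lo) & (freqs < hi)].sum() * df)
        for name, (lo, hi) in BANDS.items()
    }

# Synthetic example: a 10 Hz (alpha-band) oscillation plus noise
fs = 250.0                      # sampling rate in Hz
t = np.arange(0, 10, 1 / fs)    # 10 seconds of data
rng = np.random.default_rng(0)
x = np.sin(2 * np.pi * 10 * t) + 0.2 * rng.standard_normal(t.size)

powers = band_powers(x, fs)
print(max(powers, key=powers.get))  # the alpha band dominates here
```

In practice, clinical studies use multi-channel recordings and more robust spectral estimators, but the band partition itself is the same idea.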

Survivors have long reported having heightened awareness and powerful, lucid experiences, say the study authors. These have included a perception of separation from the body, observing events without pain or distress, and a meaningful evaluation of their actions and relationships. This new work found these experiences of death to be different from hallucinations, delusions, illusions, dreams, or CPR-induced consciousness.

The study authors hypothesise that the “flatlined”, dying brain removes natural inhibitory (braking) systems. These processes, known collectively as disinhibition, may open access to “new dimensions of reality,” they say, including lucid recall of all stored memories from early childhood to death, evaluated from the perspective of morality. While no one knows the evolutionary purpose of this phenomenon, it “opens the door to a systematic exploration of what happens when a person dies.”

Senior study author Sam Parnia, MD, PhD, associate professor in the Department of Medicine at NYU Langone Health and director of critical care and resuscitation research at NYU Langone, says, “Although doctors have long thought that the brain suffers permanent damage about 10 minutes after the heart stops supplying it with oxygen, our work found that the brain can show signs of electrical recovery long into ongoing CPR. This is the first large study to show that these recollections and brain wave changes may be signs of universal, shared elements of so-called near-death experiences.”

Dr Parnia adds, “These experiences provide a glimpse into a real, yet little understood dimension of human consciousness that becomes uncovered with death. The findings may also guide the design of new ways to restart the heart or prevent brain injuries and hold implications for transplantation.”

The AWAreness during REsuscitation (AWARE)-II study followed 567 adults who suffered in-hospital cardiac arrest between May 2017 and March 2020 in the US and UK. Only hospitalised patients were enrolled to standardise the CPR and resuscitation methods used, as well as recording methods for brain activity. A subset of 85 patients received brain monitoring during CPR. Additional testimony from 126 community survivors of cardiac arrest with self-reported memories was also examined to provide greater understanding of the themes related to the recalled experience of death.

The study authors conclude that research to date has neither proved nor disproved the reality or meaning of patients’ experiences and claims of awareness in relation to death. They say the recalled experience surrounding death merits further empirical investigation and plan to conduct studies that more precisely define biomarkers of clinical consciousness and that monitor the long-term psychological effects of resuscitation after cardiac arrest.

Source: Elsevier

A New Way to Map the Human Auditory Pathway

Photo by Brett Sayles

Researchers have developed a non-invasive method for mapping the human auditory pathway, which could potentially be used as a tool to help clinicians decide the best surgical strategy for patients with profound hearing loss. The findings, published online in the journal eLife, highlight the importance of early interventions to give patients the ability to hear and understand speech, so that their auditory-language network can develop properly and their long-term outcomes are improved.

Sensorineural hearing loss (SNHL) occurs when the sensitive hair cells inside the cochlea are damaged, or when there is damage to the auditory nerve which transmits sound to the brain. A person with profound hearing loss is typically unable to hear any sounds, or at best, only very loud sounds. Congenital SNHL has increased in prevalence over the past two decades, from 1.09 to 1.7 cases per 1000 live births.

The sound of speech is carried through the brain by nerve fibres in regions known as the auditory pathway, and is processed in a region called the language network. In cases of congenital SNHL, the lack of speech inputs reaching the language network may hinder its proper development, leading to poorer spoken language skills.

Currently, the primary treatments for profound SNHL are cochlear and auditory brainstem implantation, where a device is used to stimulate the peripheral cochlea or the central cochlear nucleus, respectively. Both techniques can partially restore hearing in patients, but their language development outcomes can vary. This is especially true for patients with inner ear malformations (IEM) or cochlear nerve deficiencies (CND), which contribute to 15-39% of congenital SNHL cases.

“Where SNHL is caused by CNDs and/or IEMs, there is a great deal of uncertainty around the best method of treatment. This is due to the difficulty of assessing the condition of the cochlear nerve and distinguishing between certain types of IEM, both of which impact surgical decision making,” says senior author Hao Wu, a professor and Chief Physician specialising in Otolaryngology at Shanghai Ninth People’s Hospital, Shanghai Jiao Tong University School of Medicine, China. Wu also serves as the Hospital Administrator and the Clinical and Academic Lead for the department. “We therefore need a more effective method for mapping the auditory pathway and diving deeper into how IEMs and CNDs affect the development of the auditory-language network.”

In their study, Professor Wu’s team investigated the auditory and language pathways in 23 children under the age of six. They included 10 children with normal hearing, and 13 with profound SNHL. In the latter group, seven children had received cochlear implantations, two had received auditory brainstem implantations, and four were candidates for auditory brainstem implantations.

The human auditory pathway is difficult to investigate non-invasively because its delicate and intricate subcortical structures are located deep within the brain. To navigate this, the team developed a new methodology to reconstruct the pathway. First, they segmented the subcortical auditory structures using track density imaging, in which images reconstructed from a specific type of MRI scan provide much greater detail and information on the structural connectivity of the brain. This allowed them to delineate the cochlear nucleus and the superior olivary complex of the auditory pathway. They then tracked the auditory and language pathways using a neuroimaging technique called probabilistic tractography, which uses the information from an MRI scan to provide the most likely view of structural brain connectivity. Next, the team assessed the density and cross-section of the nerve fibres in the auditory and language pathways.

This combined methodology allowed them to investigate three key areas to inform surgical decision making: the condition of the nerve fibres in the auditory-language network of children with profound SNHL; the potential impact of IEMs and CNDs on the development of the network before surgical intervention; and the relationship between the pre-implant structural development of the network and the auditory-language outcomes following implantation.

The team’s observations revealed a lower nerve fibre density in children with profound SNHL, in comparison to those with normal hearing. This reduction was most pronounced in two regions of the inferior central auditory pathway, as well as the left language pathway.

In addition, the findings revealed that the language pathway is more sensitive than the central auditory system to IEMs and/or CNDs, implying that the structural development of the language pathway is more negatively impacted by the condition of the peripheral auditory structure. However, the authors caution that further study is required to validate this finding. As it is more difficult to image the central auditory pathway than the language pathway, this difference could have arisen due to the limitations of current neuroimaging technologies.

The authors say the study is also limited by a relatively small cohort of patients and an incomplete genetic dataset, so more studies with a more diverse patient population will also be needed. But with further validation, they add that the methodology could be used more widely for informing decisions in treating profound SNHL.

Source: eLife

Aripiprazole Improves Sleep in Psychiatric Disorders by Entrainment to Light/Dark Cycles

Photo by Cottonbro on Pexels

Researchers in Japan have shown that the commonly prescribed antipsychotic drug aripiprazole helps reduce sleep disruptions in patients with certain psychiatric disorders by improving their natural entrainment to light and dark cycles. Their findings are published in Frontiers in Neuroscience.

Many patients with psychiatric conditions, such as bipolar disorder and major depressive disorder, frequently experience disruptions in their sleep–wake cycles. Research has shown that the administration of aripiprazole, a commonly prescribed antipsychotic drug, alleviates the symptoms of circadian sleep disorders in these patients. This improvement may be attributed to the effects of aripiprazole on the circadian central clock, specifically the hypothalamic suprachiasmatic nucleus (SCN), which regulates various circadian physiological rhythms, including the sleep–wake cycle, in mammals. However, the precise mechanism through which aripiprazole addresses these sleep disorder symptoms remains elusive.

Researchers from the University of Tsukuba have discovered that aripiprazole can directly affect the mammalian central circadian clock; specifically, it can modulate photic entrainment in mice. The central circadian clock, located in the SCN, comprises clock neurons that synchronise with each other to maintain a roughly 24-hour rhythm. At the same time, the SCN is receptive to external inputs such as light, aligning itself with the environmental light-dark cycle. The researchers found that aripiprazole weakens the synchronisation among the clock neurons in the SCN, heightening the responsiveness of these neurons to light stimuli in mice. Additionally, aripiprazole influences intracellular signalling within the SCN by targeting the serotonin 1A receptor, a prominent receptor in the SCN.

These findings suggest that the efficacy of aripiprazole in alleviating circadian rhythm sleep disorder symptoms in psychiatric patients might be attributed to the modulation of the circadian clock by the drug. This study expands the potential clinical usage of aripiprazole as a treatment for circadian rhythm sleep disorders.

Source: University of Tsukuba

Twin Study Reveals Concussions from Youth Linked to Later Cognitive Decline

A study of twins who fought in World War II showed that concussion, a form of traumatic brain injury (TBI), early in life is tied to lower scores on tests of thinking and memory skills decades later, as well as more rapid decline in those scores, compared with twins who never had a concussion. The study is published in Neurology®, the medical journal of the American Academy of Neurology.

“These findings indicate that even people with traumatic brain injuries in earlier life who appear to have fully recovered from them may still be at increased risk of cognitive problems and dementia later in life,” said study author Marianne Chanti-Ketterl, PhD, MSPH, of Duke University in Durham, North Carolina. “Among identical twins, who share the same genes and many of the same exposures early in life, we found that the twin who had a concussion had lower test scores and faster decline than their twin who had never had a concussion.”

The study involved 8662 men who were World War II veterans. The participants took a test of thinking skills at the start of the study when they were an average age of 67 and then again up to three more times over 12 years. Scores for the test can range from zero to 50. The average score for all participants at the beginning of the study was 32.5 points.

A total of 25% of the participants had experienced a concussion in their life.

Twins who had experienced a concussion were more likely to have lower test scores at age 70, especially if they had a concussion where they lost consciousness or were older than 24 when they had their concussion. Those twins with traumatic brain injury with loss of consciousness, more than one traumatic brain injury and who had their injuries after age 24 were more likely to have faster cognitive decline than those with no history of traumatic brain injury.

For example, a twin who experienced a traumatic brain injury after age 24 scored 0.59 points lower at age 70 than his twin with no traumatic brain injury, and his thinking skills declined faster, by 0.05 points per year.

These results took into account other factors that could affect thinking skills, such as high blood pressure, alcohol use, smoking status and education.

“Although these effect sizes are modest, the contribution of TBI on late life cognition, in addition to numerous other factors with a detrimental effect on cognition, may be enough to trigger an evaluation for cognitive impairment,” Chanti-Ketterl said. “With the trend we are seeing with increased emergency room visits due to sports or recreation activity injuries, combined with the estimated half million members of the military who suffered a TBI between 2000 and 2020, the potential long-term impact of TBI cannot be overlooked. These results may help us identify people who may benefit from early interventions that may slow cognitive decline or potentially delay or prevent dementia.”

A limitation of the study was that traumatic brain injuries were reported by the participants, so not all injuries may have been remembered or reported accurately.

Source: American Academy of Neurology

Microplastics Rapidly Bioaccumulate Everywhere in the Body

Photo by FLY:D on Unsplash

The prevalence of microplastics in the environment is well known, along with their harm to marine organisms, but few studies have examined the potential health impacts on mammals. Now, a new study published in the International Journal of Molecular Sciences has found that in mice, the infiltration of microplastics was as widespread in the body as it is in the environment, leading to behavioural changes, especially in older test subjects.

Study leader University of Rhode Island Professor Jaime Ross and her team focused on neurobehavioural effects and inflammatory response to exposure to microplastics, as well as the accumulation of microplastics in tissues, including the brain.

“Current research suggests that these microplastics are transported throughout the environment and can accumulate in human tissues; however, research on the health effects of microplastics, especially in mammals, is still very limited,” said Ross, an assistant professor of biomedical and pharmaceutical sciences at the Ryan Institute for Neuroscience and the College of Pharmacy. “This has led our group to explore the biological and cognitive consequences of exposure to microplastics.”

Behavioural changes detected

Ross’ team exposed young and old mice to varying levels of microplastics in drinking water over the course of three weeks. They found that microplastic exposure induces both behavioural changes and alterations in immune markers in liver and brain tissues. The study mice began to exhibit behaviours akin to dementia in humans. The results were even more profound in older animals.

“To us, this was striking. These were not high doses of microplastics, but in only a short period of time, we saw these changes,” Ross said. “Nobody really understands the life cycle of these microplastics in the body, so part of what we want to address is the question of what happens as you get older. Are you more susceptible to systemic inflammation from these microplastics as you age? Can your body get rid of them as easily? Do your cells respond differently to these toxins?”

To understand the physiological systems that may be contributing to these changes in behaviour, Ross’ team investigated how widespread the microplastic exposure was in the body, dissecting several major tissues including the brain, liver, kidney, gastrointestinal tract, heart, spleen and lungs. The researchers found that the particles had begun to bioaccumulate in every organ, including the brain, as well as in bodily waste.

“Given that in this study the microplastics were delivered orally via drinking water, detection in tissues such as the gastrointestinal tract, which is a major part of the digestive system, or in the liver and kidneys was always probable,” Ross said. “The detection of microplastics in tissues such as the heart and lungs, however, suggests that the microplastics are going beyond the digestive system and likely undergoing systemic circulation. The blood-brain barrier is supposed to be very difficult to permeate. It is a protective mechanism against viruses and bacteria, yet these particles were able to get in there. It was actually deep in the brain tissue.”

Possible mechanism

Results also showed that the brain infiltration may cause a decrease in glial fibrillary acidic protein (GFAP), a protein that supports many cell processes in the brain. “A decrease in GFAP has been associated with early stages of some neurodegenerative diseases, including mouse models of Alzheimer’s disease, as well as depression,” Ross said. “We were very surprised to see that the microplastics could induce altered GFAP signalling.”

She intends to investigate this finding further in future work. “We want to understand how plastics may change the ability for the brain to maintain its homeostasis or how exposure may lead to neurological disorders and diseases, such as Alzheimer’s disease,” she said.

Source: University of Rhode Island

A Hidden Mathematical Rule Governs the Distribution of Neurons in the Brain

Neuron densities in cortical areas in the mammalian brain follow a consistent distribution pattern. Image: Morales-Gregorio

Human Brain Project (HBP) researchers have uncovered how neuron densities are distributed across and within cortical areas in the mammalian brain. As reported in Cerebral Cortex, they have revealed a fundamental organisational principle of cortical cytoarchitecture: the ubiquitous lognormal distribution of neuron densities.

Numbers of neurons and their spatial arrangement play a crucial role in shaping the brain’s structure and function. Yet, despite the wealth of available cytoarchitectonic data, the statistical distributions of neuron densities remain largely undescribed. This new study from the HBP at Forschungszentrum Jülich and the University of Cologne (Germany) advances our understanding of the organisation of mammalian brains.

The team accessed nine publicly available datasets covering seven species: mouse, marmoset, macaque, galago, owl monkey, baboon and human. After analysing the cortical areas of each, they found that neuron densities within these areas follow a consistent pattern – a lognormal distribution, pointing to a fundamental organisational principle underlying the densities of neurons in the mammalian brain.

A lognormal distribution is a statistical distribution characterised by a skewed bell-shaped curve. It arises, for instance, when taking the exponential of a normally distributed variable. It differs from a normal distribution in several ways. Most importantly, the curve of a normal distribution is symmetric, while the lognormal one is asymmetric with a heavy tail.

These findings are relevant for modelling the brain accurately. “Not least because the distribution of neuron densities influences the network connectivity,” says Sacha van Albada, leader of the Theoretical Neuroanatomy group at Forschungszentrum Jülich and senior author of the paper. “For instance, if the density of synapses is constant, regions with lower neuron density will receive more synapses per neuron,” she explains. Such aspects are also relevant for the design of brain-inspired technology such as neuromorphic hardware.

“Furthermore, as cortical areas are often distinguished on the basis of cytoarchitecture, knowing the distribution of neuron densities can be relevant for statistically assessing differences between areas and the locations of the borders between areas,” van Albada adds.

These results are in agreement with the observation that surprisingly many characteristics of the brain follow a lognormal distribution. “One reason why it may be very common in nature is because it emerges when taking the product of many independent variables,” says Alexander van Meegen, joint first author of the study. In other words, the lognormal distribution arises naturally as a result of multiplicative processes, similarly to how the normal distribution emerges when many independent variables are summed.

“Using a simple model, we were able to show how the multiplicative proliferation of neurons during development may lead to the observed neuron density distributions,” explains van Meegen.

According to the study, in principle, cortex-wide organisational structures might be by-products of development or evolution that serve no computational function; but the fact that the same organisational structures can be observed for several species and across most cortical areas suggests that the lognormal distribution serves some purpose.

“We cannot be sure how the lognormal distribution of neuron densities will influence brain function, but it will likely be associated with high network heterogeneity, which may be computationally beneficial,” says Aitor Morales-Gregorio, first author of the study, citing previous works that suggest that heterogeneity in the brain’s connectivity may promote efficient information transmission. In addition, heterogeneous networks support robust learning and enhance the memory capacity of neural circuits.

Source: Human Brain Project

Social Isolation Linked to Reduced Brain Volume in Older People

Photo by Kindel Media on Pexels

A study of nearly 9000 older people in Japan found that those who have little social contact with others may be more likely to have reduction of overall brain volume, and in areas of the brain affected by dementia, compared with those who have more frequent social contact. The study results were published in Neurology.

“Social isolation is a growing problem for older adults,” said study author Toshiharu Ninomiya, MD, PhD, of Kyushu University in Fukuoka, Japan. “These results suggest that providing support for people to help them start and maintain their connections to others may be beneficial for preventing brain atrophy and the development of dementia.”

The study involved 8896 people without dementia, average age 73. They had MRI brain scans and health exams, and were asked how often they were in contact with friends or relatives that did not live with them.

The people with the lowest amount of social contact had overall brain volume that was significantly lower than those with the most social contact. Total brain volume (the sum of white and grey matter), expressed as a percentage of total intracranial volume (the volume within the cranium, including the brain, meninges, and cerebrospinal fluid), was 67.3% in the lowest-contact group compared with 67.8% in the highest-contact group. The lowest-contact group also had lower volumes in areas of the brain, such as the hippocampus and amygdala, that play a role in memory and are affected by dementia.

The researchers took into account other factors that could affect brain volume, such as age, diabetes, smoking and exercise.

The socially isolated people also had more small areas of damage in the brain, called white matter lesions, than the people with frequent social contact. The percentage of intracranial volume made up of white matter lesions was 0.30 for the socially isolated group, compared to 0.26 for the most socially connected group.

The researchers found that symptoms of depression partly explained the relationship between social isolation and brain volumes. However, symptoms of depression accounted for only 15% to 29% of the association.

“While this study is a snapshot in time and does not determine that social isolation causes brain atrophy, some studies have shown that exposing older people to socially stimulating groups stopped or even reversed declines in brain volume and improved thinking and memory skills, so it’s possible that interventions to improve people’s social isolation could prevent brain volume loss and the dementia that often follows,” Ninomiya said.

Since the study involved only older Japanese people, a limitation is that the findings may not be generalisable to people of other ethnicities and younger people.

Source: American Academy of Neurology

Liraglutide Boosts Associative Learning in People with Obesity

Photo by Patrick Fore on Unsplash

Obesity leads to altered energy metabolism and reduced insulin sensitivity of cells. So-called anti-obesity drugs such as liraglutide are increasingly used to treat obesity and have attracted tremendous interest, especially in the USA. Researchers in Germany have now shown that in people with obesity, reduced insulin sensitivity affects the learning of sensory associations. The results, published in Nature Metabolism, showed that a single dose of liraglutide was able to normalise these changes and restore the underlying brain circuit function.

The brain must be able to form associations in order to control behaviour. This involves, for example, associating a neutral external stimulus with a consequence following the stimulus. In this way, the brain learns what the implications of the first stimulus are. Associative learning is the basis for forming neural connections and gives stimuli their motivational force. It is essentially controlled by a brain region called the dopaminergic midbrain. This region has many receptors for the body’s signalling molecules, such as insulin, and can thus adapt behaviour to the body’s physiological needs.

But what happens when the body’s insulin sensitivity is reduced due to obesity? Does this change brain activity, ability to learn associations and thus behaviour? Researchers at the Max Planck Institute for Metabolism Research have now measured how well the learning of associations works in participants with normal body weight (high insulin sensitivity, 30 volunteers) and in participants with obesity (reduced insulin sensitivity, 24 volunteers), and if this learning process is influenced by the anti-obesity drug liraglutide.

Low insulin sensitivity reduces the brain’s ability to associate sensory stimuli

In the evening, the researchers injected the participants with either liraglutide or a placebo. Liraglutide is a so-called GLP-1 agonist, which activates the GLP-1 receptor in the body, stimulating insulin production and producing a feeling of satiety. It is often used to treat obesity and type 2 diabetes and is given once a day. The next morning, the subjects were given a learning task that allowed the researchers to measure how well associative learning works. They found that the ability to associate sensory stimuli was less pronounced in participants with obesity than in those of normal weight, and that brain activity was reduced in the areas encoding this learning behaviour.

After just one dose of liraglutide, participants with obesity no longer showed these impairments, and no difference in brain activity was seen between participants with normal weight and obesity. In other words, the drug returned the brain activity to the state of normal-weight subjects.

“These findings are of fundamental importance. We show here that basic behaviours such as associative learning depend not only on external environmental conditions but also on the body’s metabolic state. So, whether or not someone is overweight also determines how the brain learns to associate sensory signals and what motivation is generated. The normalisation we achieved with the drug in subjects with obesity, therefore, fits with studies showing that these drugs restore a normal feeling of satiety, causing people to eat less and therefore lose weight,” says study leader Marc Tittgemeyer from the Max Planck Institute for Metabolism Research.

“While it is encouraging that available drugs have a positive effect on brain activity in obesity, it is alarming that changes in brain performance occur even in young people with obesity without other medical conditions. Obesity prevention should play a much greater role in our healthcare system in the future. Lifelong medication is the less preferred option compared with primary prevention of obesity and its associated complications,” says Ruth Hanßen, first author of the study and a physician at the University Hospital of Cologne.

Source: Max Planck Institute for Biology of Ageing

New Regeneration Drug for Spinal Cord Injury Passes Safety Check

Researchers in the UK have evaluated a potential drug for the treatment of spinal cord injury (SCI), which could potentially regrow damaged nerves, and found it to be safe and tolerable. The Phase 1 clinical trial, published in the British Journal of Clinical Pharmacology, evaluated the drug KCL-286, which activates retinoic acid receptor beta (RARb) in the spine to promote recovery.

There are currently no licensed drugs that can overcome the adult central nervous system’s inability to regenerate. Implants have been able to restore some function, but for most patients, spinal cord injuries remain life-changing.

Previous studies have shown that nerve growth can be stimulated by activating the RARb2 receptor, but no drug suitable for humans had been developed. KCL-286, an RARb2 agonist, was developed by Professor Corcoran and his team and used in a first-in-human study to test its safety.

The study by the Institute of Psychiatry, Psychology &amp; Neuroscience (IoPPN) at King’s College London recruited 109 healthy males in a single ascending dose (SAD) adaptive design with a food interaction (FI) arm and a multiple ascending dose (MAD) arm. Participants in each arm were further divided into different dose treatments.

SAD studies are designed to establish the safe dosage range of a medicine by providing participants with small doses before gradually increasing the dose provided. Researchers look for any side effects, and measure how the medicine is processed within the body. MAD studies explore how the body interacts with repeated administration of the drug, and investigate the potential for a drug to accumulate within the body.

Researchers found that participants were able to safely take 100mg doses of KCL-286, with no severe adverse events.

Professor Jonathan Corcoran, Professor of Neuroscience and Director of the Neuroscience Drug Discovery Unit, at King’s IoPPN and the study’s senior author said, “This represents an important first step in demonstrating the viability of KCL-286 in treating spinal cord injuries. This first-in-human study has shown that a 100mg dose delivered via a pill can be safely taken by humans. Furthermore, we have also shown evidence that it engages with the correct receptor.

“Our focus can hopefully now turn to researching the effects of this intervention in people with spinal cord injuries.”

Dr Bia Goncalves, a senior scientist and project manager of the study, and the study’s first author from King’s IoPPN said, “Spinal Cord Injuries are a life changing condition that can have a huge impact on a person’s ability to carry out the most basic of tasks, and the knock-on effects on their physical and mental health are significant.

“The outcomes of this study demonstrate the potential for therapeutic interventions for SCI, and I am hopeful for what our future research will find.”

The researchers are now seeking funding for a Phase 2a trial studying the safety and tolerability of the drug in those with SCI.

Source: King’s College London