Category: Neurology

Not all Memories Lost to Sleep Deprivation are Gone Forever

Sleeping man
Photo by Mert Kahveci on Unsplash

Sleep deprivation is bad for memorisation, something which still doesn’t deter many med students from late-night cramming. Researchers, however, have discovered that memories formed during sleep deprivation are not necessarily lost, just difficult to recall. Publishing in the journal Current Biology, the researchers have found a way to make this ‘hidden knowledge’ accessible again days after studying whilst sleep-deprived, using optogenetic approaches and the asthma drug roflumilast.

University of Groningen neuroscientist Robbert Havekes and his team have extensively studied how sleep deprivation affects memory processes. “We previously focused on finding ways to support memory processes during a sleep deprivation episode,” says Havekes. However, in his latest study, his team examined whether amnesia as a result of sleep deprivation was a direct result of information loss, or merely caused by difficulties retrieving information. “Sleep deprivation undermines memory processes, but every student knows that an answer that eluded them during the exam might pop up hours afterwards. In that case, the information was, in fact, stored in the brain, but just difficult to retrieve.”

Priming the hippocampus

To find out, the researchers selectively introduced optogenetic proteins into neurons that are activated during a learning experience, enabling recall of a specific experience by shining a light on the cells. “In our sleep deprivation studies, we applied this approach to neurons in the hippocampus, the area in the brain where spatial information and factual knowledge are stored,” says Havekes.

First, the genetically engineered mice were given a spatial learning task in which they had to learn the location of individual objects, a process heavily reliant on neurons in the hippocampus. The mice then had to perform this same task days later, but this time with one object moved to a new location. The mice that were deprived of sleep for a few hours before the first session failed to detect this spatial change, suggesting that they could not recall the original object locations. “However, when we reintroduced them to the task after reactivating the hippocampal neurons that initially stored this information with light, they did successfully remember the original locations,” says Havekes. “This shows that the information was stored in the hippocampus during sleep deprivation, but couldn’t be retrieved without the stimulation.”

Memory problems

The molecular pathway set off during the reactivation is also targeted by the drug roflumilast, which is used by patients with asthma or COPD. Havekes says: “When we gave roflumilast to mice that had been trained while sleep-deprived, just before the second test, they remembered, exactly as happened with the direct stimulation of the neurons.” Since roflumilast is approved for use in humans and can enter the brain, this may lead to tests of whether it can recover ‘lost’ memories in humans.

“It might be possible to stimulate memory accessibility in people with age-induced memory problems or early-stage Alzheimer’s disease with roflumilast,” says Havekes. “And maybe we could reactivate specific memories to make them permanently retrievable again, as we successfully did in mice.” If a subject’s neurons are stimulated with the drug while they try to ‘relive’ a memory, or revise information for an exam, this information might be reconsolidated more firmly in the brain. “For now, this is all speculation of course, but time will tell.”

Source: University of Groningen.

Small Molecule Could Restore Sight Lost to Optic Nerve Injury

Eye
Source: Daniil Kuzelev on Unsplash

Central nervous system (CNS) injuries often result in a catastrophic loss of sensory, motor and visual functions, and pose one of the most difficult medical challenges today. Neuroscientists report in PNAS that they recently identified a small molecule that can effectively stimulate nerve regeneration and restore visual function after optic nerve injury.

“There is currently no effective treatment available for traumatic injuries to the CNS, so there is an immediate need for potential drugs to promote CNS repair and ultimately achieve full functional recovery, such as visual function, in patients,” said research leader Dr Eddie Ma Chi-him at City University of Hong Kong.

Enhancing mitochondrial dynamics and motility is key for successful axon regeneration

Axons are responsible for transmitting signals between neurons and from the brain to muscles and glands. The first steps for successful axon regeneration are the formation of active growth cones and the activation of a regrowth programme, involving the synthesis and transport of materials to regrow axons. These are all energy-demanding processes, which require the active transport of mitochondria (the powerhouses of the cell) to injured axons at the distal end.

Injured neurons therefore face a special challenge: mitochondria must be transported long distances from the soma to the distal regenerating axons, since axonal mitochondria in adults are mostly stationary and local energy consumption is critical for axon regeneration.

A research team led by Dr Ma identified a therapeutic small molecule, M1, which can increase the fusion and motility of mitochondria, resulting in sustained, long-distance axon regeneration. Regenerated axons elicited neural activities in target brain regions and restored visual functions within four to six weeks after optic nerve injury in M1-treated mice.

Small molecule M1 promotes mitochondrial dynamics and sustains long-distance axon regeneration

“Photoreceptors in the eyes [retina] forward visual information to neurons in the retina. To facilitate the recovery of visual function after injury, the axons of the neurons must regenerate through the optic nerve and relay nerve impulses to visual targets in the brain via the optic nerve for image processing and formation,” explained Dr Ma.

To investigate whether M1 could promote long-distance axon regeneration after CNS injuries, the research team assessed the extent of axon regeneration in M1-treated mice four weeks after injury. Strikingly, most of the regenerating axons of M1-treated mice reached 4mm distal to the crush site (ie near the optic chiasm), while no regenerating axons were found in vehicle-treated control mice. In M1-treated mice, the survival of retinal ganglion cells (RGCs, the neurons that transmit visual stimuli from the eye to the brain) was significantly increased, from 19% to 33%, four weeks after optic nerve injury.

“This indicates that the M1 treatment sustains long-distance axon regeneration from the optic chiasm, i.e. midway between the eyes and target brain region, to multiple subcortical visual targets in the brain. Regenerated axons elicit neural activities in target brain regions and restore visual functions after M1 treatment,” Dr Ma added.

M1 treatment restores visual function

To further explore whether M1 treatment can restore visual function, the research team gave the M1-treated mice a pupillary light reflex test six weeks after the optic nerve injury. They found that the lesioned eyes of M1-treated mice recovered the pupil constriction response upon blue light illumination to a level similar to that of non-lesioned eyes, suggesting that M1 treatment can restore this response after optic nerve injuries.

In addition, the research team assessed the mice’s response to a looming stimulus, which triggers a visually induced innate defensive response to avoid predators. The mice were placed into an open chamber with a triangular prism-shaped shelter and a rapidly expanding black circle overhead as a looming stimulus, and their freezing and escape behaviours were observed. Half of the M1-treated mice responded to the stimulus by hiding in the shelter, showing that M1 induced robust axon regeneration to reinnervate subcortical visual target brain regions for complete recovery of their visual function.

Potential clinical application of M1 for repairing nervous system injury

The seven-year-long study highlights the potential of a readily available, non-viral therapy for CNS repair, which builds on the team’s previous research on peripheral nerve regeneration using gene therapy.

“This time we used the small molecule, M1, to repair the CNS simply by intravitreal injection into the eyes, which is an established medical procedure for patients, eg for macular degeneration treatment. Successful restoration of visual functions, such as pupillary light reflex and response to looming visual stimuli, was observed in M1-treated mice four to six weeks after the optic nerve had been damaged,” said Dr Au Ngan-pan, Research Associate in the Department of Neuroscience.

The team is also developing an animal model to test M1 against glaucoma-related vision loss, and possibly against other common eye diseases and vision impairments such as diabetes-related retinopathy, macular degeneration and traumatic optic neuropathy. Thus, further investigation is warranted to evaluate the potential clinical application of M1. “This research breakthrough heralds a new approach that could address unmet medical needs in accelerating functional recovery within a limited therapeutic time window after CNS injuries,” said Dr Ma.

Source: City University of Hong Kong 

Neuroimaging can’t Identify Psychiatric Disorders – Yet

MRI images of the brain
Photo by Anna Shvets on Pexels

Neuroimaging technologies hold great promise in helping clinicians link specific symptoms of mental health disorders to abnormal patterns of brain activity. But a new study published in the American Journal of Psychiatry shows there are still kinks to be ironed out before doctors can translate images of the brain into diagnoses of psychiatric disorders such as post-traumatic stress disorder (PTSD).

Several years ago, the National Institute of Mental Health launched a multi-billion-dollar research effort to locate biomarkers of brain activity that point to the biological roots of a host of mental health disorders, which today are typically identified by clinical evaluation of a constellation of often overlapping symptoms reported by patients.

“The idea is to forget classification of disease by symptoms and find underlying biological causes,” said Yale’s Ilan Harpaz-Rotem, professor of psychiatry and psychology and senior author of the study.

For the new study, the Yale-led team attempted to replicate the findings of an earlier nationwide neuroimaging study, in which scientists linked clusters of brain activity to a variety of outcomes among patients who had arrived at US emergency departments following traumatic events. Specifically, when researchers measured patients’ brain activity during the performance of simple tasks such as mapping responses to threats and rewards, they detected a cluster of brain activity that showed high reactivity to both threat and reward signals and seemed to predict more severe symptoms of PTSD later on.

However, when Yale researchers analysed similar neuroimaging data collected from recent trauma survivors in Israel, they were not able to replicate these findings. While they did identify the different clusters of brain activity observed in the earlier study, they found no association with prospective PTSD symptoms.

“That is not to say one set of data is right and the other is wrong, just that there is a lot of fundamental work that needs to be done to develop reliable models that could generalise across different studies,” said Yale’s Ziv Ben-Zion, a postdoctoral associate at Yale School of Medicine and the corresponding author of the study.

In fact, Yale researchers are currently working with the investigators of the original study to merge datasets “to search for common underlying patterns of brain activity associated with different responses to trauma,” Ben-Zion said.

“It took about 100 years to come up with current classifications of mental illness, but we’ve only been exploring refining psychiatric diagnoses using biomarkers for the last 10 years,” said Harpaz-Rotem. “We still have a long way to go.”

Source: Yale University

MRI Scans Reveal How Horror Movies Terrify Us

Photo by Daniel Jensen on Unsplash

Finnish researchers at the University of Turku mapped the brain activity of (un)lucky participants who watched two of the highest-rated horror movies of the last 100 years.

Humans are fascinated by things that scare them, such as death-defying stunts and true crime documentaries, provided these sources of fear are kept at a safe distance. Horror movies are no different, providing a relentless villain, such as Jason in Friday the 13th, or a supernatural threat.

For their study into cinematic terror, published in the journal NeuroImage, the researchers first established the 100 best and scariest horror movies of the past century, and how they made people feel.

Unseen threats are the scariest

Firstly, 72% of people reported watching at least one horror movie every six months, and the reasons for doing so, besides the feelings of fear and anxiety, were primarily excitement. Watching horror movies was also an excuse to socialise, with many people preferring to watch horror movies with others rather than on their own.

People found horror that was psychological in nature and based on real events the scariest, and were far more scared by things that were unseen or implied rather than what they could actually see.

“This latter distinction reflects two types of fear that people experience. The creeping foreboding dread that occurs when one feels that something isn’t quite right, and the instinctive response we have to the sudden appearance of a monster that makes us jump out of our skin,” says principal investigator, Professor Lauri Nummenmaa from Turku PET Centre.

MRI reveals different types of fear

Researchers wanted to know how the brain copes with fear in response to this complicated and ever-changing environment. The group had people watch two horror movies (The Conjuring 2, 2016, and Insidious, 2010; both directed by James Wan) whilst measuring neural activity in a magnetic resonance imaging scanner.

During those times when anxiety is slowly increasing, regions of the brain involved in visual and auditory perception become more active, as the need to attend to cues of threat in the environment becomes more important. After a sudden shock, brain activity is more evident in regions involved in emotion processing, threat evaluation, and decision making, enabling a rapid response.

However, these regions are in continuous talk-back with sensory regions throughout the movie, as if the sensory regions were preparing response networks as a scary event was becoming increasingly likely.

“Therefore, our brains are continuously anticipating and preparing us for action in response to threat, and horror movies exploit this expertly to enhance our excitement,” explains Researcher Matthew Hudson.

Source: University of Turku

Recognising a Voice is Easier with a Face

To recognise a famous voice, human brains use the same centre that is activated when the speaker’s face is presented, according to the results of an innovative neuroscience study which asked participants to identify US presidents.

The new study, published in the Journal of Neurophysiology, suggests that voice and face recognition are linked even more intimately than previously thought. It offers an intriguing possibility that visual and auditory information relevant to identifying someone feeds into a common brain centre, allowing for more robust, well-rounded recognition by integrating separate modes of sensation.

“From behavioural research, we know that people can identify a familiar voice faster and more accurately when they can associate it with the speaker’s face, but we never had a good explanation of why that happens,” said senior author Taylor Abel, MD, associate professor of neurological surgery at the University of Pittsburgh School of Medicine. “In the visual cortex, specifically in the part that typically processes faces, we also see electrical activity in response to famous people’s voices, highlighting how deeply the two systems are interlinked.”

Even though the interplay between the auditory and the visual brain processing systems has been widely acknowledged and investigated by various teams of neuroscientists all over the world, those systems were traditionally thought to be structurally and spatially distinct.

Few studies have attempted to directly measure activity from the brain centre – which primarily consolidates and processes visual information – to determine whether this centre is also engaged when participants are exposed to famous voice stimuli.

Researchers recruited epilepsy patients who had been implanted with electrodes measuring brain activity to determine the source of their seizures.

Abel and his team showed five participants photographs of three US presidents – Bill Clinton, George W. Bush and Barack Obama – or played short recordings of their voices, and asked participants to identify them.

Recordings of the electrical activity from the region of the brain responsible for processing visual cues (the fusiform gyri) showed that the same region became active when participants heard familiar voices, though that response was lower in magnitude and slightly delayed.

“This is important because it shows that auditory and visual areas interact very early when we identify people, and that they don’t work in isolation,” said Abel. “In addition to enriching our understanding of the basic functioning of the brain, our study explains the mechanisms behind disorders where voice or face recognition is compromised, such as in some dementias or related disorders.”

Source: University of Pittsburgh

Newly Discovered Subarachnoidal Layer Protects the Brain

Advances in neuro-imaging and molecular biology have unearthed a subtle, previously unknown layer in the brain. As described in the journal Science, the newly discovered layer forms a previously unknown component of brain anatomy that acts as both a protective barrier and platform from which immune cells monitor the brain for infection and inflammation.

“The discovery of a new anatomic structure that segregates and helps control the flow of cerebrospinal fluid (CSF) in and around the brain now provides us much greater appreciation of the sophisticated role that CSF plays not only in transporting and removing waste from the brain, but also in supporting its immune defenses,” said Maiken Nedergaard, co-director of the Center for Translational Neuromedicine at the University of Rochester and the University of Copenhagen. Nedergaard and her colleagues have made significant findings in the field of neuroscience, including detailing the many critical functions of previously overlooked brain cells called glia and the brain’s unique process of waste removal, which the lab named the glymphatic system.

The study focuses on the series of membranes that encase the brain, creating a barrier from the rest of the body and keeping the brain bathed in CSF. The traditional understanding of what is collectively called the meningeal layer identifies the three individual layers as dura, arachnoid, and pia mater.

This new layer discovered by the international research team further divides the space between the arachnoid and pia layers, the subarachnoid space, into two compartments, separated by the newly described layer, which the researchers named SLYM (Subarachnoidal LYmphatic-like Membrane). While the paper mostly describes the function of SLYM in mice, it reports its presence in the adult human brain as well.

SLYM is a mesothelium, a type of membrane that lines other organs in the body, including the lungs and heart. These membranes typically surround and protect organs, and harbour immune cells.

The new membrane is very thin and delicate, consisting of only a few cells in thickness. Yet SLYM is a tight barrier, allowing only very small molecules to transit, and it also seems to separate “clean” and “dirty” CSF. This last observation hints at the likely role played by SLYM in the glymphatic system, which requires a controlled flow and exchange of CSF, allowing the influx of fresh CSF while flushing the toxic proteins associated with Alzheimer’s and other neurological diseases from the central nervous system. This discovery will help researchers more precisely understand the mechanics of the glymphatic system.

Central nervous system immune cells (indicated here expressing CD45) use SLYM as a platform close to the brain’s surface to monitor cerebrospinal fluid for signs of infection and inflammation.

The SLYM also appears important to the brain’s defences. The central nervous system has its own native population of immune cells, and the membrane’s integrity prevents outside immune cells from entering. In addition, the membrane appears to host its own population of central nervous system immune cells that use SLYM as an observation point close to the surface of the brain from which to scan passing CSF for signs of infection or inflammation.

Discovery of the SLYM opens the door for further study of its role in brain disease. For example, the researchers note that larger and more diverse concentrations of immune cells congregate on the membrane during inflammation and aging. Furthermore, when the membrane was ruptured during traumatic brain injury, the resulting disruption in the flow of CSF impaired the glymphatic system and allowed non-central nervous system immune cells to enter the brain.

These and similar observations suggest that diseases as diverse as multiple sclerosis, central nervous system infections, and Alzheimer’s might be triggered or worsened by abnormalities in SLYM function. They also suggest that the delivery of drugs and gene therapeutics to the brain may be impacted by SLYM, which will need to be considered as new generations of biologic therapies are being developed.

Source: University of Rochester Medical Center

Memory Loss and Confusion More Common among Middle-aged Smokers

Photo by Elsa Olofsson on Unsplash

Middle-aged smokers are much more likely to report having memory loss and confusion than nonsmokers, and the likelihood of cognitive decline is lower for those who have quit, even recently, according to a new study appearing in the Journal of Alzheimer’s Disease.

The study is the first to examine the relationship between smoking and cognitive decline using a one-question self-assessment asking people if they’ve experienced worsening or more frequent memory loss and/or confusion.

The findings build on previous research that established relationships between smoking and Alzheimer’s Disease and other forms of dementia, and could point to an opportunity to identify signs of trouble earlier in life, said Jenna Rajczyk, lead author of the study.

It’s also one more piece of evidence that quitting smoking is good not just for respiratory and cardiovascular reasons, but also for preserving neurological health, said Rajczyk, a PhD student in Ohio State’s College of Public Health, and senior author Jeffrey Wing, assistant professor of epidemiology.

“The association we saw was most significant in the 45–59 age group, suggesting that quitting at that stage of life may have a benefit for cognitive health,” Wing said. A similar difference wasn’t found in the oldest group in the study, which could mean that quitting earlier affords people greater benefits, he said.

Researchers used data from the 2019 Behavioral Risk Factor Surveillance System Survey to compare subjective cognitive decline (SCD) measures for current smokers, recent former smokers, and those who had quit years earlier. The analysis included 136 018 people aged 45 and older, about 11% of whom reported SCD.

The prevalence of SCD among smokers in the study was almost 1.9 times that of nonsmokers. The prevalence among those who had quit less than 10 years ago was 1.5 times that of nonsmokers. Those who quit more than a decade before the survey had an SCD prevalence just slightly above the nonsmoking group.
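The prevalence comparison above is simple arithmetic; a minimal Python sketch shows how such a ratio is derived. The group counts below are hypothetical, chosen only to mirror the reported figures, and are not the study's actual data.

```python
def prevalence(cases: int, total: int) -> float:
    """Fraction of a group reporting subjective cognitive decline (SCD)."""
    return cases / total

def prevalence_ratio(exposed: float, reference: float) -> float:
    """Prevalence in the exposed group relative to the reference group."""
    return exposed / reference

# Hypothetical counts (not from the study), picked to mirror the reported ratio
nonsmoker_prev = prevalence(900, 10000)   # 9.0% of nonsmokers report SCD
smoker_prev = prevalence(171, 1000)       # 17.1% of current smokers report SCD

print(round(prevalence_ratio(smoker_prev, nonsmoker_prev), 1))  # 1.9
```

Note that a prevalence ratio compares how common SCD is in each group; it is not a statement about individual risk, which is why the study's authors frame the result as an association rather than a causal effect.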

“These findings could imply that the time since smoking cessation does matter, and may be linked to cognitive outcomes,” Rajczyk said.

The simplicity of SCD, a relatively new measure, could lend itself to wider applications, she said.

“This is a simple assessment that could be easily done routinely, and at younger ages than we typically start to see cognitive declines that rise to the level of a diagnosis of Alzheimer’s Disease or dementia,” Rajczyk said. “It’s not an intensive battery of questions. It’s more a personal reflection of your cognitive status to determine if you’re feeling like you’re not as sharp as you once were.”

Many people don’t have access to more in-depth screenings, or to specialists, making the potential applications for measuring SCD even greater, she said.

Wing said it’s important to note that these self-reported experiences don’t amount to a diagnosis, nor do they confirm independently that a person is experiencing decline out of the normal ageing process. But, he said, they could be a low-cost, simple tool to consider employing more broadly.

Source: Ohio State University

Greater Cognitive Skills in Children who Play More Video Games

Photo by Igor Karimov on Unsplash

Analysing magnetic resonance imaging (MRI) brain scans of nearly 2000 children, researchers found children who played video games for three or more hours a day did better in cognitive skills tests involving impulse control and working memory compared to children who had never played video games. Published in JAMA Network Open, this study analysed data from the ongoing Adolescent Brain Cognitive Development (ABCD) Study, which is supported by the National Institute on Drug Abuse and other entities of the National Institutes of Health.

“This study adds to our growing understanding of the associations between playing video games and brain development,” said National Institute on Drug Abuse (NIDA) Director Nora Volkow, MD. “Numerous studies have linked video gaming to behaviour and mental health problems. This study suggests that there may also be cognitive benefits associated with this popular pastime, which are worthy of further investigation.”

Although a number of studies have investigated the relationship between video gaming and cognitive behaviour, the neurobiological mechanisms underlying the associations are not well understood. Only a handful of neuroimaging studies have addressed this topic, and the sample sizes for those studies have been small, with fewer than 80 participants.

To address this research gap, scientists at the University of Vermont, Burlington, analysed data obtained when children entered the ABCD Study at ages 9 and 10. The research team examined survey, cognitive, and brain imaging data from nearly 2000 participants from within the bigger study cohort, comparing those who reported playing no video games at all with those who reported playing video games for three hours per day or more. This threshold was selected as it exceeds the American Academy of Paediatrics screen time guidelines, which recommend limiting video games to one to two hours per day for older children. Researchers assessed their performance in two tasks that reflected the children’s ability to control impulsive behaviour and to memorise information, as well as brain activity while performing the tasks.

The researchers found that the children who reported playing video games for three or more hours per day were faster and more accurate on both cognitive tasks than those who never played. They also observed that the differences in cognitive function between the two groups were accompanied by differences in brain activity. Functional MRI brain scans found that children who played video games for three or more hours per day showed higher brain activity in regions of the brain associated with attention and memory than never-gamers did. At the same time, those children who played at least three hours of video games per day showed more brain activity in frontal brain regions that are associated with more cognitively demanding tasks, and less brain activity in brain regions related to vision.

The researchers think these patterns may stem from practicing tasks related to impulse control and memory while playing videogames, which can be cognitively demanding, and that these changes may lead to improved performance on related tasks. Furthermore, the comparatively low activity in visual areas among children who reported playing video games may reflect that this area of the brain may become more efficient at visual processing as a result of repeated practice through video games.

While prior studies have reported associations between video gaming and increases in depression, violence, and aggressive behaviour, this study did not find that to be the case. The three-hours-or-more group tended to report more mental health and behavioural issues than the non-gaming children, but the difference was not statistically significant. The researchers note that this will be an important measure to continue to track and understand as the children mature.

Further, the researchers stress that this cross-sectional study does not allow for cause-and-effect analyses, and that it could be that children who are good at these types of cognitive tasks may choose to play video games. The authors also emphasise that their findings do not mean that children should spend unlimited time on their computers, mobile phones, or TVs, and that the outcomes likely depend largely on the specific activities children engage in. For instance, they hypothesise that the specific genre of video games, such as action-adventure, puzzle solving, sports, or shooting games, may have different effects for neurocognitive development, and this level of specificity on the type of video game played was not assessed by the study.

“While we cannot say whether playing video games regularly caused superior neurocognitive performance, it is an encouraging finding, and one that we must continue to investigate in these children as they transition into adolescence and young adulthood,” said Bader Chaarani, PhD, assistant professor of psychiatry at the University of Vermont and the lead author on the study. “Many parents today are concerned about the effects of video games on their children’s health and development, and as these games continue to proliferate among young people, it is crucial that we better understand both the positive and negative impact that such games may have.”

Through the ABCD Study, researchers will be able to track these children into young adulthood, looking for gaming-related changes in cognitive skills, brain activity, behaviour, and mental health.

Source: National Institutes of Health

Improving Short Term Memory Problems – with Laser Light

Photo by Cottonbro on Pexels

UK and Chinese scientists have demonstrated that laser light therapy is effective in improving short-term memory, in a study published in Science Advances. The innovative, non-invasive therapy could improve short-term, or working, memory in people by up to 25%.

The treatment, termed transcranial photobiomodulation (tPBM), is applied to the right prefrontal cortex, an area important for working memory. In their experiment, the team showed how working memory improved among research participants after several minutes of treatment. They were also able to track the changes in brain activity using electroencephalogram (EEG) monitoring during treatment and testing.

Previous studies have shown that laser light treatment can improve working memory in mice, and human studies have shown tPBM treatment can improve accuracy, speed up reaction time and improve higher-order functions such as attention and emotion. This is the first study, however, to confirm a link between tPBM and working memory in humans.

Co-author Dongwei Li, a visiting PhD student, said, "People with conditions like ADHD (attention deficit hyperactivity disorder) or other attention-related conditions could benefit from this type of treatment, which is safe, simple and non-invasive, with no side-effects."

In the study, researchers at Beijing Normal University carried out experiments with 90 male and female participants aged between 18 and 25. Some participants were treated with laser light to the right prefrontal cortex at a wavelength of 1064 nm, while others received a shorter wavelength, or had the treatment delivered to the left prefrontal cortex instead. Each participant was also given a sham, or inactive, tPBM treatment to rule out the placebo effect.

After 12 minutes of tPBM treatment, the participants were asked to remember the orientations or colour of a set of items displayed on a screen. The participants treated with laser light to the right prefrontal cortex at 1064 nm showed clear improvements in memory over those who had received the other treatments. While participants receiving the other treatment variations were able to remember between three and four of the test objects, those with the targeted treatment were able to recall between four and five objects.

Data, including the EEG recordings made during the experiment, were analysed at the University of Birmingham and showed changes in brain activity that also predicted the improvements in memory performance.

The researchers do not yet know precisely why the treatment results in positive effects on working memory, nor how long the effects will last. Further research is planned to investigate these aspects.

Professor Ole Jensen, also at the Center for Human Brain Health, said, "We need further research to understand exactly why the tPBM is having this positive effect, but it's possible that the light is stimulating the astrocytes – the powerplants – in the nerve cells within the prefrontal cortex, and this has a positive effect on the cells' efficiency. We will also be investigating how long the effects might last. Clearly if these experiments are to lead to a clinical intervention, we will need to see long-lasting benefits."

Source: University of Birmingham

Scientists Unravel The Neurology Underlying Soothing Touch

Man wearing mask with headache
Source: Usman Yousaf on Unsplash

People can achieve some pain relief by rubbing or pressing a part of their body associated with the pain. Observing for the first time how this phenomenon plays out in the brains of mice, MIT scientists suggest that pain-responsive cells in the brain quiet down when these neurons also receive touch inputs.

The team's discovery, reported in the journal Science Advances, offers researchers a deeper understanding of the complicated relationship between pain and touch and could offer some insights into chronic pain in humans. "We're interested in this because it's a common human experience," says investigator Fan Wang. "When some part of your body hurts, you rub it, right? We know touch can alleviate pain in this way." But, she says, the phenomenon has been very difficult for neuroscientists to study.

Modelling pain relief

The spinal cord may be where touch-mediated pain relief begins, as prior studies have found pain-responsive neurons there that reduce activity in response to touch. But there have been hints that the brain is involved, too. Wang says this aspect of the response has been largely unexplored, because it can be hard to monitor the brain's response to painful stimuli amidst all the other neural activity happening there, particularly when an animal moves.

So while her team knew that mice respond to a potentially painful stimulus on the cheek by wiping their faces with their paws, they couldn’t follow the specific pain response in the animals’ brains to see if that rubbing helped settle it down. ā€œIf you look at the brain when an animal is rubbing the face, movement and touch signals completely overwhelm any possible pain signal,ā€ Wang explains.

She and her colleagues have found a way around this obstacle. Instead of studying the effects of face-rubbing, they have focused their attention on a subtler form of touch: the gentle vibrations produced by the movement of the animals' whiskers. Mice use their whiskers to explore, moving them back and forth in a rhythmic motion known as whisking to feel out their environment. This motion activates touch receptors in the face and sends information to the brain in the form of vibrotactile signals. The human brain receives the same kind of touch signals when a person shakes their hand as they pull it back from a painfully hot pan – another way we seek touch-mediated pain relief.

Whisking away pain

Wang and her colleagues found that this whisker movement alters the way mice respond to bothersome heat or a poke on the face – both of which usually lead to face rubbing. "When the unpleasant stimuli were applied in the presence of their self-generated vibrotactile whisking … they respond much less," she says. Sometimes, she says, whisking animals entirely ignore these painful stimuli.

In the brain's somatosensory cortex, where touch and pain signals are processed, the team found signalling changes that seem to underlie this effect. "The cells that preferentially respond to heat and poking are less frequently activated when the mice are whisking," Wang says. "They're less likely to show responses to painful stimuli." Even when whisking animals did rub their faces in response to painful stimuli, the team found that neurons in the brain took longer to adopt the firing patterns associated with that rubbing movement. "When there is a pain stimulation, usually the trajectory of the population dynamics quickly moves to wiping. But if you already have whisking, that takes much longer," Wang says.

Wang notes that even in the fraction of a second before provoked mice begin rubbing their faces, when the animals are relatively still, it can be difficult to sort out which brain signals are related to perceiving heat and poking and which are involved in whisker movement. Her team developed computational tools to disentangle these, and are hoping other neuroscientists will use the new algorithms to make sense of their own data.

Whisking’s effects on pain signalling seem to depend on dedicated touch-processing circuitry that sends tactile information to the somatosensory cortex from the ventral posterior thalamus. When that pathway was blocked, whisking no longer dampened the animals’ response to painful stimuli. Now, Wang says, she and her team are eager to learn how this circuitry works with other parts of the brain to modulate the perception and response to painful stimuli.

The new findings might shed light on a condition called thalamic pain syndrome, a chronic pain disorder that can develop in patients after a stroke that affects the brain's thalamus, says Wang. "Such strokes may impair the functions of thalamic circuits that normally relay pure touch signals and dampen painful signals to the cortex."

Source: MIT