Year: 2021

Unexpected Discovery Opens Up Stroke and Cardiac Arrest Treatments

Image source: Pixabay

In a surprising discovery, researchers at Massachusetts General Hospital (MGH) identified a mechanism that protects the brain from the effects of hypoxia. This serendipitous finding, which they report in Nature Communications, could help develop therapies for strokes, as well as brain injury resulting from cardiac arrest.

However, this study began with a very different objective, explained senior author Fumito Ichinose, MD, PhD, an attending physician in the Department of Anesthesia, Critical Care and Pain Medicine at MGH, and principal investigator in the Anesthesia Center for Critical Care Research. Ichinose and his team are developing techniques for inducing suspended animation, a state similar to hibernation where a human’s body functions are temporarily slowed or halted for later revival. 

Ichinose believes that the ability to safely induce suspended animation could have valuable medical applications, such as pausing the life processes of a patient with an incurable disease until an effective therapy is found. Often seen in science fiction, and currently studied by NASA, it could also allow humans to travel long distances in space.

A 2005 study found that inhaling a gas called hydrogen sulfide caused mice to enter a state of suspended animation. Hydrogen sulfide, which smells like rotten eggs, is sometimes called ‘sewer gas.’ Oxygen deprivation in a mammal’s brain leads to increased production of hydrogen sulfide; as it accumulates in the tissue, the gas can halt energy metabolism in neurons, causing them to die. Oxygen deprivation is a hallmark of ischaemic stroke, the most common type of stroke, and of other injuries to the brain.

Dr Ichinose and his team initially set out to study the effects of exposing mice to hydrogen sulfide repeatedly, over an extended period. At first, the mice entered a suspended-animation-like state: their body temperatures dropped and they were immobile. “But, to our surprise, the mice very quickly became tolerant to the effects of inhaling hydrogen sulfide,” said Dr Ichinose. “By the fifth day, they acted normally and were no longer affected by hydrogen sulfide.”

Interestingly, the mice that became tolerant to hydrogen sulfide were also able to tolerate severe hypoxia. Ichinose’s group suspected that enzymes in the brain that metabolise sulfide might be responsible for this. They discovered that levels of one particular enzyme, called sulfide:quinone oxidoreductase (SQOR), rose in the brains of mice when they breathed hydrogen sulfide for several days. They thus hypothesised that SQOR plays a role in resistance to hypoxia.

Nature provides strong evidence for this; for example, female mammals resist hypoxia better than males, and the former have higher levels of SQOR. When SQOR levels are artificially reduced in females, their hypoxia resistance drops. (Oestrogen may be responsible for the observed increase in SQOR, as the hypoxia protection is lost when a female mammal’s oestrogen-producing ovaries are removed.) Additionally, some hibernating animals, such as the thirteen-lined ground squirrel, are highly tolerant of hypoxia, which allows them to survive as their bodies’ metabolism slows down during the winter. The brain of a typical ground squirrel has 100 times more SQOR than that of a similar-sized rat. However, when the researchers ‘switched off’ expression of SQOR in the squirrels’ brains, the animals lost their protection against the effects of hypoxia.

Meanwhile, when the researchers artificially increased SQOR levels in the brains of mice, “they developed a robust defense against hypoxia,” explained Dr Ichinose. His team increased the level of SQOR using gene therapy, currently a technically complex, impractical approach. On the other hand, the team demonstrated that ‘scavenging’ sulfide, using an experimental drug called SS-20, reduced levels of the gas, thereby sparing the brains of mice when hypoxic.

Human brains have very low levels of SQOR, meaning that even a modest accumulation of hydrogen sulfide can be harmful, said Dr Ichinose. “We hope that someday we’ll have drugs that could work like SQOR in the body,” he said, noting that his lab is studying SS-20 and several other candidates. Such medications could be used to treat ischaemic strokes, as well as patients who have suffered cardiac arrest, which can lead to hypoxia. Dr Ichinose’s lab is also investigating how hydrogen sulfide affects other parts of the body. For example, hydrogen sulfide is known to accumulate in other conditions, such as certain types of Leigh syndrome, a rare but severe neurological disorder usually leading to early death. “For some patients,” said Dr Ichinose, “treatment with a sulfide scavenger might be lifesaving.”

Source: Medical Xpress

Journal information: Eizo Marutani et al, Sulfide catabolism ameliorates hypoxic brain injury, Nature Communications (2021). DOI: 10.1038/s41467-021-23363-x

Number and Birth Order of Siblings Affect Risk of Cardiovascular Events

Photo by Wayne Lee-Sing on Unsplash

According to a large population study in Sweden, first-born children have a lower risk of cardiovascular events than brothers and sisters born later, but people who are part of a large family with many siblings have an increased risk of these events. The findings were published in the online journal BMJ Open.

While the influence of family history – that is, the health of parents and grandparents – on a person’s health, including cardiovascular risk, is well known, there is now growing interest in how the make-up of a person’s immediate family influences health.

For their large population study, the authors drew on data on 1.36 million men and 1.32 million women born between 1932 and 1960 and aged 30–58 years in 1990 from the Multiple-Generation Register in Sweden. They retrieved data from national registers on fatal and non-fatal cardiovascular and coronary events over the next 25 years.

Analysis of the data showed that first-borns had a lower risk of non-fatal cardiovascular and coronary events than siblings born later. First-born men, however, had a higher risk of death than second- and third-born siblings, while first-born women had a higher risk of death than second-born siblings but a risk equal to that of later-born siblings.

Looking at family size, compared with men with no siblings, men with one or two siblings had a lower risk of cardiovascular events, while those with four or more siblings had a higher risk.

Similarly, compared with men with no siblings, men with more than one sibling had a reduced risk of death, whereas those who had three or more siblings had an increased risk of coronary events.

A similar pattern was seen in women. Compared with those without siblings, women with three or more siblings had an increased risk of cardiovascular events, while those with two or more siblings had an increased risk of coronary events. Women with one or more siblings had a lower risk of death.

Since this was an observational study, it cannot establish cause. The authors also noted some limitations, including that the Swedish registers contained no information on diagnostic procedures, and that no data were available on lifestyle factors such as body mass index, smoking and diet.

The researchers took into account socioeconomic status, obesity, diabetes, chronic lung disease (COPD) and alcoholism and related liver disorders. They also note that some of their findings conflict with those from previous studies.

The researchers noted that, given the wide variation among countries in family-support policies and in family size, their findings could have implications for public health.

“More research is needed to understand the links between sibling number and rank with health outcomes,” they say. “Future research should be directed to find biological or social mechanisms linking the status of being first born to lower risk of cardiovascular disease, as indicated by our observational findings.”

Source: BMJ Open

Journal information: Sibling rank and sibling number in relation to cardiovascular disease and mortality risk: a nationwide cohort study, BMJ Open (2021). DOI: 10.1136/bmjopen-2020-042881

Phase 1 Clinical Trial of a Gene Therapy for Alzheimer’s

Image source: Pixabay/CC0

Researchers at University of California San Diego School of Medicine have received a grant to conduct a first-in-human Phase 1 clinical trial of a gene therapy for treating Alzheimer’s disease (AD) or Mild Cognitive Impairment (MCI), a condition often preceding dementia.

Gene therapy is an experimental technique that uses genes or gene products for the treatment or prevention of diseases by altering the DNA of living cells. Viral vectors are commonly used to insert the DNA changes into the target cells’ nuclei, but non-viral vectors also exist though they are generally less efficient.

The clinical trial, developed by principal investigator Mark Tuszynski, MD, PhD, professor of neuroscience and director of the Translational Neuroscience Institute at UC San Diego School of Medicine, delivers the brain-derived neurotrophic factor (BDNF) gene into the brains of qualifying trial participants, where it is hoped to stimulate BDNF production in cells.

BDNF belongs to a family of growth factors (proteins) found in the brain and central nervous system that support existing neurons and promote growth and differentiation of new neurons and synapses. BDNF is particularly important in brain regions susceptible to degeneration in AD.

“We found in earlier studies that delivering BDNF to the part of the brain that is affected earliest in Alzheimer’s disease — the entorhinal cortex and hippocampus — was able to reverse the loss of connections and to protect from ongoing cell degeneration,” said Tuszynski. “These benefits were observed in aged rats, aged monkeys and amyloid mice.”

The three-year-long trial seeks to recruit 12 participants with either diagnosed AD or MCI to receive AAV2-BDNF treatment, with another 12 persons serving as a control group over that period.

This will be the first safety and efficacy assessment of AAV2-BDNF in humans. A previous gene therapy trial from 2001 to 2012 using AAV2 and a different protein called nerve growth factor (NGF) found increased growth, axonal sprouting and activation of functional markers in the brains of participants.

“The BDNF gene therapy trial in AD represents an advance over the earlier NGF trial,” said Tuszynski. “BDNF is a more potent growth factor than NGF for neural circuits that degenerate in AD. In addition, new methods for delivering BDNF will more effectively deliver and distribute it into the entorhinal cortex and hippocampus.”

Source: UC San Diego

Vaccine Flops and Shortages Leave SA with no Covax Shots

Image by Quicknews

Nearly six months after South Africa’s first procurement deal with the Covid-19 Vaccines Global Access (Covax) programme, vaccine flops and supply shortages have left the country empty-handed, while Covax struggles even to meet its June delivery goal.

South Africa’s vaccine rollout has been anything but smooth. The first batch of vaccines, produced by the Serum Institute of India (SII), arrived in the country on 1 February but was abandoned a week later after a study found it to be ineffective against the 501Y.V2 variant. That first batch of one million doses was sold on to the African Union (AU), and the remainder of the order was refunded.

The health department switched to the Johnson & Johnson (J&J) single-dose shot and had vaccinated nearly half a million healthcare workers when its use, too, was halted over blood clot concerns. Phase 2 of the rollout is using the Pfizer vaccine, which fortunately has been found to tolerate much higher storage temperatures than its previous ultracold requirements, making it easier to distribute.

However, the failure to join Covax by December 2020 was an early warning sign over the government’s handling of vaccine acquisition. The Covax initiative, led by Gavi, the Vaccine Alliance, and the World Health Organization (WHO) to supply vaccines to poorer nations, was expected to kickstart South Africa’s rollout.

After missing that first deadline, the health department and Solidarity Fund confirmed on 22 December 2020 that a down payment of R283 million had been made to secure doses through Covax.

Vaccine flip-flopping
At first, South Africa was to receive almost 2.5 million doses of AstraZeneca vaccine, but the country’s decision to abandon the use of AstraZeneca caused severe delays. The country’s allocated AstraZeneca doses were taken back into the Covax programme.

“South Africa was allocated 2 426 400 doses of the AstraZeneca vaccine… it has requested to be allocated another vaccine in place of AZ, and will receive allocations of alternative vaccines instead,” Gavi spokesperson Evan O’Connell told Business Insider South Africa.

“It has already been allocated, at this stage, 1,392,300 doses of the Pfizer vaccine, allocated for Q2 2021.”

According to Covax’s first-round schedule, South Africa was due to receive 117 000 Pfizer doses before April. But Covax’s deliveries are falling behind, putting in doubt the initiative’s ability to meet its second-quarter target.

On 17 May, UNICEF Executive Director Henrietta Fore announced that the Covax facility would shortly have delivered 65 million doses, far short of the 170 million doses it should have delivered by that time.

“By the time G7 leaders gather in the UK next month, and as a deadly second wave of COVID will likely continue to sweep across India and many of its South Asian neighbours, the shortfall will near 190 million doses.”

Covax hamstrung by Indian COVID crisis

India’s COVID crisis has hamstrung Covax’s aim of delivering 237 million doses of AstraZeneca vaccine in the first half of 2021. With India recording the world’s highest infection numbers and deaths since April, the SII, which produces AstraZeneca doses for Covax, announced that it would halt foreign supply until December at the earliest.

“We continue to scale up manufacturing and prioritise India,” said SII CEO Adar Poonawalla on 18 May. “We also hope to start delivering to Covax and other countries by the end of the year.”

Having delivered only 35% of its targeted doses, Covax is calling for renewed funding and donations from developed nations, which are also accused of hoarding vaccines. On Monday, WHO director-general Tedros Adhanom Ghebreyesus criticised wealthy nations for perpetuating a “scandalous inequity”.

“We need countries to donate tens of millions of doses of vaccines immediately through Covax, which is the agreed global mechanism for distributing vaccines,” stated Ghebreyesus.

“We need companies to help make donations happen fast, and to give Covax the first right of refusal on all uncommitted doses now, in 2021.”

It is unclear whether the SII’s decision to halt supply will force a reallocation of the Pfizer doses on which SA is depending, which could further delay the country’s Covax-allocated doses.

Source: Business Insider

Little Traitors: Infection-Enhancing Antibodies in Severe COVID

Osaka University researchers have discovered that infection with SARS-CoV-2 results in not only the production of neutralising antibodies that prevent infection, but also of infection-enhancing antibodies.

By analysing antibodies from COVID patients, the researchers showed that both neutralising antibodies, which protect against infection, and infection-enhancing antibodies, which increase infectivity, are produced after infection with SARS-CoV-2.

Virus-specific antibodies are generally considered antiviral, playing an important role in the control of virus infections. In some cases, however, the presence of specific antibodies can benefit the virus. This activity is known as antibody-dependent enhancement of virus infection: a phenomenon in which virus-specific antibodies enhance the entry of the virus, and in some cases its replication, into monocytes/macrophages and granulocytic cells through interaction with Fc and/or complement receptors.

In COVID infections, antibodies that target the receptor-binding domain (RBD) of the SARS-CoV-2 spike protein play an important function as neutralising antibodies, suppressing SARS-CoV-2 infection by preventing the virus from binding to its human receptor, ACE2. However, the function of antibodies against other sites of the spike protein was not known.

“We found that when infection-enhancing antibodies bind to a specific site on the spike protein of SARS-CoV-2, the antibodies directly cause a conformational change in the spike protein, resulting in the increased infectivity of SARS-CoV-2. Neutralising antibodies recognise the RBD, whereas infection-enhancing antibodies recognise specific sites of the N-terminal domain (NTD),” explained lead researcher Professor Hisashi Arase. “Furthermore, the production of infection-enhancing antibodies attenuated the ability of neutralising antibodies to prevent infection.”

The study found that patients with severe COVID produced more infection-enhancing antibodies. Some non-infected individuals may also carry small amounts of infection-enhancing antibodies.

Though infection-enhancing antibodies may be involved in the development of severe disease, further research is necessary to determine whether they are in fact involved in the worsening of infection in the body.

A possible benefit is that analysing the titre of infection-enhancing antibodies could identify those prone to severe COVID. The findings are also important for the development of vaccines that do not induce the production of infection-enhancing antibodies.

“It is important to analyse not only neutralising antibodies but also infection-enhancing antibodies. In the future, it may be necessary to develop vaccines that do not induce the production of infection-enhancing antibodies, because infection-enhancing antibodies may be more effective against mutant strains in which neutralising antibodies are not sufficiently effective,” said Professor Arase.

Source: Osaka University

Journal information: Yafei Liu et al, An infectivity-enhancing site on the SARS-CoV-2 spike protein targeted by antibodies, Cell (2021). DOI: 10.1016/j.cell.2021.05.032

Geology Helps Medicine to Understand Kidney Stones

Geologists with the tools of their trade. Image by photochur from Pixabay

Geologists study stones to help find minerals, predict earthquakes and more; now their expertise has been tapped to understand kidney stones: how they form, why some people are more susceptible to them, and whether they can be prevented.

In a new paper published in the journal Nature Reviews Urology, researchers described the geological nature of kidney stones, outlined the arc of their formation, introduced a new classification scheme and suggested possible clinical interventions.

“The process of kidney stone formation is part of the natural process of the stone formation seen throughout nature,” Illinois geology professor Bruce Fouke said. “We are bringing together geology, biology and medicine to map the entire process of kidney stone formation, step by step. With this road map in hand, more effective and targeted clinical interventions and therapies can now be developed.”

Kidney stones affect one in 10 adults in their lifetime and send half a million people in the United States to emergency rooms annually, according to the National Kidney Foundation. Yet little is understood about the geology behind how kidney stones form, Fouke said.

The team’s previous research found that kidney stones form in the same way as stones in nature do: rather than crystallising all at once, they go through cycles of partial dissolution and reformation. Doctors had previously believed that stones form suddenly and intact.

The research team described in detail the multiple phases kidney stones go through in forming, dissolving and re-forming, using high-resolution imaging technologies. Their findings defy the typical classification schemes doctors use, which are based on bulk analyses of the type of mineral and the presumed location of formation in the kidney. Instead, the researchers drew up a new classification scheme based on the phase of formation the stone is in, and the chemical processes it is undergoing.

“If we can identify these phase transformations, what makes one step go to another and how it progresses, then perhaps we can intervene in that progression and break the chain of chemical reactions happening inside the kidney tissues before a stone becomes problematic,” said lead author Mayandi Sivaguru, assistant director of core facilities at the Carl R Woese Institute for Genomic Biology at Illinois.

One particularly revelatory finding was in the very beginnings of kidney stone formation. The stones start off as microspherules, tiny droplets of mineral, which merge to form larger crystals throughout kidney tissues. They are normally flushed out, but when they merge together and form larger stones that continue to grow, they can become excruciatingly painful and even deadly in some cases, Fouke said.

“Stone formation is part of a natural, healthy process within kidneys where these tiny mineral deposits are shuttled away and excreted from the body,” Fouke explained. “But then there is a tipping point when those same mineral deposits start to grow together too rapidly and are physically unable to leave the kidney.”

An example of agate, which shows similar formation characteristics to kidney stones. Image source: Leon Macapagal on Unsplash

As the stone goes through the formation process, more microspherules merge, lose their rounded shape and transform into much larger, perfectly geometric crystals. Stones go through multiple cycles of partially dissolving—shedding up to 50% of their volume—and then growing again, creating a signature pattern of layered crystals much like those of agates, coral skeletons and hot-spring deposits seen around the world.

“Looking at a cross-section of a kidney stone, you would never guess that each of the layers was originally a bunch of little balls that lined up and coalesced. These are revolutionary new ways for us to understand how these minerals grow within the kidney and provide specific targets for stone growth prevention,” Fouke said.

The researchers listed a number of possible clinical interventions and treatment targets derived from this new understanding of kidney stone formation. They hope these options, from drug targets to changes in diet or supplements, can be tested for their ability to disrupt the cascade of kidney stone formation, Sivaguru said.

To aid in this testing, Fouke’s group developed the GeoBioCell, a microfluidic cartridge that mimics the intricate internal structures of the kidney. The team hopes the device can contribute to research as well as clinical diagnostic testing and the evaluation of potential therapies, particularly for the more than 70% of kidney stone patients with recurring stones.

“Ultimately, our vision is that every operating room would have a small geology lab attached. In that lab, you could do a very rapid diagnostic on a stone or stone fragment in a matter of minutes, and have informed and individualized treatment targets,” Fouke said.

Source: University of Illinois

Journal information: Mayandi Sivaguru et al, Human kidney stones: a natural record of universal biomineralization, Nature Reviews Urology (2021). DOI: 10.1038/s41585-021-00469-x

Nasal Spray Maximises Parkinson’s Drug Delivery

Image source: Pixabay

Scientists at the University of York have been developing a nasal spray treatment for patients with Parkinson’s disease, which helps to maximise delivery of the drug in high enough dosages to be effective.

To do so, the researchers have developed an innovative new gel that can adhere to tissue inside the nose alongside the drug levodopa, helping deliver treatment directly to the brain.

Levodopa is a dopamine precursor which is converted to dopamine in the brain, where it makes up for the deficit of dopamine-producing cells in Parkinson’s patients. Generally administered when other anti-Parkinson’s medication is no longer useful, levodopa helps treat the symptoms of the disease, particularly bradykinesia, which is the impairment of voluntary motor control and slow movements or freezing. However, in the long term levodopa becomes less effective, requiring increased doses for the same effect.

Higher dosages needed
Professor David Smith, from the University of York’s Department of Chemistry, explained: “The current drug used for Parkinson’s Disease is effective to a point, but after a long period of use the body starts to break down the drug before it gets to the brain where it is most needed.

“This means increased dosage is necessary, and in later stages, sometimes, instead of tablets, the drug has to be injected.  Investigations into nasal sprays have long been of interest as a more effective delivery because of its direct route to the brain via the nerves that service the nose, but the challenge here is to find a way of making it adhere to the nasal tissue long enough to release a good dosage of the drug.”

The researchers created a levodopa-loaded gel that could flow into the nose in a liquid form and then rapidly change to a thin layer of gel inside the nose.  The method was tested in animal models by a team at King’s College London, where levodopa was successfully released from the gel into the blood and directly to the brain.

Improved uptake

Professor Smith said: “The results indicated that the gel gave the drug better adhesion inside the nose, which allowed for better levels of uptake into both the blood and brain.”

The team is now working on incorporating these materials in nasal spray devices, to progress clinical trials in humans.  The treatment of neurodegenerative diseases such as Alzheimer’s may also benefit from this approach.

Khuloud Al-Jamal, Professor of Drug Delivery and Nanomedicine from King’s College London, said: “Not only did the gel perform better than a simple solution, but the brain uptake was better than that achieved using intravenous injection of the drug. This suggests that nasal delivery of Parkinson’s drugs using this type of gel may have clinical relevance.” 

Source: University of York

Journal information: Wang, J. T-W., et al. (2021) Enhanced Delivery of Neuroactive Drugs via Nasal Delivery with a Self-Healing Supramolecular Gel. Advanced Science. doi.org/10.1002/advs.202101058.

Risk of COVID Infection Tripled in Healthcare Workers

Photo by Alex Mecl on Unsplash

A study of healthcare workers has shown their likelihood of being infected with COVID during the pandemic was three times higher compared to the general population, with about one in five of those infected workers being asymptomatic and unaware they had COVID.

The study also shows that it was not only frontline staff who faced the higher risk, suggesting that there was transmission between staff and within the wider community. The results are published in ERJ Open Research.

However, healthcare workers who had been infected were very unlikely to contract COVID a second time in the following six months.

The research was led by Professor James Chalmers, a consultant respiratory physician from the University of Dundee.

“We have always believed that front line health workers face a high risk of contracting COVID and that’s why we’ve tried to ensure they have the PPE needed to protect themselves,” said Prof Chalmers. “But many questions remain about the level of this risk and what other measures we can take to protect staff and reduce transmission of the disease.”

The study recruited 2063 staff working in a wide variety of healthcare roles in the East of Scotland. Between May and September 2020, the participants had blood tests for COVID antibodies, a very accurate indication of prior COVID infection. The researchers also recorded whether any participants developed an infection in subsequent months.

The healthcare workers’ results were compared with those from a randomly selected control group of blood samples taken by local GPs during the same period.

These blood tests showed that 300 (14.5%) of the healthcare workers had been infected, a rate more than triple the proportion of people infected in the local population. The highest rates of infections among the workers were found in dentistry (26%), health care assistants (23.3%) and hospital porters (22.2%). The rate among admin staff was the same as that of doctors (21.1%).

Rates among people working in areas of the hospital where COVID patients were being treated were somewhat higher than those working in non-COVID areas (17.4% vs 13.5%). However, the majority of infections were in staff who were not working directly with COVID patients, suggesting there was transmission between staff or infections acquired in the community.

Out of the 300 healthcare workers testing positive, 56 (18.7%) did not think that they had ever caught COVID and were totally asymptomatic. This is an important finding, according to the researchers, since people without symptoms are likely to go to work, potentially infecting others.

In the months following their blood tests, 39 workers developed a symptomatic COVID infection, but only one of these was a worker who had previously tested positive. This equates to an 85% risk reduction, similar to the level of protection provided by COVID vaccines.
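As a back-of-the-envelope check, the 85% figure can be reproduced from the numbers reported above, assuming the other 38 symptomatic infections all occurred in previously seronegative workers (a rough sketch of the relative risk reduction, not the study’s actual analysis):

```python
# Rough check of the reported 85% risk reduction, using figures from the
# article: 2063 workers, 300 seropositive, 39 later symptomatic infections,
# of which only 1 occurred in a previously seropositive worker.
total_workers = 2063
seropositive = 300
seronegative = total_workers - seropositive  # 1763

infections_seropositive = 1
infections_seronegative = 39 - 1  # assumed: the rest were seronegative

# Attack rate (subsequent symptomatic infections) in each group
rate_positive = infections_seropositive / seropositive  # ~0.3%
rate_negative = infections_seronegative / seronegative  # ~2.2%

# Relative risk reduction conferred by prior infection
rrr = 1 - rate_positive / rate_negative
print(f"Relative risk reduction: {rrr:.0%}")  # prints "Relative risk reduction: 85%"
```

This simple two-group comparison matches the protection estimate quoted by the researchers; the study itself would also have accounted for follow-up time and other factors.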

Prof Chalmers said: “A lot of attention during the pandemic has been around PPE for doctors and nurses but we found that dentists, healthcare assistants and porters were the staff most likely to test positive.

“We continued to monitor staff for up to seven months and found that having a positive antibody test gave 85% protection against a future infection. This is really good news for people who have already had COVID-19, as it means the chances of a second infection are very low.”

The team hopes to continue the research to see how long immunity persists and how vaccination affects infections among healthcare workers.

Professor Anita Simonds, President of the European Respiratory Society and Consultant in Respiratory and Sleep Medicine at Royal Brompton Hospital, UK, who was not involved in the research, offered comments.

She said: “This research shows the high levels of COVID infection among all healthcare workers, with the highest evidence of infection in dentists, healthcare assistants and porters. Staff working in critical care, who are likely to have been protected by using personal protective equipment at all times, were not disproportionately affected.

“It should be noted that among administrative staff, 21.1% were found to have been infected with COVID, indicating that all those working directly with patients, and those working in other hospital roles are at risk, and vaccination and risk assessment for appropriate levels of PPE in all these frontline groups are crucial.”

Source: European Respiratory Society

White Matter Changes Uncovered in Repeated Brain Injury

Photo by MART PRODUCTION from Pexels

A new study has uncovered insights into white matter changes that occur during chronic traumatic encephalopathy (CTE), a progressive brain disease associated with repetitive head impacts. This discovery may help in identifying new targets for therapies.

CTE has been diagnosed after death in the brains of American football players and other contact-sport athletes, as well as members of the armed services. The disease has been identified as causing impulsivity, explosivity, depression, memory impairment and executive dysfunction.

Though much prior research focused on repetitive head trauma leading to the development of abnormal tau, this study focused on white matter changes, particularly in the oligodendrocytes, the cells that form the myelin sheaths around nerve fibres. The results have been published online in the journal Acta Neuropathologica.

“Research to date has focused on the deposition of abnormal tau in the gray matter in CTE. This study shows that the white matter undergoes important alterations as well. There is loss of oligodendrocytes and alteration of oligodendrocyte subtypes in CTE that might provide new targets for prevention and therapies,” explained corresponding author Ann McKee, MD, chief of neuropathology at VA Boston Healthcare and director of the BU CTE Center.

Dr McKee and her team isolated cellular nuclei from the postmortem dorsolateral frontal white matter in eight cases of CTE and eight matched controls. They conducted single-nucleus RNA-seq (snRNA-seq) with these nuclei, revealing transcriptomic, cell-type-specific differences between the CTE and control cases. In doing so, they discovered that the white matter in CTE had fewer oligodendrocytes and the oligodendroglial subtypes were altered compared to control tissue.

Since previous studies have largely focused on the CTE-specific tau lesion in the cortex of the brain, these findings are particularly informative, as they explain a number of features of the disease. “In comparison, the cellular death process occurring in white matter oligodendrocytes in CTE appears to be separate from the accumulation of hyperphosphorylated tau,” she said. “We know that the behavioural and mood changes that occur in CTE are not explained by tau deposition. This study suggests that white matter alterations are also important features of the disease, and future studies will determine whether these white matter changes play a role in the production of behavioural or mood symptoms in CTE, such as explosivity, violence, impulsivity, and depression.”

Source: Boston University School of Medicine

Journal information: Chancellor, K. B., et al. (2021) Altered oligodendroglia and astroglia in chronic traumatic encephalopathy. Acta Neuropathologica. doi.org/10.1007/s00401-021-02322-2.

Telomere Length May Be Set Early in Life

Image source: Pixabay

A new study has found that telomeres, the protective nucleotide caps at the ends of chromosomes which shorten with every cell division, undergo great changes in length during the first years of life.

Telomere length plays a role in a number of age-related diseases and is also an important marker of biological age. When telomeres are completely shortened, cells become senescent and can no longer divide to repair damage.

This study, one of the first to examine telomere length (TL) in childhood, found that the initial setting of TL during prenatal development and in the first years of life may determine one’s TL throughout childhood and potentially even into adulthood or older age. The study also found that TL decreases most rapidly from birth to age 3, then remains relatively stable into the pre-puberty period, although on some occasions it was seen to lengthen.

Researchers at the Columbia Center for Children’s Environmental Health at Columbia University Mailman School of Public Health led the study, which followed 224 children from birth to age 9. Their findings were published in the journal Psychoneuroendocrinology.

The researchers discovered that a mother’s TL is predictive of newborn TL and tracks with her child’s TL through pre-adolescence. The reasons why some children have telomeres that shorten faster are unknown, though one explanation may be that telomeres are susceptible to environmental pollutants. It is also unknown why some children had telomeres that lengthened across the study period, a phenomenon seen in other studies.

“Given the importance of telomere length in cellular health and aging, it is critical to understand the dynamics of telomeres in childhood,” said senior author Julie Herbstman, PhD, director of CCCEH and associate professor of environmental health science at Columbia Mailman School. “The rapid rate of telomere attrition between birth and age 3 years may render telomeres particularly susceptible to environmental influences during this developmental window, potentially influencing life-long health and longevity.”

Researchers used polymerase chain reaction to measure TL in white blood cells isolated from cord blood and blood collected at ages 3, 5, 7, and 9, from 224 children. In a small group of mothers they also measured maternal TL at delivery.
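The article does not detail the study’s exact assay, but relative telomere length measured by quantitative PCR is conventionally reported as a T/S ratio: the telomere-repeat signal (T) compared with a single-copy reference gene (S), normalised against a reference sample using the 2^-ΔΔCt method. As a hypothetical illustration only (the function name and Ct values are invented, not from the study):

```python
def ts_ratio(ct_telomere, ct_single_copy, ref_ct_telomere, ref_ct_single_copy):
    """Relative telomere length (T/S ratio) via the 2^-ddCt method.

    Ct values are qPCR cycle thresholds; lower Ct means more template.
    """
    d_ct_sample = ct_telomere - ct_single_copy   # sample delta-Ct
    d_ct_ref = ref_ct_telomere - ref_ct_single_copy  # reference delta-Ct
    return 2 ** -(d_ct_sample - d_ct_ref)

# The sample's telomere assay crosses threshold one cycle earlier than the
# reference's (relative to the single-copy gene), i.e. ~twice the telomere
# signal, giving a T/S ratio of 2.0:
print(ts_ratio(14.0, 20.0, 15.0, 20.0))  # → 2.0
```

A T/S ratio above 1.0 indicates longer telomeres than the reference sample, below 1.0 shorter; repeated measurements across ages 0–9 would yield the attrition trajectories the study describes.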

The researchers said that further research is needed to understand the biological mechanisms behind the variance of TL shortening or lengthening rates in the first years of life, as well as modifiable environmental factors contributing to the shortening speed.

Source: Columbia University Mailman School of Public Health