Year: 2022

Probing Why Vaccine Responses Vary among Individuals

Photo by Gustavo Fring at Pexels

Many factors dictate whether a vaccine provokes an immune response, including specific biomarkers within a person’s immune system, but until now there has been no evidence showing whether these factors were universal across a wide range of vaccines.

New findings from a meta-analysis published in Nature Immunology examine the biological mechanisms responsible for why some people’s immune systems respond differently to vaccinations, which could have global implications for the development and administration of vaccines.

As part of a series of studies for The Human Immunology Project Consortium (HIPC), a network of national research institutions studying the range of responses to different infections and vaccinations, Emory researchers analysed the molecular characteristics of 820 healthy young adults who were immunised with 13 different vaccines to identify specific biomarkers that generate antibody response to vaccines.

The participants were separated into three endotypes, or groups with a shared gene expression profile, based on the level of inflammatory response prior to vaccination – a high inflammatory group, a low inflammatory group, and a mid-inflammatory group. After studying the immunological changes that occurred in participants following vaccination, researchers found that the group with the highest levels of inflammation prior to vaccination had the strongest antibody response.

“We were surprised because inflammation is usually depicted as something that is bad,” says Slim Fourati, PhD, bioinformatic research associate at Emory University and first author on the paper. “These data indicate that some types of inflammation can actually foster a stronger response from a vaccine.”

Fourati, Dr Rafick-Pierre Sekaly, professor and senior author of the paper, and the HIPC team identified specific biomarkers among this group and cellular features that characterised the pre-vaccination inflammatory signature, information that can be used to predict how well an individual will respond to a vaccine.

“With the knowledge we now have about what characteristics of the immune system enable a more robust response, vaccines can be tailored to induce this response and maximize their effectiveness,” says Fourati. “But we still have more questions to answer.”

More research is needed to determine the cause of this inflammation in otherwise healthy adults. Additionally, Fourati suggests future studies should look at how these biomarkers facilitate vaccine protection in older age groups and among populations who are immunocompromised.

These findings can serve to improve vaccine response across all individuals. Better understanding of how various pre-vaccine immune states impact antibody responses opens the possibility of altering these states in more vulnerable individuals. For example, scientists may give patients predicted to have a weaker immune response an adjuvant with the vaccine to trigger the inflammatory genes associated with greater protection.

This work will help enable improved, more efficient clinical trials for the development of new vaccines.

Source: Emory Health Sciences

Study Reveals a Possible Secret to Viral Infection Resistance in Humans

Colourised scanning electron microscope image of a natural killer cell. Credit: National Institutes of Health

Studying resistance to viral infections in humans is difficult because it’s virtually impossible to disentangle resisting being infected from simply not being exposed. By studying women who were accidentally exposed to hepatitis C (HCV) over 40 years ago, scientists in Ireland have uncovered a secret that may explain why some people are able to resist viral infections.

The extraordinary work, published in Cell Reports Medicine, has wide-ranging implications from improving our fundamental understanding of viral resistance to the potential design of therapies to treat infected people.

From 1977–79, several thousand women in Ireland were exposed to the hepatitis C virus through contaminated anti-D, a medication made using plasma from donated blood and given to Rhesus negative women who are pregnant with a Rhesus positive foetus. The medication prevents the development of antibodies that could be dangerous in subsequent pregnancies. Some of the anti-D used during the 1977–79 period was contaminated with hepatitis C.

Infected women fell into three groups: those who were chronically infected; those who cleared the infection with an antibody response; and those who appeared protected against infection yet produced no antibodies against hepatitis C.

“We hypothesised that women who seemed to resist HCV infection must have an enhanced innate immune response, which is the ancient part of the immune system that acts as a first line of defence,” said senior author Cliona O’Farrelly, Professor of Comparative Immunology in Trinity’s School of Biochemistry and Immunology.

“To test this we needed to make contact with women exposed to the virus over forty years ago and ask them to help us by allowing us to study their immune systems to hunt for scientific clues that would explain their differing responses.

“After a nationwide campaign over 100 women came forward and we have gained some unique and important insights. That so many women – many of whom have lived with medical complications for a long time – were willing to help is testament to how much people want to engage with science and help pursue research with the potential to make genuine, positive impacts on society. We are deeply grateful to them.”

The scientists ultimately recruited almost 40 women from the resistant group, alongside 90 women who were previously infected.

In collaboration with the Institut Pasteur in Paris, they then invited almost 20 women from each group to donate a blood sample, which the researchers stimulated with molecules that mimic viral infection and activate the innate immune system.

“By comparing the response of the resistant women to those who became infected, we found that resistant donors had an enhanced type I interferon response after stimulation,” said first author Jamie Sugrue, PhD Candidate. Type I interferons are a key family of antiviral immune mediators that play an important role in defence against viruses, including hepatitis C and SARS-CoV-2, the virus that causes COVID.

“We think that the increased type I interferon production by our resistant donors, seen now almost 40 years after the original exposure to hepatitis C, is what protected them against infection.

“These findings are important as resistance to infection is very much an overlooked outcome following viral outbreak, primarily because identifying resistant individuals is very difficult – since they do not become sick after viral exposure, they wouldn’t necessarily know that they were exposed. That’s why cohorts like this, though tragic in nature, are so valuable – they provide a unique opportunity to study the response to viral infections in an otherwise healthy population.”

The lab’s efforts are now focused on leveraging these biological findings to unpick the genetics of viral resistance in the HCV donors. Their work on HCV resistance has already helped ignite international interest in resistance to other viral infections such as SARS-CoV-2.

The O’Farrelly lab has expanded its search for virus-resistant individuals by joining the COVID Human Genetic Effort and by recruiting members of the public who have been heavily exposed to SARS-CoV-2 but never developed an infection.

Source: Trinity College Dublin

A Shot of Vitamin C Gives Dendritic Cells a Potent Cancer-fighting Boost

Vitamin C pills and orange
Photo by Diana Polekhina on Unsplash

New research published in Nucleic Acids Research has shown that vitamin C improves the immunogenic properties of dendritic cells, activating genes involved in the immune response. This discovery could help the development of potent new dendritic cell-based immunotherapies.

Since the advent of anticancer cell therapies, many types of immune cells have been used. The best-known of these cell therapies use lymphocytes, as in the highly successful CAR-T therapies. Recently, researchers have turned to dendritic cells, known as the ‘master regulators of the immune system’, for their ability to take up and present antigens to T-lymphocytes and induce a potent, antigen-specific immune activation. This approach entails loading dendritic cells with specific antigens to create immune memory, producing dendritic cell (DC) vaccines.

To study dendritic cells in the lab, researchers differentiate them from monocytes using a particular set of molecular signals. This differentiation is accomplished through a complex programme of gene activation in the nucleus, driven largely by the chromatin remodelling machinery and spearheaded by the TET family of demethylases, proteins that act on the DNA’s epigenetic marks.

Vitamin C was already known to interact with several TET proteins and enhance their activity, but the specific mechanism was still poorly understood in human cells. In this study, a team led by Dr Esteban Ballestar hypothesised that treating monocytes with vitamin C in vitro while they differentiate into dendritic cells would yield more mature and active cells.

The results show that vitamin C treatment triggers extensive demethylation at NF-κB/p65 binding sites compared with non-treated cells, promoting the activity of genes involved in antigen presentation and immune response activation. Vitamin C was also found to increase the communication of the resulting dendritic cells with other components of the immune system and to stimulate the proliferation of antigen-specific T cells.

The researchers proved that vitamin C-stimulated dendritic cells loaded with antigens specific for the SARS-CoV-2 virus were able to activate T cells in vitro more efficiently than non-treated cells.

Overall, these new findings support the hypothesis that treating monocyte-derived dendritic cells with vitamin C may help generate more effective DC-vaccines. After consolidating these results in preclinical models and, hopefully, in clinical trials, a new generation of cell therapies based on dendritic cells may be used in the clinic to fight cancer more efficiently.

Source: Josep Carreras Leukaemia Research Institute

World First Trial of Lab-grown Red Blood Cells for Transfusion

Photo by Charlie-Helen Robinson on Pexels

In a world first, researchers have launched a clinical trial of lab-grown red blood cells for transfusion into another person. These manufactured blood cells were grown from stem cells from donors, for transfusion into volunteers in the RESTORE randomised controlled clinical trial.

If our trial is successful, it will mean that patients who currently require regular long-term blood transfusions will need fewer transfusions in future, helping transform their care

Professor Cedric Ghevaert, chief investigator

If the technique is proven safe and effective, manufactured blood cells could in time revolutionise treatments for people with blood disorders such as sickle cell and rare blood types. It can be difficult to find enough well-matched donated blood for some people with these disorders.

To produce the lab-grown blood cells, stem cells are first magnetically extracted from a normal 470ml blood donation. These stem cells are then coaxed into becoming red blood cells. Over the three-week process, an initial pool of about half a million stem cells generates 50 billion red blood cells.
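As a rough back-of-envelope check (arithmetic on the figures quoted above, not additional data from the trial), the quoted numbers imply roughly a 100,000-fold expansion, or about 17 cell doublings over the three weeks:

```python
import math

# Figures quoted in the article (approximate)
initial_cells = 5e5   # ~half a million stem cells
final_cells = 50e9    # ~50 billion red blood cells
process_days = 21     # ~three-week process

expansion = final_cells / initial_cells       # 100,000-fold expansion
doublings = math.log2(expansion)              # ~16.6 doublings
days_per_doubling = process_days / doublings  # ~1.3 days per doubling

print(f"{expansion:,.0f}-fold expansion")
print(f"{doublings:.1f} doublings, ~{days_per_doubling:.1f} days each")
```

This treats the process as pure doubling for illustration; in reality the cells are also differentiating from stem cells into red blood cells along the way.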

Chief Investigator Professor Cedric Ghevaert, Professor in Transfusion Medicine and Consultant Haematologist at the University of Cambridge and NHS Blood and Transplant, said: “We hope our lab grown red blood cells will last longer than those that come from blood donors. If our trial, the first such in the world, is successful, it will mean that patients who currently require regular long-term blood transfusions will need fewer transfusions in future, helping transform their care.”

Professor Ashley Toye, Professor of Cell Biology at the University of Bristol and Director of the NIHR Blood and Transplant Unit in red cell products, said: “This challenging and exciting trial is a huge stepping stone for manufacturing blood from stem cells. This is the first-time lab grown blood from an allogeneic donor has been transfused and we are excited to see how well the cells perform at the end of the clinical trial.”

The trial is studying the lifespan of the lab grown cells compared with infusions of standard red blood cells from the same donor. The lab-grown blood cells are all fresh, so the trial team expect them to perform better than a similar transfusion of standard donated red cells, which contains cells of varying ages.

Additionally, if manufactured cells last longer in the body, patients who regularly need blood may not need transfusions as often. That would reduce iron overload from frequent blood transfusions, which can lead to serious complications.

The trial is the first step towards making lab grown red blood cells available as a future clinical product. For the foreseeable future, manufactured cells could only be used for a very small number of patients with very complex transfusion needs. NHSBT continues to rely on the generosity of donors.

Co-Chief Investigator Dr Rebecca Cardigan, Head of Component Development NHS Blood and Transplant and Affiliated Lecturer at the University of Cambridge, said: “It’s really fantastic that we are now able to grow enough red cells to medical grade to allow this trial to commence. We are really looking forward to seeing the results and whether they perform better than standard red cells.”

Thus far, two people have been transfused with the lab grown red cells. They are well and healthy, and were closely monitored, with no untoward side effects reported. The amount of lab grown cells being infused varies, but is around 5–10ml.

Donors were recruited from NHSBT’s blood donor base. They donated blood to the trial and stem cells were separated out from their blood. These stem cells were then grown to produce red blood cells in a laboratory at NHS Blood and Transplant’s Advanced Therapies Unit in Bristol. The recipients of the blood were recruited from healthy members of the NIHR BioResource.

A minimum of 10 participants will receive two mini transfusions at least four months apart, one of standard donated red cells and one of lab grown red cells, to see if the young lab-made red blood cells last longer than cells made in the body.

Further trials are needed before clinical use, but this research marks a significant step in using lab grown red blood cells to improve treatment for patients with rare blood types or people with complex transfusion needs.

Source: University of Cambridge

Stopping Prostate Tumours from Evading Androgen Suppression Therapy

Credit: Darryl Leja / National Human Genome Research Institute / National Institutes of Health

Researchers have identified an investigational therapeutic approach that could be effective against treatment-resistant prostate cancer. Results of the Phase II clinical trial performed by Cedars-Sinai Cancer investigators and published in Molecular Therapy have led to a larger, multicentre trial that will soon be underway.

Cancer of the prostate is the second-leading cause of cancer-related death in men. Many prostate tumours are not aggressive and may require no or minimal treatment. Aggressive tumours are initially treated with surgery or radiation therapy.

In roughly a third of patients, the cancer comes back after initial treatment, said Neil Bhowmick, PhD, research scientist at Cedars-Sinai Cancer, professor of Medicine and Biomedical Sciences and senior author of the study. Those patients are usually treated with medications that suppress the actions of testosterone and other androgens, which promote prostate tumour growth.

“Patients do really well until the tumour figures a way around the androgen-suppressing therapy,” Bhowmick said. “One way that it can do this is to cause cells to make only part of the protein that the drug binds to, rendering the drug useless. The partial proteins are called splice variants.”

Through research with human cells and laboratory mice, study first author Bethany Smith, PhD, determined that the cancer cells were using a protein called CD105 to signal the surrounding supportive cells to make these splice variant proteins. Investigators then conducted a trial in human patients to test a drug that they hoped would keep those partial proteins from forming by inhibiting CD105.

In the trial, 9 patients whose tumours were resistant to androgen-blocking therapy continued that therapy but were also given a CD105 inhibitor called carotuximab. Forty percent of those patients experienced progression-free survival, based on radiographic imaging.

“Every single one of the patients in our trial was totally resistant to at least one androgen suppressor, and the normal course of action would be to simply try a different one or chemotherapy, which research has shown generally doesn’t stop tumour growth for more than about three months,” Bhowmick said. “Carotuximab prevented the cancer’s workaround and made the tumour sensitive to androgen-suppressing therapy.”

Importantly, Bhowmick said, carotuximab also appears to prevent androgen receptor splice variants in the supporting cells surrounding tumours, further sensitising the tumour to the androgen suppressor.

“We found that this therapy may be able to, especially in early cancers, resensitize select patients to androgen suppression. This could allow patients to avoid or delay more toxic interventions such as cytotoxic chemotherapy,” said Edwin Posadas, MD, associate professor of Medicine at Cedars-Sinai and a co-author of the study. “We also hope to find ways of predicting which patients are most likely to benefit from this approach by testing blood and tissue samples using next-generation technologies housed at Cedars-Sinai Cancer.”

Study co-author Sungyong You, PhD, director of the Urologic Oncology Bioinformatics Group, pinpointed three biomarkers that could help indicate which patients will respond to this investigational therapy, and the team will validate those markers in a new clinical trial. This will allow future studies to target patients most likely to be helped by this intervention, Bhowmick said.

Source: Cedars-Sinai Medical Center

Modern Ventilators Shown to Overstretch Lung Tissue

Source: Pixabay CC0

In pulmonary medicine, it has long been debated whether ventilators overstretch lung tissue, and new research published in the American Journal of Respiratory and Critical Care Medicine has now shown that they do.

The University of California, Riverside researchers showed that there were major differences between natural breathing and the forced breathing delivered by ventilators. These results are critical, particularly in the context of the COVID pandemic and the rush to build ventilators.

“Using novel techniques, we observed that ventilators can overextend certain regions of the lungs,” said Mona Eskandari, assistant professor of mechanical engineering, who led the research. These results may explain why lung health declines for patients the longer they spend on the machines, especially in the case of disease.

Eskandari’s bMECH lab pioneered a technique to study lungs as they are made to breathe. On a custom-built ventilator designed in their lab, the researchers imitated both natural and artificial breathing. Then, they observed isolated lungs involved in both types of breathing using multiple cameras collecting fast, high-resolution images, a method called digital image correlation.

“Our setup allows us to imitate both physiological and artificial breathing on the same lung with the switch of a button,” Eskandari said. “The unique combination of our ventilator with digital image correlation gives us unprecedented insights into the way specific regions of the lungs work in concert with the whole.”

Using their innovative method to interface these two systems, UCR researchers collected evidence demonstrating that natural breathing stretches certain parts of the lung as little as 25% while those same regions stretch to as much as 60% when on a ventilator.

Scholars traditionally model the lungs like balloons, or what they refer to as thin-walled pressure vessels, where pushing air in and pulling air out are understood to be mechanically equivalent.

To explain what they observed in this study, the researchers propose moving away from thin-walled pressure vessel models and towards thick-walled models. Unlike thin-walled pressure vessel theory, a thick-walled model accounts for the differing levels of stress in the airways resulting from ventilators pushing air in versus natural breathing, which pulls air in. This helps to explain how the airways are more engaged and air is more evenly distributed in the lung during physiological breathing.
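For orientation, these are the standard textbook pressure-vessel formulas behind the two idealisations (they are not equations taken from the paper, whose lung-specific model is more elaborate). A thin-walled sphere has a single uniform wall stress, whereas the thick-walled (Lamé) solution lets stress vary through the wall:

```latex
% Thin-walled sphere: radius r, wall thickness t, internal pressure p
\sigma_\theta = \frac{p\,r}{2t} \qquad \text{(uniform through the wall)}

% Thick-walled (Lam\'e) sphere: inner radius a, outer radius b,
% stresses evaluated at radial position \rho within the wall
\sigma_r(\rho) = \frac{p\,a^3}{b^3 - a^3}\left(1 - \frac{b^3}{\rho^3}\right),
\qquad
\sigma_\theta(\rho) = \frac{p\,a^3}{b^3 - a^3}\left(1 + \frac{b^3}{2\rho^3}\right)
```

At the inner wall ($\rho = a$) the radial stress equals $-p$ and the hoop stress peaks, whereas the thin-wall formula smears the stress out uniformly – which is why the thick-walled picture can distinguish air pushed in from inside (positive-pressure ventilation) from air pulled in by chest expansion.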

Iron lungs, the gigantic ventilators used during the late 1940s polio outbreak, acted more like a human chest cavity, expanding the lung as it naturally would. This creates a vacuum effect that pulls air into the lungs. Though this action is gentler for the lungs, these bulky systems prevented easy access to monitoring other organs in hospital care.

By contrast, modern ventilators are more portable and easier for caretakers to work with. However, they push air into the lungs that is not evenly distributed, overstretching some parts and causing a decline in lung health over time.

While it is unlikely that hospitals will return to the iron lung models, it is possible that modern machines can be altered to reduce injury.

“Now that we know about excessive strain when air is delivered to the lungs, the question for us becomes about how we can improve ventilation strategies by emulating natural breathing,” Eskandari said.

Source: University of California – Riverside

Antibiotics Reduce the Gastrointestinal Bleeding Risk of Long-term Aspirin

Bottle of pills
Source: Pixabay CC0

A major clinical trial found that the risk of gastrointestinal bleeding caused by long-term aspirin use can be reduced with a short course of antibiotics, potentially improving the safety of aspirin when used to prevent heart attacks, strokes and possibly some cancers.

The results of the HEAT (Helicobacter pylori Eradication Aspirin) trial, which was led by Professor Chris Hawkey from the University of Nottingham, are published in The Lancet.

Aspirin in low doses is a very useful preventative drug in people at high risk of strokes or heart attacks. However, on rare occasions, its blood thinning effect can provoke internal ulcer bleeding. These ulcers may be caused by Helicobacter pylori.

The STAR (Simple Trials for Academic Research) team from the University of Nottingham investigated whether a short course of antibiotics to remove these bacteria would reduce the risk of bleeding in aspirin users.

The HEAT trial, conducted in 1208 UK general practices, was a real-life study which used clinical data routinely stored in GP and hospital records, instead of bringing patients back for follow up trial visits.

The researchers recruited 30 166 participants who were taking aspirin. Those who tested positive for H. pylori were randomised to receive antibiotics or placebos (dummy tablets) and were followed for up to 7 years.

Over the first two and a half years, those who had antibiotic treatment were less likely to be hospitalised for ulcer bleeding than those taking placebo (6 versus 17). Protection occurred rapidly: in the placebo group, the first hospitalisation for ulcer bleeding occurred after 6 days, compared with 525 days in the antibiotic group.

Over a longer time period, protection appeared to wane. However, the overall rate of hospitalisation for ulcer bleeding was lower than expected, which is in line with other evidence that ulcer disease is on the decline. Risks for people already on aspirin are low; they are higher when people first start aspirin, at which point testing for and treating H. pylori is probably worthwhile.

Aspirin has many benefits in terms of reducing the risk of heart attacks and strokes in people at increased risk. There is also evidence that it is able to slow down certain cancers. The HEAT trial is the largest UK-based study of its kind, and we are pleased that the findings have shown that ulcer bleeding can be significantly reduced following a one-week course of antibiotics. The long-term implications of the results are encouraging in terms of safe prescribing.

Professor Chris Hawkey, University of Nottingham’s School of Medicine and Nottingham Digestive Diseases Centre

Source: University of Nottingham

Reawakening a Foetal Gene Promotes Diabetic Wound Healing

Photo by Diana Polekhina on Unsplash

In the journal Molecular Therapy, researchers report that it may be possible to heal wounds by using a healing protein that is active in foetuses, but largely inactive in adults and absent in diabetic adults.

“We already know from previous studies at other institutions that if a foetus is wounded, it can regenerate the tissue, or repair it to be like new,” said Chandan K. Sen, PhD, at Indiana University School of Medicine. “But after birth, such regenerative wound healing ability is lost. Healing in adults is relatively inefficient and is often associated with undesirable scar formation.”

In the study, the team focused on a protein called nonselenocysteine-containing phospholipid hydroperoxide glutathione peroxidase, or NPGPx. NPGPx is active in foetal tissue but becomes mostly inactive in the skin after birth.

“Nature essentially hides this foetal regenerative repair pathway in the adult body,” Sen said. “We spotted its absence, and then activated it to improve healing of diabetic wounds.”

Researchers used tissue ‘nanotransfection’ technology to deliver the NPGPx gene to the wound site. Diabetic wounds, which are complicated skin injuries in people with diabetes, are particularly difficult to treat and often lead to amputations or other complications because of how easily they can become infected.

“This is an exciting new approach to harness foetal repair mechanisms to close diabetic wounds in adults,” Sen said. “The study results show that while NPGPx is known to be abundant in foetal skin but not after birth, it can be reactivated in the skin after an injury. We look forward to continued study aiming to achieve a more complete regenerative repair by improving our understanding of how NPGPx functions.”

Source: Indiana University School of Medicine

Pregnancy Permanently Alters Skeletal Composition

pregnant woman holding her belly
Source: Anna Hecker on Unsplash

Reproduction permanently alters the skeletons of females in ways not previously known, a team of anthropologists has concluded from research findings published in PLOS ONE. This discovery, based on an analysis of primates, sheds new light on how giving birth can permanently change the body.

“Our findings provide additional evidence of the profound impact that reproduction has on the female organism, further demonstrating that the skeleton is not a static organ, but a dynamic one that changes with life events,” explains Paola Cerrito, who led the research as a doctoral student in NYU’s Department of Anthropology and College of Dentistry.

Specifically, the researchers found that calcium, magnesium, and phosphorus concentrations are lower in females who have experienced reproduction. These changes are linked to giving birth itself and to lactation.

However, they caution that while other clinical studies show calcium and phosphorus are necessary for optimal bone strength, the new findings do not address overall health implications for either primates or humans. Rather, they say, the work illuminates the dynamic nature of our bones.

“A bone is not a static and dead portion of the skeleton,” notes NYU anthropologist Shara Bailey, one of the study’s authors. “It continuously adjusts and responds to physiological processes.”

It has long been established that menopause can affect females’ bones. Less clear is how earlier life-cycle events, such as reproduction, influence skeletal composition. To address this, the researchers studied primary lamellar bone, the main type of bone in a mature skeleton. It is an ideal part of the body to examine because it changes over time and leaves biological markers of those changes, allowing scientists to monitor alterations over the life span.

The researchers examined the growth rate of lamellar bone in the femora, or thigh bones, of both female and male primates who had lived at the Sabana Seca Field Station in Puerto Rico and died of natural causes. Veterinarians at the field station had monitored and recorded information on these primates’ health and reproductive history, allowing the researchers to match bone-composition changes to life events with notable precision.

Cerrito and her colleagues used electron microscopy and energy-dispersive X-ray analysis to calculate changes in concentrations of calcium, phosphorus, oxygen, magnesium, and sodium in the primates’ bones.

Their results showed different concentrations of some of these elements in females who gave birth compared with males, as well as with females who did not give birth. Specifically, in females who gave birth, calcium and phosphorus were lower in bone formed during reproductive events. Moreover, there was a significant decline in magnesium concentration during these primates’ breastfeeding of infants.

“Our research shows that even before the cessation of fertility the skeleton responds dynamically to changes in reproductive status,” says Cerrito, now a research fellow at ETH Zurich. “Moreover, these findings reaffirm the significant impact giving birth has on a female organism – quite simply, evidence of reproduction is ‘written in the bones’ for life.”

Source: New York University

Oxygen Deficiency in Newborns may Increase Later Cardiovascular Risk

Photo by Christian Bowen on Unsplash

A population-based observational study has shown that babies suffering oxygen-deficiency complications at birth are almost twice as likely to develop cardiovascular disease in childhood or early adulthood, though such conditions are rare in youth. The findings are published in the journal The Lancet Regional Health – Europe.

According to the Karolinska Institutet researchers, the study could be the first of its kind to examine how complications related to asphyxiation at birth, a condition that affects four million babies annually, influence the risk of cardiovascular disease later in life. Previous research has mostly concentrated on the association between asphyxia in the neonatal period and brain development.

Despite the elevated relative risk, the absolute number of babies who develop cardiovascular disease after asphyxiation at birth is very low. After the 30-year follow-up period, only 0.3% of those with asphyxia-related complications had a cardiovascular diagnosis, compared with 0.15% of those without complications.
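The relative-versus-absolute distinction in those figures can be made concrete with simple arithmetic (on the rounded percentages quoted above, not additional study data):

```python
# Proportions with a cardiovascular diagnosis after 30 years of follow-up
with_complications = 0.30 / 100     # 0.3% of those with asphyxia-related complications
without_complications = 0.15 / 100  # 0.15% of those without

# Relative risk: 2.0 from the rounded figures ("almost twice as likely")
relative_risk = with_complications / without_complications

# Absolute difference: 0.15 percentage points, i.e. ~15 extra cases per 10,000 births
absolute_difference = with_complications - without_complications
extra_cases_per_10000 = absolute_difference * 10_000

print(f"relative risk ~{relative_risk:.1f}")
print(f"~{extra_cases_per_10000:.0f} extra cases per 10,000 births")
```

A doubled relative risk can thus coexist with a very small absolute risk, which is exactly the point the researchers emphasise.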

Since the study was observational, the researchers are unable to establish any causality or propose any underlying mechanisms.

Largest risk increase for stroke and heart failure

The study followed over 2.8 million individuals born in Sweden between 1988 and 2018, of whom 31 419 suffered asphyxia-related complications at birth. A total of 4165 cases of cardiovascular disease were identified during the follow-up period. The increase in risk was particularly salient for stroke and heart failure, as well as for atrial fibrillation. The researchers took into account potential confounders such as birth weight and maternal lifestyle.

“Even if the absolute risk of cardiovascular disease is low at a young age, our study shows that asphyxia-related complications at birth are associated with a higher risk of cardiovascular disease later in life,” says the study’s corresponding author Neda Razaz, assistant professor at the Department of Medicine, Solna, Karolinska Institutet.

Source: Karolinska Institutet