Bacterial Subtype Linked to Growth in up to 50% of Human Colorectal Cancers

Human colon cancer cells. Credit: National Cancer Institute

Researchers at Fred Hutchinson Cancer Center have found that a specific subtype of a microbe commonly found in the mouth is able to travel to the gut and grow within colorectal cancer tumours. The microbe also drives cancer progression and is associated with poorer patient outcomes after cancer treatment.

The findings, published in Nature, could help improve therapeutic approaches and early screening methods for colorectal cancer, which is the second most common cause of cancer deaths in adults in the U.S. according to the American Cancer Society.

Examining colorectal cancer tumours removed from 200 patients, the Fred Hutch team measured levels of Fusobacterium nucleatum, a bacterium known to infect tumours. In about 50% of the cases, they found that only a specific subtype of the bacterium was elevated in the tumour tissue compared to healthy tissue.

The researchers also found this microbe in higher numbers within stool samples of colorectal cancer patients compared with stool samples from healthy people.

“We’ve consistently seen that patients with colorectal tumours containing Fusobacterium nucleatum have poor survival and poorer prognosis compared with patients without the microbe,” explained Susan Bullman, PhD, Fred Hutch cancer microbiome researcher and co-corresponding study author. “Now we’re finding that a specific subtype of this microbe is responsible for tumour growth. It suggests therapeutics and screening that target this subgroup within the microbiota would help people who are at a higher risk for more aggressive colorectal cancer.”

In the study, Bullman and co-corresponding author Christopher D. Johnston, PhD, Fred Hutch molecular microbiologist, along with the study’s first author Martha Zepeda-Rivera, PhD, a Washington Research Foundation Fellow and Staff Scientist in the Johnston Lab, wanted to discover how the microbe moves from its typical environment of the mouth to a distant site in the lower gut and how it contributes to cancer growth.

First, they found a surprise that could be important for future treatments. The predominant group of Fusobacterium nucleatum in colorectal cancer tumours, thought to be a single subspecies, is actually composed of two distinct lineages known as “clades.”

“This discovery was similar to stumbling upon the Rosetta Stone in terms of genetics,” Johnston explained. “We have bacterial strains that are so phylogenetically close that we thought of them as the same thing, but now we see an enormous difference between their relative abundance in tumours versus the oral cavity.”

By separating out the genetic differences between these clades, the researchers found that the tumour-infiltrating type, Fna C2 (Fusobacterium nucleatum subspecies animalis, clade 2), had acquired distinct genetic traits suggesting it could travel from the mouth through the stomach, withstand stomach acid and then grow in the lower gastrointestinal tract. The analysis revealed 195 genetic differences between the clades.

Then, comparing tumour tissue with healthy tissue from patients with colorectal cancer, the researchers found that only the subtype Fna C2 is significantly enriched in colorectal tumour tissue and is responsible for colorectal cancer growth.

Further molecular analyses of two patient cohorts, including over 200 colorectal tumours, revealed the presence of this Fna C2 lineage in approximately 50% of cases.

The researchers also found in hundreds of stool samples from people with and without colorectal cancer that Fna C2 levels were consistently higher in colorectal cancer.

“We have pinpointed the exact bacterial lineage that is associated with colorectal cancer, and that knowledge is critical for developing effective preventive and treatment methods,” Johnston said.

Source: Fred Hutchinson Cancer Center

Introducing Tardigrade Proteins into Human Cells can Slow Metabolism

Scanning electron micrograph of an adult tardigrade. Source: Wikimedia Commons

University of Wyoming researchers have gained further insight into how tardigrades survive extreme conditions and shown that proteins from the microscopic creatures expressed in human cells can slow down molecular processes.

This makes the tardigrade proteins potential candidates in technologies centred on slowing the aging process and in long-term storage of human cells.

The new study, published in the journal Protein Science, examines the mechanisms used by tardigrades to enter and exit suspended animation when faced with environmental stress.

Led by Senior Research Scientist Silvia Sanchez-Martinez in the lab of UW Department of Molecular Biology Assistant Professor Thomas Boothby, the research provides additional evidence that tardigrade proteins eventually could be used to make life-saving treatments available to people where refrigeration is not possible — and enhance storage of cell-based therapies, such as stem cells.

Measuring less than half a millimetre long, tardigrades can survive being completely dried out, being frozen to just above absolute zero, being heated to more than 150°C, exposure to radiation at several thousand times a human’s lethal dose, and even the vacuum of outer space.

They survive by entering a state of suspended animation called biostasis, using proteins that form gels inside of cells and slow down life processes, according to the new UW-led research.

Co-authors of the study are from institutions including the University of Bristol in the United Kingdom, Washington University in St. Louis, the University of California-Merced, the University of Bologna in Italy and the University of Amsterdam in the Netherlands.

Sanchez-Martinez, who came from the Howard Hughes Medical Institute to join Boothby’s UW lab, was the lead author of the paper.

“Amazingly, when we introduce these proteins into human cells, they gel and slow down metabolism, just like in tardigrades,” Sanchez-Martinez says.

“Furthermore, just like tardigrades, when you put human cells that have these proteins into biostasis, they become more resistant to stresses, conferring some of the tardigrades’ abilities to the human cells.”

Importantly, the research shows that the whole process is reversible: “When the stress is relieved, the tardigrade gels dissolve, and the human cells return to their normal metabolism,” Boothby says.

“Our findings provide an avenue for pursuing technologies centred on the induction of biostasis in cells and even whole organisms to slow aging and enhance storage and stability,” the researchers concluded.

Previous research by Boothby’s team showed that natural and engineered versions of tardigrade proteins can be used to stabilize an important pharmaceutical used to treat people with hemophilia and other conditions without the need for refrigeration.

Tardigrades’ ability to survive being dried out has puzzled scientists, as the creatures do so in a manner that appears to differ from a number of other organisms with the ability to enter suspended animation.

Source: University of Wyoming

Risk Factors for Faster Aging in the Brain Revealed in New Study

Source: CC0

Researchers from the Nuffield Department of Clinical Neurosciences at the University of Oxford have used data from UK Biobank participants to reveal that diabetes, traffic-related air pollution and alcohol intake are the most harmful out of 15 modifiable risk factors for dementia.

The researchers had previously identified a ‘weak spot’ in the brain, which is a specific network of higher-order regions that not only develop later during adolescence, but also show earlier degeneration in old age.

They showed that this brain network is also particularly vulnerable to schizophrenia and Alzheimer’s disease.

In this new study, published in Nature Communications, they investigated the genetic and modifiable influences on these fragile brain regions by looking at the brain scans of 40 000 UK Biobank participants aged over 45.

The researchers examined 161 risk factors for dementia, and ranked their impact on this vulnerable brain network, over and above the natural effects of age.

They classified these modifiable risk factors into 15 broad categories: blood pressure, cholesterol, diabetes, weight, alcohol consumption, smoking, depressive mood, inflammation, pollution, hearing, sleep, socialisation, diet, physical activity, and education.
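The adjust-for-age-and-sex, then rank-by-unique-contribution approach described above can be sketched in a few lines. This is a toy illustration with simulated data, not the study’s actual pipeline; the variable names, effect sizes, and the partial-correlation metric are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1000

# Toy confounds: age and sex, which the study adjusts for first.
age = rng.uniform(45, 80, n)
sex = rng.integers(0, 2, n).astype(float)

# Three hypothetical risk factors; "diabetes" is given a genuine effect.
diabetes = rng.integers(0, 2, n).astype(float)
smoking = rng.integers(0, 2, n).astype(float)
sleep = rng.normal(7, 1, n)

# Simulated degeneration score of the vulnerable brain network:
# driven mostly by age, plus a planted diabetes effect.
score = 0.05 * age + 0.8 * diabetes + rng.normal(0, 1, n)

def residualise(y, confounds):
    """Remove the part of y explained by the confounds via least squares."""
    X = np.column_stack([np.ones(len(y))] + confounds)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return y - X @ beta

# Rank each factor by its partial correlation with the degeneration
# score, over and above age and sex.
score_r = residualise(score, [age, sex])
ranking = {}
for name, factor in [("diabetes", diabetes), ("smoking", smoking), ("sleep", sleep)]:
    factor_r = residualise(factor, [age, sex])
    ranking[name] = abs(np.corrcoef(factor_r, score_r)[0, 1])

for name in sorted(ranking, key=ranking.get, reverse=True):
    print(name, round(ranking[name], 3))
```

With the planted effect, "diabetes" tops the ranking while the null factors hover near zero, mirroring how a factor can emerge as most harmful only after age and sex are taken into account.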

Prof Gwenaëlle Douaud, who led this study, said: “We know that a constellation of brain regions degenerates earlier in aging, and in this new study we have shown that these specific parts of the brain are most vulnerable to diabetes, traffic-related air pollution – increasingly a major player in dementia – and alcohol, of all the common risk factors for dementia.”

“We have found that several variations in the genome influence this brain network, and they are implicated in cardiovascular deaths, schizophrenia, Alzheimer’s and Parkinson’s diseases, as well as in the two antigens of a little-known blood group, the elusive XG antigen system, which was an entirely new and unexpected finding.”

Prof Lloyd Elliott, a co-author from Simon Fraser University in Canada, concurs: “In fact, two of our seven genetic findings are located in this particular region containing the genes of the XG blood group, and that region is highly atypical because it is shared by both X and Y sex chromosomes.

This is really quite intriguing as we do not know much about these parts of the genome; our work shows there is benefit in exploring this genetic terra incognita further.”

Importantly, as Prof Anderson Winkler, a co-author from the National Institutes of Health and The University of Texas Rio Grande Valley in the US, points out: “What makes this study special is that we examined the unique contribution of each modifiable risk factor by looking at all of them together to assess the resulting degeneration of this particular brain ‘weak spot’. It is with this kind of comprehensive, holistic approach – and once we had taken into account the effects of age and sex – that three emerged as the most harmful: diabetes, air pollution, and alcohol.”

This research sheds light on some of the most critical risk factors for dementia, and provides novel information that can contribute to prevention and future strategies for targeted intervention.

Source: University of Oxford

The Digital Nurse: Redefining the Future of Healthcare in South Africa


By Sandra Sampson, Director at Allmed

The South African healthcare landscape is undergoing a transformative shift, driven by the rapid advancement of technology. At the forefront of this change is the rise of the “digital nurse,” a testament to the increasing integration of technology into the nursing profession. This transformation is not only streamlining processes; it is addressing critical challenges like the nation’s nurse shortage while ultimately improving patient care.

Embracing convenience and accessibility

Virtual platforms have become commonplace in the nursing world, facilitating efficient and accessible professional development for nurses through online meetings, networking opportunities, and educational resources. This fosters a more connected and knowledgeable nursing community, better equipped to serve patients.

Telehealth consultations, another facet of digital nursing currently revolutionising patient care, provide convenient and accessible medical consultations from the comfort of one’s home, eliminating long wait times and unnecessary travel.

Mitigating nurse shortages and ensuring quality care

South Africa grapples with a significant nurse shortage that places strain on the healthcare system, a challenge to which digital nursing offers a practical potential solution. By leveraging technology, nurses can effectively manage larger patient volumes, reducing the burden on the existing workforce and optimising resource allocation. Remote monitoring systems and AI-powered tools further empower nurses by providing real-time patient data and facilitating early intervention, ultimately improving the quality of care delivered.

Additionally, embracing technology ensures that patients, even in underserved areas, receive quality care. The efficiency gained through virtual platforms allows nurses to allocate their time effectively, addressing minor health concerns remotely and reducing the strain on healthcare facilities for non-emergency cases.

However, it must be pointed out that although leveraging technology allows nurses to manage larger patient volumes, alleviating strain on the current system, this does not necessarily mean fewer nurses are needed. Rather, technology empowers the existing workforce to reach a wider patient base and deliver more efficient, personalised care.

Evolving alongside technology: the digital nurse of tomorrow

As the healthcare industry embraces digital technologies, the role of the nurse will continue to expand. While traditional nursing skills will remain essential, the “digital nurse” of the future must possess additional competencies. Acquiring proficiency in digital tools and equipment, along with the capability to interpret and analyse digital data, will be crucial for delivering effective patient care. However, the most critical attribute for the digital nurse will be the willingness to adapt and embrace constant technological advancements. This will require a mindset shift that comes with acknowledging that traditional methods might not be sufficient in the face of evolving patient needs.

The challenges and opportunities in change

While the adoption of digital nursing brings numerous benefits, challenges remain. Resistance from individuals accustomed to traditional healthcare practices is one hurdle. However, with the younger generation being more adaptable, the shift towards digital nursing is expected to gain wider acceptance as technology advances. To ensure the success of this digital-first healthcare, attention must be focused on upskilling: continuous training and development programmes are vital if nurses are to remain proficient in the face of change.

On the flip side, a change in perspective from nursing professionals themselves will be necessary. This means embracing a growth mindset and being open towards new technologies to adapt and thrive in the digital age. Lastly, healthcare professionals as a whole need to bear in mind that transformation is essential to meet the evolving needs of patients, which includes catering to a growing preference for digital healthcare solutions. Continuing to meet the needs of patients is the only guaranteed way for nursing professionals to ensure their relevance in the future. By embracing technology and fostering a culture of continuous learning, South Africa can empower its nurses to become the digital healthcare leaders of tomorrow.

Certain Gut Bacteria Linked to Reduced Cardiovascular Disease Risk

Gut Microbiome. Credit: Darryl Leja, National Human Genome Research Institute, National Institutes of Health

Changes in the gut microbiome have been implicated in a range of diseases including type 2 diabetes, obesity, and inflammatory bowel disease. Now, a team of researchers has found that microbes in the gut may affect cardiovascular disease as well. In a study published in Cell, the team has identified specific species of bacteria that consume cholesterol in the gut and may help lower cholesterol and heart disease risk in people.

Researchers at the Broad Institute of MIT and Harvard along with Massachusetts General Hospital analysed metabolites and microbial genomes from more than 1400 participants in the Framingham Heart Study, a decades-long project focused on risk factors for cardiovascular disease.

The team discovered that bacteria called Oscillibacter take up and metabolise cholesterol from their surroundings, and that people carrying higher levels of the microbe in their gut had lower levels of cholesterol. They also identified the mechanism the bacteria likely use to break down cholesterol. The results suggest that interventions that manipulate the microbiome in specific ways could one day help decrease cholesterol in people. The findings also lay the groundwork for more targeted investigations of how changes to the microbiome affect health and disease.

“Our research integrates findings from human subjects with experimental validation to ensure we achieve actionable mechanistic insight that will serve as starting points to improve cardiovascular health,” said senior author Ramnik Xavier, a core institute member at the Broad and a professor at Harvard Medical School and Massachusetts General Hospital.

Postdoctoral researcher Chenhao Li and research scientist Martin Stražar, both in Xavier’s lab, were co-first authors on the study.

Cholesterol cues

In the past decade, other researchers have uncovered links between composition of the gut microbiome and elements of cardiovascular disease, such as a person’s triglycerides and blood sugar levels after a meal. But scientists haven’t been able to target those connections with therapies in part because they lack a complete understanding of metabolic pathways in the gut.

In the new study, the Broad team gained a more complete and detailed picture of the impact of gut microbes on metabolism. They combined shotgun metagenomic sequencing, which profiles all of the microbial DNA in a sample, with metabolomics, which measures the levels of hundreds of known and thousands of unknown metabolites. They used these tools to study stool samples from the Framingham Heart Study.

“The project outcomes underline the importance of high-quality, curated patient data,” Stražar said. “That allowed us to note effects that are really subtle and hard to measure and directly follow up on them.”

More than 16 000 associations between microbes and metabolic traits were found, one of them particularly strong: People with several species of bacteria from the Oscillibacter genus had lower cholesterol levels than those who lacked the bacteria. The researchers found that species in the Oscillibacter genus were surprisingly abundant in the gut, representing on average 1 in every 100 bacteria.
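The kind of all-pairs microbe–metabolite screen described above, with a multiple-testing correction to keep thousands of comparisons honest, can be sketched as follows. This is a toy illustration on simulated data, not the study’s pipeline; the microbe/metabolite counts, the planted signal, and the use of Spearman correlation with Benjamini–Hochberg correction are all illustrative assumptions.

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(0)
n_samples = 200

# Toy data: relative abundances of 50 hypothetical microbes and
# levels of 40 hypothetical metabolites per participant.
microbes = rng.random((n_samples, 50))
metabolites = rng.random((n_samples, 40))

# Plant one genuine signal: "microbe 0" tracks "metabolite 0" inversely,
# loosely mimicking the Oscillibacter/cholesterol association.
metabolites[:, 0] = 1.0 - microbes[:, 0] + rng.normal(0.0, 0.05, n_samples)

# All-pairs Spearman screen over the 50 x 40 = 2000 microbe-metabolite pairs.
results = []
for i in range(microbes.shape[1]):
    for j in range(metabolites.shape[1]):
        rho, p = spearmanr(microbes[:, i], metabolites[:, j])
        results.append(((i, j), p))

# Benjamini-Hochberg procedure to control the false discovery rate at 5%:
# keep every pair up to the largest rank k with p_(k) <= alpha * k / m.
results.sort(key=lambda item: item[1])
m = len(results)
k_max = 0
for rank, (pair, p) in enumerate(results, start=1):
    if p <= 0.05 * rank / m:
        k_max = rank
significant = [pair for pair, _ in results[:k_max]]
print(f"{len(significant)} significant pair(s); strongest: {significant[0]}")
```

Only the planted pair survives the correction, which is the point: a real screen of this shape reports the handful of associations, out of thousands tested, that are strong enough to follow up.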

The researchers then wanted to figure out the biochemical pathway the microbes use to break down cholesterol. To do this, they first needed to grow the organism in the lab. Fortunately, the lab has spent years collecting bacteria from stool samples to create a unique library that also included Oscillibacter.

After successfully growing the bacteria, the team used mass spectrometry to identify the most likely byproducts of cholesterol metabolism in the bacteria. This allowed them to determine the pathway the bacteria use to lower cholesterol levels. They found that the bacteria converted cholesterol into intermediate products that can then be broken down by other bacteria and excreted from the body. Next, the team used machine-learning models to identify candidate enzymes responsible for this biochemical conversion, and then detected those enzymes and cholesterol breakdown products in certain Oscillibacter species in the lab.

The team found another gut bacterial species, Eubacterium coprostanoligenes, that also contributes to decreased cholesterol levels. This species carries a gene that the scientists had previously shown is involved in cholesterol metabolism. In the new work, the team discovered that Eubacterium might have a synergistic effect with Oscillibacter on cholesterol levels, which suggests that new experiments that study combinations of bacterial species could help shed light on how different microbial communities interact to affect human health.

Microbial messages

The human gut microbiome remains mostly unmapped, but the team believes they have paved the way for the discovery of other similar metabolic pathways impacted by gut microbes, which could be targeted therapeutically.

“There are many clinical studies trying to do faecal microbiome transfer studies without much understanding of how the microbes interact with each other and the gut,” Li said. “Hopefully, by stepping back and focusing on one particular bug or gene first, we’ll get a systematic understanding of gut ecology and come up with better therapeutic strategies, like targeting one or a few bugs.”

“Because of the large number of genes of unknown function in the gut microbiome, there are gaps in our ability to predict metabolic functions,” Li added. “Our work highlights the possibility that additional sterol metabolism pathways may be modified by gut microbes. There are potentially a lot of new discoveries to be made that will bring us closer to a mechanistic understanding of how microbes interact with the host.”

Source: Broad Institute of MIT and Harvard

Wide-ranging Animal Studies Link pH Changes to Cognitive and Psychiatric Disorders

Source: CC0

A global collaborative research group has identified brain energy metabolism dysfunction leading to altered pH and lactate levels as common hallmarks in numerous animal models of neuropsychiatric and neurodegenerative disorders. These include models of intellectual disability, autism spectrum disorders, schizophrenia, bipolar disorder, depressive disorders, and Alzheimer’s disease. The findings were published in eLife.

The research group, comprising 131 researchers from 105 laboratories across seven countries, sheds light on altered energy metabolism as a key factor in various neuropsychiatric and neurodegenerative disorders. While still considered controversial, elevated lactate levels and the resulting decrease in pH are now also proposed as a potential primary component of these diseases. Unlike previous assumptions associating these changes with external factors such as medication, the research group’s previous findings suggest that they may be intrinsic to the disorders. This conclusion was drawn from five animal models of schizophrenia/developmental disorders, bipolar disorder, and autism, which are free from such confounding factors. However, research on brain pH and lactate levels in animal models of other neuropsychiatric and neurological disorders has been limited. Until now, it was unclear whether such changes in the brain were a common phenomenon. Additionally, the relationship between alterations in brain pH and lactate levels and specific behavioural abnormalities had not been clearly established.

This study, encompassing 109 strains/conditions of mice, rats, and chicks, including animal models related to neuropsychiatric conditions, reveals that changes in brain pH and lactate levels are a common feature across a diverse range of animal models, including those of schizophrenia/developmental disorders, bipolar disorder, autism, depression, epilepsy, and Alzheimer’s disease. This study’s significant insights include:

I. Common Phenomenon Across Disorders: About 30% of the 109 types of animal models exhibited significant changes in brain pH and lactate levels, emphasising the widespread occurrence of energy metabolism changes in the brain across various neuropsychiatric conditions.

II. Environmental Factors as a Cause: Models simulating depression through psychological stress, and those induced to develop diabetes or colitis, which have a high comorbidity risk for depression, showed decreased brain pH and increased lactate levels. Various acquired environmental factors could contribute to these changes.

III. Cognitive Impairment Link: A comprehensive analysis integrating behavioural test data revealed a predominant association between increased brain lactate levels and impaired working memory, illuminating an aspect of cognitive dysfunction.

IV. Confirmation in Independent Cohort: These associations, particularly between higher brain lactate levels and poor working memory performance, were validated in an independent cohort of animal models, reinforcing the initial findings.

V. Autism Spectrum Complexity: Variable responses were noted in autism models, with some showing increased pH and decreased lactate levels, suggesting subpopulations within the autism spectrum with diverse metabolic patterns.

“This is the first and largest systematic study evaluating brain pH and lactate levels across a range of animal models for neuropsychiatric and neurodegenerative disorders. Our findings may lay the groundwork for new approaches to develop the transdiagnostic characterisation of different disorders involving cognitive impairment,” states Dr Hideo Hagihara, the study’s lead author.

Professor Tsuyoshi Miyakawa, the corresponding author, explains, “This research could be a stepping stone towards identifying shared therapeutic targets in various neuropsychiatric disorders. Future studies will centre on uncovering treatment strategies that are effective across diverse animal models with brain pH changes. This could significantly contribute to developing tailored treatments for patient subgroups characterised by specific alterations in brain energy metabolism.”

The exact mechanism behind the reduction in pH and the increase in lactate levels remains elusive. But the authors suggest that, since lactate production increases in response to neural hyperactivity to meet the energy demand, this might be the underlying reason.

Source: Fujita Health University

Familial Alzheimer’s Disease Transferred via Bone Marrow Transplant in Mice Experiment

Photo by Mari Lezhava on Unsplash

Familial Alzheimer’s disease can be transferred via bone marrow transplant, researchers show in the journal Stem Cell Reports. When the team transplanted bone marrow stem cells from mice carrying a hereditary version of Alzheimer’s disease into normal lab mice, the recipients developed Alzheimer’s disease – and at an accelerated rate.

The study highlights the role of amyloid that originates outside of the brain in the development of Alzheimer’s disease, which changes the paradigm of Alzheimer’s from being a disease that is exclusively produced in the brain to a more systemic disease. Based on their findings, the researchers say that donors of blood, tissue, organ, and stem cells should be screened for Alzheimer’s disease to prevent its inadvertent transfer during blood product transfusions and cellular therapies.

“This supports the idea that Alzheimer’s is a systemic disease where amyloids that are expressed outside of the brain contribute to central nervous system pathology,” says senior author and immunologist Wilfred Jefferies, of the University of British Columbia. “As we continue to explore this mechanism, Alzheimer’s disease may be the tip of the iceberg and we need to have far better controls and screening of the donors used in blood, organ and tissue transplants as well as in the transfers of human derived stem cells or blood products.”

To test whether a peripheral source of amyloid could contribute to the development of Alzheimer’s in the brain, the researchers transplanted bone marrow containing stem cells from mice carrying a familial version of the disease — a variant of the human amyloid precursor protein (APP) gene, which, when cleaved, misfolded and aggregated, forms the amyloid plaques that are a hallmark of Alzheimer’s disease. They performed transplants into two different strains of recipient mice: APP-knockout mice that lacked an APP gene altogether, and mice that carried a normal APP gene.

In this model of heritable Alzheimer’s disease, mice usually begin developing plaques at 9 to 10 months of age, and behavioural signs of cognitive decline begin to appear at 11 to 12 months of age. Surprisingly, the transplant recipients began showing symptoms of cognitive decline much earlier – at 6 months post-transplant for the APP-knockout mice and at 9 months for the “normal” mice.

“The fact that we could see significant behavioural differences and cognitive decline in the APP-knockouts at 6 months was surprising but also intriguing because it just showed the appearance of the disease that was being accelerated after being transferred,” says first author Chaahat Singh of the University of British Columbia.

In mice, signs of cognitive decline present as an absence of normal fear and a loss of short and long-term memory. Both groups of recipient mice also showed clear molecular and cellular hallmarks of Alzheimer’s disease, including leaky blood-brain barriers and buildup of amyloid in the brain.

The team drew two conclusions: because APP-knockout mice, which lack an APP gene altogether, still developed disease, the mutated gene in the donor cells can itself cause the disease; and because recipient animals carrying a normal APP gene were also susceptible, the disease can be transferred to healthy individuals.

Because the transplanted stem cells were hematopoietic cells, meaning that they could develop into blood and immune cells but not neurons, the researchers’ demonstration of amyloid in the brains of APP knockout mice shows definitively that Alzheimer’s disease can result from amyloid that is produced outside of the central nervous system.

Finally, since the source of the disease in these mice is a human APP gene, the findings demonstrate that the mutated human gene can transfer the disease across species.

In future studies, the researchers plan to test whether transplanting tissues from normal mice into mice with familial Alzheimer’s could mitigate the disease, whether the disease is also transferable via other types of transplants or transfusions, and to expand the investigation of disease transfer between species.

“In this study, we examined bone marrow and stem cell transplantation. However, next it will be important to examine whether inadvertent transmission of disease takes place during the application of other forms of cellular therapies, as well as to directly examine the transfer of disease from contaminated sources, independent of cellular mechanisms,” says Jefferies.

Source: Cell Press

Eggs are not the Cholesterol Menace They were Thought to be

Photo by Annie Spratt on Unsplash

Many people hesitate to eat eggs amid concerns that they may raise cholesterol levels, with negative cardiovascular consequences. However, results from a prospective, controlled trial presented at the American College of Cardiology’s Annual Scientific Session show that over a four-month period cholesterol levels and other cardiovascular markers were similar among people who ate fortified eggs most days of the week compared with a non-egg eating control group.

A total of 140 patients with or at high risk for cardiovascular disease were enrolled in the PROSPERITY trial, which aimed to assess the effects of eating 12 or more fortified eggs a week versus a non-egg diet (consuming less than two eggs a week) on HDL- and LDL-cholesterol, as well as other key markers of cardiovascular health over a four-month study period.

“We know that cardiovascular disease is, to some extent, mediated through risk factors like high blood pressure, high cholesterol and increased BMI and diabetes. Dietary patterns and habits can have a notable influence on these and there’s been a lot of conflicting information about whether or not eggs are safe to eat, especially for people who have or are at risk for heart disease,” said Nina Nouhravesh, MD, a research fellow at the Duke Clinical Research Institute in Durham, North Carolina, and the study’s lead author. “This is a small study, but it gives us reassurance that eating fortified eggs is OK with regard to lipid effects over four months, even among a more high-risk population.”

Eggs are a common and relatively inexpensive source of protein and dietary cholesterol. Nouhravesh and her team wanted to look specifically at fortified eggs as they contain less saturated fat and additional vitamins and minerals, such as iodine, vitamin D, selenium, vitamins B2, B5 and B12, and omega-3 fatty acids.

For this study, patients were randomly assigned to eat 12 fortified eggs a week (cooked in whatever manner they chose) or to eat fewer than two eggs of any kind (fortified or not) per week. All patients were 50 years of age or older (the average age was 66 years), half were female and 27% were Black. All patients had experienced one prior cardiovascular event or had two cardiovascular risk factors, such as high blood pressure, high cholesterol, increased BMI or diabetes. The co-primary endpoints were LDL- and HDL-cholesterol at four months. Secondary endpoints included lipid, cardiometabolic and inflammatory biomarkers and levels of vitamins and minerals.

Patients had in-person clinic visits at the start of the study and visits at one and four months to take vital signs and have bloodwork done. Phone check-ins occurred at two and three months and patients in the fortified egg group were asked about their weekly egg consumption. Those with low adherence were provided additional education materials.

Results showed reductions of 0.64mg/dL in HDL-cholesterol and 3.14mg/dL in LDL-cholesterol in the fortified egg group. While these differences weren’t statistically significant, the researchers said they suggest that eating 12 fortified eggs each week had no adverse effect on blood cholesterol. Among the secondary endpoints, researchers observed numerical reductions in total cholesterol, LDL particle number, another lipid biomarker called apoB, high-sensitivity troponin (a marker of heart damage), and insulin resistance scores in the fortified egg group, while vitamin B levels increased.

“While this is a neutral study, we did not observe adverse effects on biomarkers of cardiovascular health and there were signals of potential benefits of eating fortified eggs that warrant further investigation in larger studies as they are more hypothesis generating here,” Nouhravesh said, explaining that subgroup analyses revealed numerical increases in HDL cholesterol and reductions in LDL cholesterol in patients 65 years or older and those with diabetes in the fortified egg group compared with those eating fewer than two eggs.

So why have eggs gotten a bad rap? Some of the confusion stems from the fact that egg yolks contain cholesterol. Experts said a more important consideration, especially in the context of these findings, might be what people are eating alongside their eggs, such as buttered toast, bacon and other processed meats, which are not heart healthy choices. As always, Nouhravesh said it’s a good idea for people with heart disease to talk with their doctor about a heart healthy diet.

This single-centre study is limited by its small size and its reliance on patients’ self-reporting of their egg consumption and other dietary patterns. It was also an unblinded study, meaning patients knew which study group they were in, which can influence their health behaviours.

The study was funded by Eggland’s Best.

Source: American College of Cardiology

New Genetic Tool Predicts Unintentional Mutations from CRISPR Edits

CRISPR-Cas9 is a customisable tool that lets scientists cut and insert small pieces of DNA at precise areas along a DNA strand. This lets scientists study our genes in a specific, targeted way. Credit: Ernesto del Aguila III, National Human Genome Research Institute, NIH

Since its breakthrough development more than a decade ago, CRISPR has revolutionised DNA editing across a broad range of fields, including new therapies for an array of disorders spanning cancers, blood conditions and diabetes. But in some cases, the DNA repair process leaves behind unintentional, harmful edits. Now, University of California San Diego researchers have developed a new system to understand these repair outcomes and where they can go wrong. The system is described in Nature Communications.

In some treatments under development, patients are injected with CRISPR-treated cells or with packaged CRISPR components with the goal of repairing diseased cells through precision gene edits. Yet, while CRISPR has shown immense promise as a next-generation therapeutic tool, the technology’s edits are still imperfect. CRISPR-based gene therapies can cause unintended, harmful “bystander” edits to parts of the genome, at times leading to new cancers or other diseases.

Unravelling the complex biological dynamics behind both on- and off-target CRISPR edits is daunting, since intricate bodily tissues feature thousands of different cell types and CRISPR edits can depend on many different biological pathways.

Postdoctoral Scholar Zhiqian Li, Professor Ethan Bier and their colleagues developed a sequence analyser to help track on- and off-target mutational edits and the ways they are inherited from one generation to the next. Based on a concept proposed by former UC San Diego researcher David Kosman, the Integrated Classifier Pipeline (ICP) tool can reveal specific categories of mutations resulting from CRISPR editing.

Developed in flies and mosquitoes, the ICP provides a “fingerprint” of how genetic material is being inherited, which allows scientists to follow the source of mutational edits and related risks emerging from potentially problematic edits.

“The ICP system can cleanly establish whether a given individual insect has inherited specific genetic components of the CRISPR machinery from either their mothers or fathers since maternal versus paternal transmission result in totally different fingerprints,” said Bier, a professor in the UC San Diego School of Biological Sciences.

The ICP can help untangle the complex biological questions that arise in determining the mechanisms behind CRISPR editing outcomes. While developed in insects, ICP carries vast potential for human applications.

“There are many parallel applications of ICP for analysing and following CRISPR editing outcomes in humans following gene therapy or during tumour progression,” said study first author Li. “This transformative flexible analysis platform has many possible impactful uses to ensure safe application of cutting-edge next-generation health technologies.”

ICP also offers help in tracking inheritance across generations in gene drive systems, which are new technologies designed to spread CRISPR edits in applications such as stopping the transmission of malaria and protecting agricultural crops against pest destruction. For example, researchers could select a single mosquito from the field where a gene-drive test is being conducted and use ICP analysis to determine whether that individual had inherited the genetic construct from its mother or its father, and whether it had inherited a defective element lacking the defining visible markers of that genetic element.

“The CRISPR editing system can be more than 90 percent accurate,” said Bier, “but since it edits over and over again it will eventually make a mistake. The bottom line is that the ICP system can give you a very high-resolution picture of what can go wrong.”

Source: University of California – San Diego

Working outside the Typical 9–5 in Younger Adulthood may be Linked with Worse Health Decades Later

Employees with volatile work schedules early in their career had worse sleep and more depressive symptoms at age 50

Photo by Tim Gouw on Unsplash

The hours you work earlier in life may be associated with worse health years later, according to a study published April 3, 2024 in the open-access journal PLOS ONE by Wen-Jui Han from New York University, US.

Studies have consistently shown that nonstandard work schedules – working outside the traditional nine-to-five workday – can negatively impact physical and mental health as well as social and family life. The current study uses a life-course approach to provide a longer-term perspective on how work schedule patterns throughout a person’s working life impact their health in middle age.

Han used data from The National Longitudinal Survey of Youth-1979 (NLSY79), which includes data on more than 7000 people in the US over 30 years, to see whether employment patterns in younger adulthood were associated with sleep, physical health, and mental health at age 50.

Han found that around a quarter of participants (26%) worked stable standard hours, with a further third (35%) working mostly standard hours. Seventeen percent initially worked standard hours in their 20s, later transitioning into volatile working patterns – a combination of evening, night, and variable hours. Twelve percent initially worked standard hours and then switched to variable hours. A final ten percent were mostly not working over this period.

Compared with individuals who mostly worked traditional daytime hours throughout their working career, those whose careers featured more volatile work schedules slept less, had lower sleep quality, and were more likely to report depressive symptoms at age 50. The most striking results were seen in those who had stable work hours in their 20s and then transitioned to more volatile work hours in their 30s. This effect size was significant and comparable to that of having less than a high-school education.

Han also found racial and gender-related trends. For example, Black Americans were more likely to have volatile work schedules associated with poorer health, highlighting how some groups may disproportionately shoulder the adverse consequences of such employment patterns.

Han suggests that volatile work schedules are associated with poor sleep, physical fatigue, and emotional exhaustion, which may leave people vulnerable to poor health. The study also suggests that the positive and negative impacts of work schedules on health can accumulate over a lifetime, while highlighting how employment patterns can contribute to health inequities.

Han adds: “Work that is supposed to bring resources to help us sustain a decent life has now become a vulnerability to a healthy life due to the increasing precarity in our work arrangements in this increasingly unequal society. People with vulnerable social positions (eg, females, Blacks, low-education) disproportionately shoulder these health consequences.”