Results from a cancer screening trial indicate that consistent heavy alcohol intake and higher average lifetime drinking are associated with an increased risk of colorectal cancer.
Studies have demonstrated a link between alcohol consumption and an elevated risk of colorectal cancer. New research now reveals that higher lifetime alcohol consumption is also associated with a higher risk, especially for rectal cancer, and that quitting drinking can lower a person’s risk. The findings are published by Wiley online in CANCER, a peer-reviewed journal of the American Cancer Society.
When investigators analysed data on US adults enrolled in the National Cancer Institute (NCI) Prostate, Lung, Colorectal, and Ovarian (PLCO) Cancer Screening Trial who did not have cancer at baseline, they observed that 1679 colorectal cancer cases occurred among 88 092 participants over 20 years of follow-up.
Current drinkers with an average lifetime alcohol intake of ≥ 14 drinks per week (heavy drinkers) had a 25% higher risk of developing colorectal cancer and a 95% higher risk of developing rectal cancer compared with those with an average lifetime alcohol intake of < 1 drink per week (light drinkers).
When further considering drinking consistency, heavy drinking throughout adulthood was linked to a 91% higher risk of colorectal cancer compared with consistent light drinking. In contrast, no evidence of increased colorectal cancer risk was observed among former drinkers. Former drinkers also had lower odds of developing noncancerous colorectal tumours, or adenomas (which may go on to become cancerous), than current drinkers averaging < 1 drink per week, suggesting that alcohol cessation may lower individuals’ risks. These data were limited, however.
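For readers wanting to relate such relative figures to absolute terms, the short Python sketch below converts a percentage increase in risk into an absolute risk. The baseline 20-year risk used here is purely hypothetical and is not a figure reported by the trial.

# Illustrative only: convert a relative risk increase into an absolute risk
# using a HYPOTHETICAL baseline 20-year risk, not a figure from the PLCO trial.
def absolute_risk(baseline_risk, relative_increase_pct):
    return baseline_risk * (1 + relative_increase_pct / 100)

baseline = 0.02  # assumed 2% baseline risk for light drinkers (hypothetical)
print(f"Colorectal cancer, heavy vs light drinkers: {absolute_risk(baseline, 25):.1%}")  # 2.5%
print(f"Rectal cancer, heavy vs light drinkers: {absolute_risk(baseline, 95):.1%}")      # 3.9%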
The association between alcohol consumption and increased risks observed in this and other studies might be explained by carcinogens produced from alcohol metabolism or alcohol’s effects on gut microbes. Additional studies are needed to test whether these mechanisms are involved.
“Our study is one of the first to explore how drinking alcohol over the life course relates to both colorectal adenoma and colorectal cancer risk. While the data on former drinkers were sparse, we were encouraged to see that their risk may return to that of the light drinkers,” said co–senior author Erikka Loftfield, PhD, MPH, of the NCI, part of the National Institutes of Health.
New device sprays mist to treat deep wound infections without causing kidney damage
Hongmin Sun demonstrating the new device.
A University of Missouri researcher has unveiled a safer, smarter way to fight drug-resistant infections. Hongmin Sun, an associate professor in the School of Medicine, demonstrated that a spray-mist device can deliver last-resort antibiotics directly into infected tissue without the harmful side effects often caused by delivery via the bloodstream.
In a recent study, researchers worked with an industry partner to use a needle-free device to treat methicillin-resistant Staphylococcus aureus (MRSA), a dangerous bacterium that has become resistant to many common antibiotics.
The device successfully delivered the common last-resort antibiotic vancomycin deep into infected tissue without typical side effects such as kidney damage. Unlike topical creams or ointments, which are easily wiped away, or bloodstream delivery, which risks organ damage, the spray-mist technology pushed the medicine through the skin to treat the infection successfully.
Sun collaborated with former Mizzou researcher Lakshmi Pulakat, now a professor of medicine at Tufts University, and Droplette Inc. to use the patented device for antibiotic delivery. The findings pave the way for future clinical trials as researchers seek FDA approval.
The team is hopeful the spray-mist device might one day be used in wound care in challenging settings.
“Whether it’s people with diabetic foot ulcers or soldiers hurt in battle, we wanted to come up with a new approach to treat these severely infected wounds in a more targeted way,” Sun said. “This can be a game-changing therapy for treating those with severely infected wounds.”
Pulakat said the technology is an example of compassionate care.
“This method of delivering last-resort antibiotics could prevent countless amputations and help save lives,” she said. “Dr. Sun is an internationally recognized expert in the field of pathogenic microbiology, and our collaboration with an industry partner has helped make this translational research possible.”
Humans’ exposure to high-temperature burn injuries may have played an important role in our evolutionary development, shaping how our bodies heal, fight infection, and sometimes fail under extreme injury, according to new research.
For more than one million years, the control of fire has powered human success, from cooking and heating to technology and industry, driving genetic and cultural evolution and setting us apart from all other species. But this relationship has also exposed humans to high-temperature injuries at a scale unmatched in the natural world.
Humans burn themselves – and survive burns – with a frequency likely much greater than that of any other animal. Most animals avoid fire completely; humans, in contrast, live alongside fire, and most will experience minor burns throughout their lives.
A new study published in BioEssays, led by Imperial College London researchers, suggests that this increased exposure to burn injuries may have driven notable genetic adaptations which differentiated humans from other primates and mammals. This may also explain both beneficial and maladaptive responses to severe burn injury.
Burn injuries exist on a spectrum of severity, with most small injuries healing on their own while severe burns can lead to lifelong disability or death. Burns damage the skin, the body’s main protective barrier against infection, sometimes over large areas of the body. The longer the skin is damaged, the greater the risk that bacteria can enter the body and cause overwhelming infection.
The researchers argue that natural selection would have favoured traits that helped humans survive small to moderate burns. These may include faster inflammation, faster wound closure (to prevent infection) and stronger pain signals.
However, while these traits are helpful for less severe injuries, they can become harmful for large burns, which may explain why modern humans can experience extreme inflammation, scarring, and organ failure from major burns.
Using comparative genomic data across primates, the researchers found examples of genes associated with burn injury responses which show signs of accelerated evolution in humans. These genes are involved in wound closure, inflammation and immune system response – likely helping to rapidly close wounds and fight infection, a major complication after burn injury, particularly before the widespread use of antibiotics.
These findings support the theory that exposure to burn injuries may have been a notable force on the evolution of humans.
Dr Joshua Cuddihy, lead author of the study and Honorary Clinical Lecturer in Imperial’s Department of Surgery and Cancer, said: “Burns are a uniquely human injury. No other species lives alongside high temperatures and the regular risk of burning in the way humans do.
“The control of fire is deeply embedded in human life — from a preference for hot food and boiled liquids to the technologies that shape the modern world. As a result, unlike any other species, most humans will burn themselves repeatedly over their lifetime, a pattern that likely extends back over a million years to our earliest use of fire.
“Our research suggests that natural selection favoured traits that improved survival after smaller, more frequent burn injuries. However, those same adaptations may have come with evolutionary trade-offs, helping to explain why humans remain particularly vulnerable to the complications of severe burns.”
The study’s novel perspective on human evolution, which could reshape our understanding of modern burn care and human biology, was made possible through interdisciplinary collaboration between clinicians and researchers.
Professor Armand Leroi, Professor of Evolutionary Developmental Biology in Imperial’s Department of Life Sciences, said: “What makes this theory of burn selection so exciting to an evolutionary biologist is that it presents a new form of natural selection – one, moreover, that depends on culture. It is part of the story of what makes us human, and a part that we really did not have any inkling of before.”
Yuemin Li, PhD student at Queen Mary University of London, said: “Our study provides compelling evidence that humans have unique adaptive mutations in several key genes associated with burn injury response.
“These findings could allow us to explore in future research how genetic variations in different groups impact burn injury response, potentially explaining why some patients heal well or poorly after a burn.”
Unlike wounds from cuts or bites, which would also have led to infections, the burden of burns is unique to humans and their hominin ancestors: they are the only species that regularly experiences burn injuries and survives them.
The researchers’ findings could change how we study burn injuries, design treatments, and interpret complications of burns. They may also explain why results on burn injuries often translate poorly from animal models to humans.
Declan Collins, Consultant in Plastic and Reconstructive Surgery at Chelsea and Westminster Hospital NHS Foundation Trust, said: “Understanding the evolutionary drivers that cause genetic change is an important step in burn research that will influence the way in which we look at scar formation and wound healing.
“The genetic basis for scarring variation in humans and response to tissue injury is still poorly understood, and this work will provide new angles for future research.”
Historical data indicate that men develop coronary heart disease (CHD) about 10 years earlier than women. A recent study in the Journal of the American Heart Association indicates that this sex gap persists.
Investigators analysed data from the Coronary Artery Risk Development in Young Adults (CARDIA) study, in which US adults aged 18–30 years enrolled in 1985–1986 and were followed through August 2020.
Among 5112 participants (54.5% female, 51.6% Black) with an average age of 24.8 years at enrolment and a median follow-up of 34.1 years, men had a significantly higher cumulative incidence of cardiovascular disease. They had higher cumulative incidence rates of the cardiovascular disease subtypes of CHD and heart failure compared with women, but no difference in stroke.
Men reached a 5% incidence of cardiovascular disease 7.0 years earlier than women (50.5 versus 57.5 years). CHD was the most frequent cardiovascular disease subtype, and men reached a 2% incidence 10.1 years earlier than women. There were no significant differences in the age at which men and women reached a 2% incidence for stroke (57.5 versus 56.9 years) or a 1% incidence for heart failure (48.7 versus 51.7 years).
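The ages quoted above correspond to the points at which each group’s cumulative incidence curve crosses a fixed threshold. A minimal Python sketch of that calculation, using made-up curve values rather than CARDIA data, reads the age off by linear interpolation:

# Minimal sketch: find the age at which a cumulative incidence curve crosses a
# threshold by linear interpolation. Curve values below are MADE UP, not CARDIA data.
def age_at_threshold(ages, cum_incidence, threshold):
    for (a0, c0), (a1, c1) in zip(zip(ages, cum_incidence), zip(ages[1:], cum_incidence[1:])):
        if c0 <= threshold <= c1:
            return a0 + (threshold - c0) / (c1 - c0) * (a1 - a0)
    return None

ages = [40, 45, 50, 55, 60]
men = [0.010, 0.020, 0.045, 0.070, 0.100]    # hypothetical cumulative incidence
women = [0.005, 0.010, 0.020, 0.040, 0.065]
gap = age_at_threshold(ages, women, 0.05) - age_at_threshold(ages, men, 0.05)
print(f"Women reach a 5% cumulative incidence {gap:.1f} years later than men")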
Differences emerged in the fourth decade of life and persisted even after accounting for differences in cardiovascular health.
“Sex differences in cardiovascular disease risk are apparent by age 35, highlighting the importance of initiating risk assessment and prevention strategies in young adulthood,” said corresponding author Alexa Freedman, PhD, of the Northwestern University Feinberg School of Medicine.
Large study of patients in the US, Colombia, Nigeria and India finds symptom burden highest in high-income countries
Patients with long COVID in the US report far higher rates of brain fog, depression and cognitive symptoms than patients in countries such as India and Nigeria, according to a large international study led by Northwestern Medicine.
The authors note that the higher reported symptom burden in the US may reflect lower stigma and greater access to neurological and mental health care, rather than more severe disease.
The study, the first cross-continental comparison of long COVID neurological manifestations, tracked more than 3100 adults with long COVID evaluated at academic medical centres in Chicago; Medellín, Colombia; Lagos, Nigeria; and Jaipur, India.
Among patients who were not hospitalised during their COVID infections (the majority in the study), 86% in the US reported brain fog, compared with only 63% in Nigeria, 62% in Colombia and 15% in India. Rates of psychological distress showed a similar pattern: nearly 75% of non-hospitalised patients in the US reported symptoms of depression or anxiety, compared with only 40% in Colombia and fewer than 20% in Nigeria and India.
“It is culturally accepted in the U.S. and Colombia to talk about mental health and cognitive issues, whereas that is not the case in Nigeria and India,” said Dr Igor Koralnik, senior study author and chief of neuro-infectious disease and global neurology at Northwestern University Feinberg School of Medicine.
“Cultural denial of mood disorder symptoms as well as a combination of stigma, misperceptions, religiosity and belief systems, and lack of health literacy may contribute to biased reporting. This may be compounded by a dearth of mental health providers and perceived treatment options in those countries.”
Brain fog, fatigue, myalgia (muscle pain), headache, dizziness and sensory disturbances (such as numbness or tingling) were the most common neurological symptoms across all countries.
Insomnia was reported by nearly 60% of non-hospitalised US patients, compared with roughly one-third or fewer of patients in Colombia, Nigeria and India.
Statistical clustering showed a clear separation between the high- and upper-middle-income countries (US, Colombia) and the lower-middle-income countries (Nigeria, India).
Building on this work, Koralnik and his international collaborators are now studying cognitive rehabilitation treatments for long COVID brain fog in Colombia and Nigeria, using the same protocols developed for patients treated at the Shirley Ryan AbilityLab in Chicago.
Foxglove helps with heart failure: Cardiologists Professor Dr Udo Bavendiek (left) and Professor Dr Johann Bauersachs have scientifically proven the life-prolonging effect of digitoxin for the first time.
A multicentre study has demonstrated a clear positive effect of digitoxin, a drug derived from the red foxglove plant, in heart failure. Results from ten years of research involving more than 1200 participants have clearly confirmed the safety and efficacy of the cardiac glycoside in people diagnosed with heart failure with reduced ejection fraction.
Digitalis has been used to treat heart failure for more than 200 years. The drug digitoxin also belongs to this group of active ingredients known as cardiac glycosides. Although there were indications that digitalis was beneficial in heart failure, it has only now been scientifically proven that digitoxin has a significant positive effect in heart failure due to reduced pumping function and insufficient emptying of the left ventricle – known in medical terms as HFrEF (heart failure with reduced ejection fraction). For ten years, researchers led by Professor Dr Johann Bauersachs, Director of the Department of Cardiology and Angiology at Hanover Medical School (MHH), and senior physician Professor Dr Udo Bavendiek thoroughly investigated the safety and efficacy of the active ingredient in a clinical study involving more than 1200 participants.
The large-scale DIGIT-HF study coordinated by them, involving more than 50 centres in Germany, Austria and Serbia, has now been completed and delivers a clear result: adjunctive therapy with digitoxin reduces mortality and the number of hospitalisations for heart failure in patients with advanced HFrEF. The results have been published in the New England Journal of Medicine. At the same time, they were presented at the end of August 2025 at the European Society of Cardiology Congress in Madrid in the so-called Hot Line Session, where new clinical studies that promise significant changes in patient outcomes are presented.
No proof of effectiveness according to scientific standards to date
Our heart is a high-performance engine. It beats around 70 times per minute, pumping around five litres of blood through our vessels. In doing so, it supplies the body with vital oxygen and nutrients. If this pumping capacity is permanently reduced, physicians speak of chronic heart failure or cardiac insufficiency. Around four million people in Germany are affected. Symptoms include shortness of breath, low exercise tolerance, water retention, immobility and severe arrhythmia. The condition is one of the most common reasons for hospitalisation and even death. Until around 2020, digitalis preparations were still on the production list of major pharmaceutical companies. Currently, digitoxin is only produced as a generic drug. ‘However, it remains the most commonly used digitalis preparation in Germany – so far without any scientifically proven evidence of its effectiveness,’ notes Professor Bavendiek.
Can also be used in cases of impaired kidney function
This has now been proven. ‘In the DIGIT-HF study, we examined patients who had exhausted all conventional treatment options,’ says Professor Bauersachs. ‘We were surprised ourselves that we were able to achieve such a significant improvement with the additional digitoxin treatment in these very well-treated study participants.’ The usual medications for heart failure include beta blockers and inhibitors of the renin-angiotensin-aldosterone system, which inhibit excessively activated hormone cascades and thus relieve the heart, as well as diuretics. Defibrillators, which are implanted in the patient’s body, also help against acute arrhythmias. Since 2021, so-called SGLT-2 inhibitors have also been used in Germany. These were originally approved for the treatment of type 2 diabetes, but also have positive effects in all forms of heart failure. Thanks to the DIGIT-HF study, digitoxin could now become another mainstay in the treatment of people diagnosed with HFrEF.
Previous clinical studies have been conducted almost exclusively with digoxin, another cardiac glycoside. However, digoxin can only be used to a limited extent in patients with impaired kidney function, which is often the case in patients with advanced heart failure, as it is excreted almost exclusively via the kidneys. ‘With digitoxin, however, the situation is different,’ explains Professor Bavendiek. This is because digitoxin is excreted via the liver and intestines to a greater extent in patients with impaired kidney function. The already approved drug is therefore also suitable for use in patients with pre-existing kidney weakness.
Safe and cost-effective
In addition, the results of the DIGIT-HF study dispelled fears that digitoxin is dangerous for certain groups of patients with heart failure and could lead to death. ‘When dosed correctly, digitoxin is a safe treatment for heart failure and is also suitable for controlling heart rate in atrial fibrillation when beta blockers alone are not sufficient,’ emphasises Professor Bavendiek. Another advantage of the drug sounds trivial, but is certainly interesting in view of rising healthcare costs: digitoxin costs just a few pence and is drastically cheaper than other drugs for heart failure. Based on the study data available to date, heart specialists have already developed recommendations for simple and safe dosing. Whereas 0.1 milligrams of digitoxin were often prescribed in the past, the current recommendations are 0.07 milligrams per day or even less. The DIGIT-HF study showed that this dosage reduced mortality and hospital admissions due to heart failure without any safety issues.
Study raises questions around why female individuals are diagnosed later than males
Autism has long been viewed as a condition that predominantly affects male individuals, but a study from Sweden published by The BMJ shows that autism may actually occur at comparable rates among male and female individuals.
The results show a clear female catch-up effect during adolescence, which the researchers say highlights the need to investigate why female individuals receive diagnoses later than male individuals.
The prevalence of autism spectrum disorder (ASD) has increased over the past three decades, with a high male-to-female diagnosis ratio of around 4:1.
The increase in prevalence is thought to be linked to factors including wider diagnostic criteria and societal changes (eg, parental age), whilst the high male-to-female ratio has been attributed to better social and communication skills among girls, making autism more difficult to spot. However, so far no large study has examined these trends over the life course.
To address this, researchers used national registers to analyse diagnosis rates of autism for 2.7 million individuals born in Sweden between 1985 and 2022 who were tracked from birth to a maximum of 37 years of age.
During this follow-up period of more than 35 years, autism was diagnosed in 78,522 individuals (2.8%), at an average age of 14.3 years.
Diagnosis rates increased with each five-year age interval throughout childhood, peaking at 645.5 per 100,000 person-years for male individuals at age 10-14 years and 602.6 for female individuals at age 15-19 years.
However, while male individuals were more likely to have a diagnosis of autism in childhood, female individuals caught up during adolescence, giving a male to female ratio approaching 1:1 by age 20 years.
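Rates of this kind are the number of new diagnoses divided by the total follow-up time contributed by the cohort, scaled to 100,000 person-years. A brief Python sketch of the arithmetic, with illustrative numbers rather than the Swedish register counts:

# Illustrative only: an incidence rate per 100,000 person-years is
# cases / total follow-up time, scaled. These numbers are NOT the register data.
def rate_per_100k(cases, person_years):
    return cases / person_years * 100_000

# e.g. 6,455 diagnoses observed across 1,000,000 person-years of follow-up
print(f"{rate_per_100k(6_455, 1_000_000):.1f} per 100,000 person-years")  # 645.5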
This is an observational study and the authors acknowledge that they did not consider other conditions associated with autism, such as ADHD and intellectual disability. Nor were they able to control for shared genetic and environmental factors such as parental mental health.
However, they say the study size and duration enabled them to link data for a whole population and disentangle the effects of three different time scales: age, calendar period and birth cohort.
As such, they write: “These findings indicate that the male to female ratio for autism has decreased over time and with increasing age at diagnosis. This male to female ratio may therefore be substantially lower than previously thought, to the extent that, in Sweden, it may no longer be distinguishable by adulthood.”
“These observations highlight the need to investigate why female individuals receive diagnoses later than male individuals,” they conclude.
These findings align with recent research and seem to support the argument that current practices may be failing to recognise autism in many women until later in life, if at all, says Anne Cary, patient and patient advocate, in a linked editorial.
She notes that studies like this are essential to changing the assumption that autism is more prevalent in male individuals than in female individuals, but points out that as autistic female individuals await proper diagnosis, “they are likely to be (mis)diagnosed with psychiatric conditions, especially mood and personality disorders, and they are forced to self-advocate to be seen and treated appropriately: as autistic patients, just as autistic as their male counterparts.”
The development of a self-regulating, implantable living technology that could offer hope for millions with diabetes and other chronic diseases
The crystal capsules developed by the researchers. They made the cover of Science Translational Medicine.
A pioneering study marks a major step toward eliminating the need for daily insulin injections for people with diabetes. The research introduces a living, cell-based implant that can function as an autonomous artificial pancreas – essentially a long-term ‘living drug’ – made possible by a novel crystalline shield technology.
Once implanted, the system operates entirely on its own: it continuously senses blood-glucose levels, produces insulin within the implant itself, and releases the exact amount needed – precisely when it is needed. In effect, the implant becomes a self-regulating, drug-manufacturing organ inside the body, requiring no external pumps, injections, or patient intervention.
One of the study’s most significant breakthroughs addresses the longstanding challenge of immune rejection, which has limited the success of cell-based therapies for decades. The researchers developed engineered therapeutic crystals that shield the implant from the immune system, preventing it from being recognised as a foreign object. This protective strategy enables the implant to function reliably and continuously for several years.
The technology has already been successfully tested in a mouse model for effective and long-term regulation of glucose levels and in non-human primates for cell viability and functionality. These results represent a critical milestone and strongly support the potential for future translation to human patients.
From Postdoctoral Insight to Global Collaboration
The study was led by Assistant Professor Shady Farah of the Faculty of Chemical Engineering at the Technion – Israel Institute of Technology, with co-corresponding authors at MIT, and in collaboration with Harvard University, Johns Hopkins University, and the University of Massachusetts. Asst Prof Farah began developing the concept with colleagues in 2018 during his postdoctoral fellowship at MIT and Boston Children’s Hospital/Harvard Medical School, under the supervision of Prof Daniel Anderson and Prof Robert (Bob) Langer, a world leader in tissue engineering and co-founder of Moderna.
Today, the research continues in Asst Prof Farah’s laboratory at the Technion, in close collaboration with leading US institutions, including MIT, Harvard, the University of Massachusetts, Boston Children’s Hospital, and the Johns Hopkins University School of Medicine.
A Platform with Far-Reaching Potential
While the immediate focus is diabetes, the researchers emphasise that this implantable, closed-loop platform could be adapted to treat a wide range of chronic conditions requiring continuous delivery of biological therapeutics – including haemophilia and other metabolic or genetic diseases.
If successfully translated to the clinic, this technology could redefine how chronic diseases are treated, shifting from repeated drug administration to living, self-regulating therapies that work seamlessly from within.
A University of Exeter-led study has quantified the role of obesity in common long-term conditions, showing for the first time the effect of losing weight in preventing multiple diseases.
Conditions that often occur together may share an underlying cause, which can be key to prevention or treatment. The picture of which conditions co-occur is complex, so researchers paired them together, to allow them to identify shared causes more simply. The study found that obesity is the main shared cause between ten pairs of commonly occurring conditions.
The research specifically measured how much weight reduction would reduce the risk of the next diagnosis. In the largest study of its kind, published in the Nature Portfolio journal Communications Medicine, the team studied 71 conditions which often occur together, such as type 2 diabetes and osteoarthritis, or kidney disease and chronic obstructive pulmonary disease (COPD).
The GEMINI study, funded by the UKRI Medical Research Council and supported by the National Institute for Health and Care Research (NIHR), used genetics and healthcare data drawn from a number of large datasets internationally. They found that obesity was part of the cause for 61 of the 71 conditions. They also found that obesity explained all of the genetic overlap in ten pairs of conditions, suggesting it is the main driver for why they frequently occur together.
Body mass index, or BMI, is a scaled measure of weight – a number over 30 units indicates obesity, while less than 25 indicates “normal” weight. The study quantified how much a reduction in BMI would reduce the risk of both conditions at a population level for people who are overweight or living with obesity. For example, for every thousand people who have both chronic kidney disease and osteoarthritis, a BMI reduction of 4.5 units would have prevented 17 of them from developing both conditions; for type 2 diabetes and osteoarthritis, the figure is nine people per thousand.
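Figures such as ‘17 per thousand’ are absolute risk reductions scaled to a population of 1000. The brief Python sketch below reproduces that arithmetic with assumed before-and-after risks; the percentages are placeholders, not the study’s estimates.

# Sketch of the 'prevented cases per 1000' arithmetic. The risks below are
# ASSUMED for illustration; they are not the GEMINI study's estimates.
def prevented_per_1000(risk_before, risk_after):
    return (risk_before - risk_after) * 1000

# e.g. a 4.5-unit BMI reduction lowering the joint risk from 10.0% to 8.3%
print(f"{prevented_per_1000(0.100, 0.083):.0f} per 1000")  # 17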
The team also established the pairs of conditions where obesity is not the main cause and are now investigating other mechanisms.
Study lead Professor Jack Bowden, of the University of Exeter Medical School, said: “We’ve long known that certain diseases often occur together, and also that obesity increases the risk of many diseases. This large-scale study is the first to use genetics to quantify the role of obesity in causing diseases to occur in the same individuals. We found that for some disease pairings, obesity is the major driving force. Our research provides much more detail about the links between obesity and disease, which will help clinicians target specific advice to patients going forward.”
Study author Professor Jane Masoli, of the University of Exeter Medical School, who is a Consultant Geriatrician and regional NIHR Ageing lead, said: “Currently nine million people in the UK live with two or more long-term conditions. Understanding how to prevent diseases accumulating is a key national research and healthcare priority. This study further strengthens the case to tackle obesity through public health programmes, reinforcing the importance of lifelong obesity management in the NHS strategy on prevention. Our work shows that this could reduce the risk of accumulating multiple health conditions, supporting people to live longer, healthier lives.”
This research represents another important publication from the GEMINI (Genetic Evaluation of Multimorbidity towards INdividualisation of Interventions) collaborative. Led by the University of Exeter, GEMINI includes people with multimorbidity, health care professionals (including those in primary care), and experts in statistics and genetics, and was one of six programmes funded by the UKRI strategic priorities fund, an £830 million investment in multimorbidity research.
The GEMINI team are working to further understand why some conditions more frequently co-occur in the same patients. The team are quantifying the role of other, known modifiable risk factors beyond obesity, and are finding novel genes and pathways that could point to new ways to intervene and improve health. GEMINI data, results, and code are free to download (https://github.com/GEMINI-multimorbidity), and the pairwise genetic and observational correlations can be viewed interactively (https://gemini-multimorbidity.shinyapps.io/atlas/).
Antioxidants have been marketed as miracle supplements, touted for preventing chronic diseases and cancers; treating COPD and dementia; and slowing aging. While antioxidant therapies are widely used to treat male infertility, a new study from the Texas A&M College of Veterinary Medicine and Biomedical Sciences (VMBS) found that regularly consuming high doses of antioxidants negatively influences sperm DNA and may lead to offspring born with differences in craniofacial development.
In a study, published in the journal Frontiers in Cell and Developmental Biology, a team of researchers led by Dr Michael Golding examined the effects of N-acetyl-L-cysteine (NAC) and selenium (Se) – two widely used antioxidants – in mouse models.
They found that offspring of male mice exposed to antioxidants for six weeks exhibited skull and facial shape differences, even though the fathers’ own health did not change.
These findings suggest that men should exercise caution when consuming high doses of antioxidants, especially if they’re planning to have children in the near future.
When good goes too far
Antioxidants like NAC — which is a key ingredient in many nutritional supplements, including multivitamins — are often used to treat oxidative stress, which can be caused by excessive alcohol consumption.
Because Golding’s lab has been studying the effects of parental alcohol consumption on offspring – and has successfully correlated this consumption to a whole host of issues in children born to males who consumed excessive amounts of alcohol, including craniofacial abnormalities – his team was interested in the impacts of adding NAC or Se to a male mouse’s diet.
“We know alcohol causes oxidative stress and we were looking to push back on it by adding a supplement known to lower oxidative stress,” said Golding, a professor in the VMBS’ Department of Veterinary Physiology and Pharmacology. “When we realised that offspring born to males that had only been given NAC were displaying skull and facial differences, it was a surprise because this molecule is universally thought to be good.
“When we sat down to think it through, we realised that it makes sense – you take a multivitamin to ensure that you’re in balance, but if the thing that you’re taking to ensure you’re in balance is unbalanced (the dose of antioxidants is too high), then you’re not doing a good thing.”
It is well established that high doses of antioxidants can have negative impacts; research has proven that antioxidants can diminish the effects of exercise in endurance athletes, for example, and, in the case of professional athletes, can lead to negative outcomes in performance metrics.
“Sperm health is another performance metric; it’s just not one that we think about in everyday life,” Golding said. “If you’re taking a high dose antioxidant, you could be diminishing your reproductive fitness and part of the journey toward the bad outcome is going to be the effects on the offspring.”
What the face reveals about the brain
Among their unanticipated findings was that female offspring, in particular, exhibited significantly closer-set eyes and smaller skulls, which are also symptoms of foetal alcohol syndrome.
“There’s a very commonly accepted truism in paediatric medicine that the face mirrors the brain, because the brain and the face form at the same time,” Golding said. “When your face migrates (during gestation), it’s using cues from your brain to know where to go, and if the two things are not aligned, there’s either a delay or some kind of abnormality in brain development.
“So, if you see abnormalities in the midline of the face, you’re probably going to see midline abnormalities in the brain,” he said. “People with these abnormalities typically have problems with impulse control, neurological conditions like epilepsy, and other developmental issues.”
Whether the offspring in this project will exhibit central nervous system dysfunction will require further study.
The dose makes the difference
While the lab continues to research this “unexplored frontier,” Golding says in the case of antioxidants, too much of a good thing can, in fact, be too much, especially in the absence of a medical reason to take an antioxidant supplement.
Because many men regularly take high doses of these supplements – including products that contain antioxidant-rich ingredients – it’s important to pay attention to the amounts of these compounds listed on the label. This includes NAC, which is one of the key ingredients in many multivitamins and is often found in high doses in these pills.
“The larger message here is that there’s a balance,” Golding said. “Think of yourself as a plant — if you stick your plant out in the sun too long, it’s going to get dehydrated. If you overwater your plant, it gets root rot. But if you have the right balance of sunshine and water, that’s when growth occurs. Health is in that domain.
“If your vitamins are providing 1,000% of the recommended daily amount, you should be cautious,” he said. “If you stick to the 100% range, then you should be OK.”