New paper critically examines the US Food and Drug Administration’s regulatory framework for artificial intelligence-powered healthcare products, highlighting gaps in safety evaluations, post-market surveillance, and ethical considerations.
An agile, transparent, and ethics-driven oversight system is needed for the U.S. Food and Drug Administration (FDA) to balance innovation with patient safety when it comes to artificial intelligence-driven medical technologies. That is the takeaway from a new report issued to the FDA, published this week in the open-access journal PLOS Medicine by Leo Celi of the Massachusetts Institute of Technology, and colleagues.
Artificial intelligence is becoming a powerful force in healthcare, helping doctors diagnose diseases, monitor patients, and even recommend treatments. Unlike traditional medical devices, many AI tools continue to learn and change after they’ve been approved, meaning their behaviour can shift in unpredictable ways once they’re in use.
In the new paper, Celi and his colleagues argue that the FDA’s current system is not set up to keep tabs on these post-approval changes. Their analysis calls for stronger rules around transparency and bias, especially to protect vulnerable populations. If an algorithm is trained mostly on data from one group of people, it may make mistakes when used with others. The authors recommend that developers be required to share information about how their AI models were trained and tested, and that the FDA involve patients and community advocates more directly in decision-making. They also suggest practical fixes, including creating public data repositories to track how AI performs in the real world, offering tax incentives for companies that follow ethical practices, and training medical students to critically evaluate AI tools.
“This work has the potential to drive real-world impact by prompting the FDA to rethink existing oversight mechanisms for AI-enabled medical technologies. We advocate for a patient-centred, risk-aware, and continuously adaptive regulatory approach – one that ensures AI remains an asset to clinical practice without compromising safety or exacerbating healthcare disparities,” the authors say.
Comparing intermittent fasting with traditional daily calorie restriction, researchers at the University of Colorado Anschutz Medical Campus found greater weight loss among the intermittent fasting group, a significant finding given that most previous studies reported no notable difference between the two diet strategies.
Singling out the 4:3 plan of the popular intermittent fasting (IMF) model – where dieters eat freely four days a week with three days a week of intense calorie restriction – the researchers found an average body weight loss of 7.6% among IMF participants at the one-year mark compared with 5% in the daily caloric restriction (DCR) group.
“It was surprising and exciting to me that it was better,” said Victoria Catenacci, MD, co-lead author and associate professor of endocrinology at the CU School of Medicine.
“The more important message to me is that this is a dietary strategy that is an evidence-based alternative, especially for people who have tried DCR and found it difficult,” Catenacci said, noting the weight-loss difference was modest.
An endocrinologist who specializes in obesity medicine, Catenacci targets a decades-long health crisis in the United States, where 40% of Americans 20 and older meet the medical criteria for obesity. She works at the CU Anschutz Health and Wellness Center (AHWC), the study’s primary site.
She and co-lead author Danielle Ostendorf, PhD, who worked on the study as a post-doctoral fellow with Catenacci in 2018 and has since moved to the University of Tennessee Knoxville, share more about the research in the Q&A below.
While you’re probably not pouring your morning cup for the long-term health benefits, coffee consumption has been linked to lower risk of mortality. In a new observational study, researchers from Tufts University found the association between coffee consumption and mortality risk changes with the amount of sweeteners and saturated fat added to the beverage.
The study, published online in The Journal of Nutrition, found that consumption of 1-2 cups of caffeinated coffee per day was linked to a lower risk of death from all causes and death from cardiovascular disease. Black coffee and coffee with low levels of added sugar and saturated fat were associated with a 14% lower risk of all-cause mortality as compared to no coffee consumption. The same link was not observed for coffee with high amounts of added sugar and saturated fat.
“Coffee is among the most-consumed beverages in the world, and with nearly half of American adults reporting drinking at least one cup per day, it’s important for us to know what it might mean for health,” said study senior author Professor Fang Fang Zhang at the Gerald J. and Dorothy R. Friedman School of Nutrition Science and Policy. “The health benefits of coffee might be attributable to its bioactive compounds, but our results suggest that the addition of sugar and saturated fat may reduce the mortality benefits.”
The study analysed data from nine consecutive cycles of the National Health and Nutrition Examination Survey (NHANES) from 1999 to 2018, linked to National Death Index Mortality Data. The study included a nationally representative sample of 46 000 adults aged 20 years and older who completed valid first-day 24-hour dietary recalls. Coffee consumption was categorised by type (caffeinated or decaffeinated), sugar, and saturated fat content. Mortality outcomes included all-cause, cancer, and cardiovascular disease. Low added sugar (from granulated sugar, honey, and syrup) was defined as under 5% of the Daily Value, which is 2.5 grams per 8-ounce [250mL] cup, or approximately half a teaspoon of sugar. Low saturated fat (from milk, cream, and half-and-half) was defined as under 5% of the Daily Value, or 1 gram per 250mL cup – the equivalent of 5 tablespoons of 2% milk, 1 tablespoon of light cream, or 1 tablespoon of half-and-half.
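For concreteness, the per-cup cut-offs above can be expressed as a tiny classifier. This is only a sketch of the thresholds quoted in the text; the function and its names are invented, not from the paper.

```python
# Illustrative classifier for the study's "low added sugar / low saturated
# fat" coffee categories, using the per-cup thresholds quoted above
# (under 2.5 g added sugar and about 1 g saturated fat per 250 mL cup).
# The function and its names are hypothetical, not from the paper.

def classify_cup(added_sugar_g: float, sat_fat_g: float) -> str:
    """Label one 250 mL cup by its additive content."""
    low_sugar = added_sugar_g < 2.5  # under 5% of the Daily Value
    low_fat = sat_fat_g <= 1.0       # about 5% of the Daily Value
    return "low-additive" if (low_sugar and low_fat) else "high-additive"

print(classify_cup(0.0, 0.0))  # black coffee -> low-additive
print(classify_cup(8.0, 3.0))  # heavily sweetened, creamy -> high-additive
```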
In the study, consumption of at least one cup per day was associated with a 16% lower risk of all-cause mortality. At 2-3 cups per day, the link rose to 17%. Consumption beyond three cups per day was not associated with additional reductions, and the link between coffee and a lower risk of death by cardiovascular disease weakened when coffee consumption was more than three cups per day. No significant associations were seen between coffee consumption and cancer mortality.
“Few studies have examined how coffee additives could impact the link between coffee consumption and mortality risk, and our study is among the first to quantify how much sweetener and saturated fat are being added,” said first author Bingjie Zhou, a recent Ph.D. graduate from the nutrition epidemiology and data science program at the Friedman School. “Our results align with the Dietary Guidelines for Americans which recommend limiting added sugar and saturated fat.”
Limitations of the study include the fact that self-reported recall data is subject to measurement error due to day-to-day variations in food intake. The lack of significant associations between decaffeinated coffee and all-cause mortality could be due to the low consumption among the population studied.
New USC research indicates how iron-related oxidative damage and cell death may hasten the development of Alzheimer’s disease in people with Down syndrome
Scientists at the University of Southern California have discovered a key connection between high levels of iron in the brain and increased cell damage in people who have both Down syndrome and Alzheimer’s disease.
In the study, researchers found that the brains of people diagnosed with Down syndrome and Alzheimer’s disease (DSAD) had twice as much iron and more signs of oxidative damage in cell membranes compared to the brains of individuals with Alzheimer’s disease alone or those with neither diagnosis. The results, published in Alzheimer’s & Dementia: The Journal of the Alzheimer’s Association, point to a specific cellular death process that is mediated by iron, and the findings may help explain why Alzheimer’s symptoms often appear earlier and more severely in individuals with Down syndrome.
“This is a major clue that helps explain the unique and early changes we see in the brains of people with Down syndrome who develop Alzheimer’s,” said Max Thorwald, lead author of the study and a postdoctoral fellow in the laboratory of University Professor Emeritus Caleb Finch at the USC Leonard Davis School. “We’ve known for a long time that people with Down syndrome are more likely to develop Alzheimer’s disease, but now we’re beginning to understand how increased iron in the brain might be making things worse.”
Down syndrome and Alzheimer’s
Down syndrome is caused by having an extra third copy, or trisomy, of chromosome 21. This chromosome includes the gene for amyloid precursor protein, or APP, which is involved in the production of amyloid-beta (Aβ), the sticky protein that forms telltale plaques in the brains of people with Alzheimer’s disease.
Because people with Down syndrome have three copies of the APP gene instead of two, they tend to produce more of this protein. By the age of 60, about half of all people with Down syndrome show signs of Alzheimer’s disease, which is approximately 20 years earlier than in the general population.
“This makes understanding the biology of Down syndrome incredibly important for Alzheimer’s research,” said Finch, the study’s senior author.
Key findings point to ferroptosis
The research team studied donated brain tissue from individuals with Alzheimer’s, DSAD, and those without either diagnosis. They focused on the prefrontal cortex — an area of the brain involved in thinking, planning, and memory — and made several important discoveries:
Iron levels much higher in DSAD brains: Compared to the other groups, DSAD brains had twice the amount of iron in the prefrontal cortex. Scientists believe this buildup comes from tiny brain blood vessel leaks called microbleeds, which occur more frequently in DSAD than in Alzheimer’s and are correlated with higher amounts of APP.
More damage to lipid-rich cell membranes: Cell membranes are made of fatty compounds called lipids and can be easily damaged by chemical stress. In DSAD brains, the team found more byproducts of this type of damage, known as lipid peroxidation, compared to amounts in Alzheimer’s-only or control brains.
Weakened antioxidant defense systems: The team found that the activity of several key enzymes that protect the brain from oxidative damage and repair cell membranes was lower in DSAD brains, especially in areas of the cell membrane called lipid rafts.
Together, these findings indicate increased ferroptosis, a type of cell death characterised by iron-dependent lipid peroxidation, Thorwald explained: “Essentially, iron builds up, drives the oxidation that damages cell membranes, and overwhelms the cell’s ability to protect itself.”
Lipid rafts: a hotspot for brain changes
The researchers paid close attention to lipid rafts — tiny parts of the brain cell membrane that play crucial roles in cell signalling and regulate how proteins like APP are processed. They found that in DSAD brains, lipid rafts had much more oxidative damage and fewer protective enzymes compared to Alzheimer’s or healthy brains.
Notably, these lipid rafts also showed increased activity of the enzyme β-secretase, which interacts with APP to produce Aβ proteins. The combination of more damage and more Aβ production may promote the growth of amyloid plaques, thus speeding up Alzheimer’s progression in people with Down syndrome, Finch explained.
Rare Down syndrome variants offer insight
The researchers also studied rare cases of individuals with “mosaic” or “partial” Down syndrome, in which the third copy of chromosome 21 is present in only a subset of the body’s cells. These individuals had lower levels of APP and iron in their brains and tended to live longer. In contrast, people with full trisomy 21 and DSAD had shorter lifespans and higher levels of brain damage.
“These cases really support the idea that the amount of APP — and the iron that comes with it — matters a lot in how the disease progresses,” Finch said.
Looking ahead
The team says their findings could help guide future treatments, especially for people with Down syndrome who are at high risk of Alzheimer’s. Early research in mice suggests that iron-chelating treatments, in which medicine binds to the metal ions and allows them to leave the body, may reduce indicators of Alzheimer’s pathology, Thorwald noted.
“Medications that remove iron from the brain or help strengthen antioxidant systems might offer new hope,” Thorwald said. “We’re now seeing how important it is to treat not just the amyloid plaques themselves but also the factors that may be hastening the development of those plaques.”
For many, fitness trackers have become indispensable tools for monitoring how many calories they’ve burned in a day. But for those living with obesity, who are known to exhibit differences in walking gait, speed, energy burned and more, these devices often inaccurately measure activity – until now.
Scientists at Northwestern University have developed a new algorithm that enables smartwatches to more accurately monitor the calories burned by people with obesity during various physical activities.
The technology bridges a critical gap in fitness technology, said Nabil Alshurafa, whose Northwestern lab, HABits Lab, created and tested the open-source, dominant-wrist algorithm specifically tuned for people with obesity. It is transparent, rigorously testable and ready for other researchers to build upon. Their next step is to deploy an activity-monitoring app later this year that will be available for both iOS and Android use.
“People with obesity could gain major health insights from activity trackers, but most current devices miss the mark,” said Alshurafa, associate professor of behavioral medicine at Northwestern University Feinberg School of Medicine.
Current activity-monitoring algorithms in fitness trackers were built for people without obesity. Hip-worn trackers often misread energy burn because of gait changes and device tilt in people with higher body weight, Alshurafa said. Wrist-worn models, meanwhile, promise better comfort, adherence and accuracy across body types, but no one had rigorously tested or calibrated them for this group, he said.
“Without a validated algorithm for wrist devices, we’re still in the dark about exactly how much activity and energy people with obesity really get each day – slowing our ability to tailor interventions and improve health outcomes,” said Alshurafa. His team tested the lab’s algorithm against 11 state-of-the-art algorithms designed by researchers using research-grade devices, and used wearable cameras to catch every moment when wrist sensors missed the mark on calorie burn.
The findings will be published June 19 in Scientific Reports.
The exercise class that motivated the research
Alshurafa was motivated to create the algorithm after attending an exercise class with his mother-in-law who has obesity.
“She worked harder than anyone else, yet when we glanced at the leaderboard, her numbers barely registered,” Alshurafa said. “That moment hit me: fitness shouldn’t feel like a trap for the people who need it most.”
Algorithm rivals gold-standard methods
By using data from commercial fitness trackers, the new model rivals gold-standard methods of measuring energy burn and can estimate how much energy someone with obesity is using every minute, achieving over 95% accuracy in real-world situations. This advancement makes it easier for more people with obesity to track their daily activities and energy use, Alshurafa said.
How the study measured energy burn
In one group, 27 study participants wore a fitness tracker and metabolic cart – a mask that measures the volume of oxygen the wearer inhales and the volume of carbon dioxide the wearer exhales to calculate their energy burn (in kilocalories/kCals) and resting metabolic rate. The study participants went through a set of physical activities to measure their energy burn during each task. The scientists then looked at the fitness tracker results to see how they compared to the metabolic cart results.
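The cart’s arithmetic can be illustrated with the abbreviated Weir equation, a standard formula in indirect calorimetry that converts measured gas volumes into energy expenditure. This is a general sketch; the study does not state which equation its particular device used.

```python
# Sketch of how a metabolic cart turns gas measurements into energy burn,
# using the abbreviated Weir equation - a standard indirect-calorimetry
# formula; the study does not specify the exact equation its cart used.

def weir_kcal_per_min(vo2_l_per_min: float, vco2_l_per_min: float) -> float:
    """Energy expenditure (kcal/min) from oxygen consumed and carbon
    dioxide produced, both in litres per minute."""
    return 3.941 * vo2_l_per_min + 1.106 * vco2_l_per_min

# Typical resting values: ~0.25 L/min O2 consumed, ~0.20 L/min CO2 produced.
print(round(weir_kcal_per_min(0.25, 0.20), 2))  # about 1.2 kcal/min at rest
```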
In another group, 25 study participants wore a fitness tracker and body camera while just living their lives. The body camera allowed the scientists to visually confirm when the algorithm over- or under-estimated kCals.
At times, Alshurafa said he would challenge study participants to do as many pushups as they could in five minutes.
“Many couldn’t drop to the floor, but each one crushed wall-pushups, their arms shaking with effort,” he said. “We celebrate ‘standard’ workouts as the ultimate test, but those standards leave out so many people. These experiences showed me we must rethink how gyms, trackers and exercise programs measure success – so no one’s hard work goes unseen.”
A group of infectious disease and public health experts are calling on the Department of Health and Minister Aaron Motsoaledi to reintroduce a national action plan addressing antimicrobial resistance (AMR).
An open letter from over 70 doctors, scientists and public health advisors states that antibiotic resistance is a “growing threat” in the country and endangers universal health coverage through the National Health Insurance.
Latest figures show that over one million deaths a year worldwide are directly caused by AMR, a number projected to increase. Nearly five million people a year die with an antibiotic-resistant infection. Over the next 25 years, nearly 40 million people are projected to die from AMR.
The open letter also called on the department to reinstate a ministerial advisory committee on AMR or to establish a similar scientific body.
“The lack of a robust scientific advisory body limits the government’s capacity to develop evidence-based policies,” the letter reads. The establishment of a scientific body would “empower the government to make strategic, data-driven decisions to combat this pressing health threat effectively”.
The former Ministerial Advisory Committee was disbanded in November 2023.
Marc Mendelson, an infectious disease specialist at Groote Schuur Hospital who has been outspoken about the threat of AMR for many years, said: “AMR is a current pandemic which is wreaking havoc, is not being attended to properly and not being taken seriously enough in South Africa.”
Mendelson said that there are “more and more people having to be treated for highly resistant bacterial infections in our healthcare system”. AMR leads to an increase in morbidity, mortality and hospital costs, and also has socio-economic consequences, he said. A common medical intervention such as surgery “becomes much riskier” with AMR.
Department of Health spokesperson Foster Mohale said that the department would only comment once the letter was formally presented, which is expected to happen at 5pm on Thursday.
Self-esteem scores more than doubled within one year of weight-loss surgery, according to a new study* presented at the American Society for Metabolic and Bariatric Surgery (ASMBS) 2025 Annual Scientific Meeting.
Researchers from Geisinger Medical Center found that after bariatric surgery self-esteem scores rose to 77.5 from 33.6 – a more than 40-point increase. The higher the score on a scale from 0 to 100, the higher the level of self-esteem and quality of life. The amount of weight loss appears to fuel the increase in self-esteem: scores were highest among those who lost the most weight, regardless of demographic differences including gender, age, and race, or the type of bariatric procedure.
Researchers used a prospectively maintained database to identify 5,749 patients aged 18 and older with a body mass index (BMI) of 35 or more who had metabolic and bariatric surgery between 2006 and 2019. Patients completed the Impact of Weight on Quality of Life (IWQOL) survey pre-operatively and 12 months after the operation to assess weight stigma and their quality of life.
“Understanding weight stigma and psychosocial factors associated with obesity is essential to offering holistic care. While these factors should not dictate the decision to have bariatric surgery, they should be an important part of the conversation,” said study co-author Justin Dhyani, MD, Geisinger Medical Center in Danville, PA.
Weight stigma is associated with adverse health outcomes including depression, anxiety, disordered eating, and low self-esteem. Among adults with obesity, the prevalence of weight discrimination is 19% to 42%, with higher rates reported among those with higher BMIs and women.
“Weight stigma is a serious issue that places an extra psychological burden on patients struggling with obesity and there is no excuse for it,” said Ann M. Rogers, MD, FACS, FASMBS, President, ASMBS, who was not involved in the study. “This study shows we need to understand what patients are going through and be supportive and empowering of them as they navigate their health and make decisions about treatment.”
Genetic material shed by tumours can be detected in the bloodstream three years prior to cancer diagnosis, according to a study led by investigators at Johns Hopkins.
The study, partly funded by the National Institutes of Health, was published in Cancer Discovery.
Investigators were surprised they could detect cancer-derived mutations in the blood so much earlier, says lead study author Yuxuan Wang, MD, PhD, an assistant professor of oncology at the Johns Hopkins University School of Medicine. “Three years earlier provides time for intervention. The tumours are likely to be much less advanced and more likely to be curable.”
To determine how early cancers could be detected prior to clinical signs or symptoms, Wang and colleagues assessed plasma samples that were collected for the Atherosclerosis Risk in Communities (ARIC) study, a large National Institutes of Health-funded study to investigate risk factors for heart attack, stroke, heart failure and other cardiovascular diseases. They used highly accurate and sensitive sequencing techniques to analyse blood samples from 26 participants in the ARIC study who were diagnosed with cancer within six months after sample collection, and 26 from similar participants who were not diagnosed with cancer.
At the time of blood sample collection, eight of these 52 participants scored positively on a multicancer early detection (MCED) laboratory test. All eight were diagnosed within four months following blood collection. For six of the eight individuals, investigators also were able to assess additional blood samples collected 3.1–3.5 years prior to diagnosis, and in four of these cases, tumour-derived mutations could also be identified in samples taken at the earlier timepoint.
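The detection fractions reported above can be reproduced with simple arithmetic on the study’s own numbers (all eight MCED-positive participants were among the 26 who went on to a cancer diagnosis); the variable names are illustrative.

```python
# Pure arithmetic on the figures quoted in the text - no new data.
cancer_cases = 26          # participants diagnosed within six months
positive_at_draw = 8       # MCED-positive at the baseline blood draw
earlier_positive, earlier_tested = 4, 6  # samples from 3.1-3.5 years prior

print(f"{positive_at_draw / cancer_cases:.0%}")    # share of cases flagged: 31%
print(f"{earlier_positive / earlier_tested:.0%}")  # earlier samples positive: 67%
```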
“This study shows the promise of MCED tests in detecting cancers very early, and sets the benchmark sensitivities required for their success,” says Bert Vogelstein, MD, Clayton Professor of Oncology, co-director of the Ludwig Center at Johns Hopkins and a senior author on the study.
“Detecting cancers years before their clinical diagnosis could help provide management with a more favourable outcome,” adds Nickolas Papadopoulos, PhD, professor of oncology, Ludwig Center investigator and senior author of the study. “Of course, we need to determine the appropriate clinical follow-up after a positive test for such cancers.”
While scientists have long known that different senses activate different parts of the brain, a new Yale-led study indicates that multiple senses all stimulate a critical region deep in the brain that controls consciousness.
The study, published in the journal NeuroImage, sheds new light on how sensory perception works in the brain and may fuel the development of therapies to treat disorders involving attention, arousal, and consciousness.
In the study, a research team led by Yale’s Aya Khalaf focused on the workings of subcortical arousal systems, networks of deep brain structures that play a crucial role in regulating sleep-wake states. Previous studies of patients with disorders of consciousness, such as coma or epilepsy, have confirmed the influence of these systems on states of consciousness.
But prior research has been largely limited to tracking individual senses. For the new study, researchers asked if stimuli from multiple senses share the same subcortical arousal networks. They also looked at how shifts in a subject’s attention might affect these networks.
For the study, researchers analysed fMRI (functional magnetic resonance imaging) datasets collected from 1,561 healthy adult participants as they performed 11 different tasks using four senses: vision, audition, taste, and touch.
They made two important discoveries: that sensory input does make use of shared subcortical systems and, more surprisingly, that all input, regardless of which sense delivered the signal, stimulates activity in two deep brain regions, the midbrain reticular formation and the central thalamus, when a subject is sharply focused on the senses.
The key to stimulating the critical central brain regions, they found, were the sudden shifts in attention demanded by the tasks.
“We were expecting to find activity on shared networks, but when we saw all the senses light up the same central brain regions while a test subject was focusing, it was really astonishing,” said Khalaf, a postdoctoral associate in neurology at Yale School of Medicine and lead author of the study.
The discovery highlighted how key these central brain regions are in regulating not only disorders of consciousness, but also conditions that impact attention and focus, such as attention deficit hyperactivity disorder. This finding could lead to better targeted medications and brain stimulation techniques for patients.
“This has also given us insights into how things work normally in the brain,” said senior author Hal Blumenfeld, the Mark Loughridge and Michele Williams Professor of Neurology who is also a professor in neuroscience and neurosurgery and director of the Yale Clinical Neuroscience Imaging Center. “It’s really a step forward in our understanding of awareness and consciousness.”
Looking across senses, this is the first time researchers have seen a result like this, said Khalaf, who is also part of Blumenfeld’s lab.
“It tells us how important this brain region is and what it could mean in efforts to restore consciousness,” she said.
The potential role of vitamin D in preventing and treating colorectal cancer (CRC) has attracted growing research interest – especially as CRC rates are rising, particularly among younger adults. This isn’t a new area of study. Low vitamin D levels have long been linked to a higher risk of developing colorectal cancer.
One large study involving over 12 000 participants found that people with low blood levels of vitamin D had a 31% greater risk of developing CRC compared to those with higher levels. Similarly, another study reported a 25% lower CRC risk among individuals with high dietary vitamin D intake.
Data from the Nurses’ Health Study – a long-term investigation of American nurses – showed that women with the highest vitamin D intake had a 58% lower risk of developing colorectal cancer compared to those with the lowest intake.
Now, a review highlights vitamin D’s promise in colorectal cancer prevention and treatment – but also underscores the complexity and contradictions in current research.
While observational data, which track people’s vitamin D levels and subsequent health, and mechanistic studies, which investigate how vitamin D works in the laboratory, suggest protective effects, this hasn’t been confirmed by larger trials.
In fact, randomised controlled trials (RCTs) – studies in which some people receive vitamin D and others don’t, and the gold standard by which treatments are judged – reveal inconsistent outcomes, highlighting the need for a balanced approach before vitamin D is integrated into public health strategies.
Vitamin D is synthesised in the skin in response to sunlight and exerts its biological effects through vitamin D receptors (VDRs) found throughout the body, including in colon tissue. When activated, these receptors help regulate gene activity related to inflammation, immune response and cell growth – processes central to cancer development and progression.
Preclinical studies have shown that the active form of vitamin D (calcitriol) can suppress inflammation, boost immune surveillance (the immune system’s ability to detect abnormal cells), inhibit tumour blood vessel growth and regulate cell division – a key factor in cancer development, as demonstrated in my recent research.
Epidemiological studies, which track health outcomes across large populations over time, consistently find that people with higher blood levels of vitamin D have a lower risk of developing CRC. This paints a hopeful picture, suggesting that something as simple as getting more vitamin D – via sun exposure, diet, or supplements – could lower cancer risk.
But the story gets more complicated.
Mixed results
When it comes to medical decision-making, randomised controlled trials (RCTs) are the gold standard. These studies randomly assign participants to receive either a treatment (like vitamin D) or a placebo, helping eliminate bias and isolate cause-and-effect relationships.
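The allocation step can be sketched in a few lines of code. This is illustrative only: the participant IDs and group size are invented, and real trials use more sophisticated schemes (such as blocked or stratified randomisation).

```python
import random

# Minimal sketch of randomised assignment: each participant has an equal
# chance of receiving vitamin D or placebo, so known and unknown
# confounders balance out across arms on average. Illustrative only -
# participant IDs and group size are invented.
random.seed(42)  # fixed seed so the allocation is reproducible

participants = [f"P{i:03d}" for i in range(1, 11)]
assignments = {p: random.choice(["vitamin D", "placebo"]) for p in participants}

for person, arm in assignments.items():
    print(person, "->", arm)
```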
Unfortunately, RCTs on vitamin D and CRC have produced mixed results.
For example, the VITAL trial – a major RCT involving over 25 000 participants – found no significant reduction in overall colorectal cancer incidence with 2000 IU/day of vitamin D supplementation over several years.
However, a meta-analysis of seven RCTs did show a 30% improvement in CRC survival rates with vitamin D supplements, suggesting potential benefits later in the disease course rather than for prevention.
On the other hand, the Vitamin D/Calcium Polyp Prevention Trial found no reduction in the recurrence of adenomas (pre-cancerous growths) with supplementation, raising questions about who benefits most, and at what dosage.
Adding to the uncertainty is the question of causation. Does low vitamin D contribute to cancer development? Or does the onset of cancer reduce vitamin D levels in the body? It’s also possible that the observed benefits are partly due to increased sunlight exposure, which itself may have independent protective effects.
The big picture
These discrepancies highlight the importance of considering the “totality of evidence” – treating each study as one piece of a larger puzzle.
The biologic plausibility is there. Observational and mechanistic studies suggest a meaningful link between vitamin D and lower CRC risk. But the clinical evidence isn’t yet strong enough to recommend vitamin D as a standalone prevention or treatment strategy.
That said, maintaining sufficient vitamin D levels – at least 30ng/mL – is a low-risk, cost-effective health measure. And when combined with other strategies like regular screening, a healthy diet, physical activity, and personalised care, vitamin D could still play a valuable role in overall cancer prevention.
Vitamin D is not a miracle cure – but it is part of a much broader picture. Its role in colorectal cancer is promising but still being defined. While it’s not time to rely on supplements alone, ensuring adequate vitamin D levels – through sun exposure, diet, or supplements – remains a smart choice for your health.
Colorectal cancer is a complex disease, and tackling it requires an equally nuanced approach. For now, that means focusing on evidence-based lifestyle changes, regular screenings, and staying informed as new research unfolds.