Author: ModernMedia

Could Metformin Help Prevent Dementia?

Created with Gencraft. CC4.0

New research in Diabetes, Obesity and Metabolism reveals that metformin, a medication traditionally prescribed to treat diabetes, is linked to lower risks of dementia and early death.

In the study by investigators at Taipei Medical University that included 452,777 adults with varying degrees of overweight and obesity, 35,784 cases of dementia and 76,048 deaths occurred over 10 years. Metformin users exhibited significantly lower risks of both dementia and all-cause death than nonusers.

The benefits of metformin were seen across all categories of overweight, obesity, and severe obesity, with 8–12% lower risks of dementia and 26–28% lower risks of death.

“Although our study results are promising for metformin’s effects on dementia and mortality, further research is required to explore the mechanisms involved,” said co-corresponding author Chiehfeng Chen, MD, PhD, MPH.

Source: Wiley

“Skin in a Syringe” a Step Towards a New Way to Heal Burns

Researchers in fields such as regenerative medicine and materials science have collaborated to develop a gel containing living cells that can be 3D-printed into a transplant. Photographer: Magnus Johansson

Finding a way to replicate the skin's complicated dermis layer has long been a goal in the treatment of burn wounds, as it would greatly reduce scarring and restore function. Researchers at Linköping University have developed a gel containing living cells that can be 3D-printed into a transplant, which then sticks to the wound and creates a scaffold on which the dermis can grow.

Large burns are often treated by transplanting a thin layer of the top part of the skin, the epidermis, which is basically composed of a single cell type. Transplanting only this part of the skin leads to severe scarring.

“Skin in a syringe”

Beneath the epidermis is the dermis, which has the blood vessels, nerves, hair follicles and other structures necessary for skin function and elasticity. However, transplanting the dermis as well is rarely an option, as harvesting it leaves a wound as large as the one to be healed. The trick is to create new skin that does not become scar tissue but a functioning dermis.

“The dermis is so complicated that we can’t grow it in a lab. We don’t even know what all its components are. That’s why we, and many others, think that we could possibly transplant the building blocks and then let the body make the dermis itself,” says Johan Junker, researcher at the Swedish Center for Disaster Medicine and Traumatology and docent in plastic surgery at Linköping University, who led the study published in Advanced Healthcare Materials.

The most common cell type in the dermis, the connective tissue cell or fibroblast, is easy to remove from the body and grow in a lab. The connective tissue cell also has the advantage of being able to develop into more specialised cell types depending on what is needed. The researchers behind the study provide a scaffold by having the cells grow on tiny, porous beads of gelatine, a substance similar to skin collagen. But a liquid containing these beads poured on a wound will not stay there.

The researchers’ solution to the problem is mixing the gelatine beads with a gel consisting of another body-specific substance, hyaluronic acid. When the beads and gel are mixed, they are connected using what is known as click chemistry. The result is a gel that, somewhat simplified, can be called skin in a syringe.

“The gel has a special feature that means that it becomes liquid when exposed to light pressure. You can use a syringe to apply it to a wound, for example, and once applied it becomes gel-like again. This also makes it possible to 3D print the gel with the cells in it,” says Daniel Aili, professor of molecular physics at Linköping University, who led the study together with Johan Junker.

3D-printed transplant

In the current study, the researchers 3D-printed small pucks that were placed under the skin of mice. The results point to the potential of this technology to be used to grow the patient’s own cells from a minimal skin biopsy, which are then 3D-printed into a graft and applied to the wound.

“We see that the cells survive and it’s clear that they produce different substances that are needed to create new dermis. In addition, blood vessels are formed in the grafts, which is important for the tissue to survive in the body. We find this material very promising,” says Johan Junker.

Blood vessels are key to a variety of applications for engineered tissue-like materials. Scientists can grow cells in three-dimensional materials that can be used to build organoids. But these tissue models face a bottleneck: they lack blood vessels to transport oxygen and nutrients to the cells. This means that there is a limit to how large the structures can get before the cells at the centre die from oxygen and nutrient deficiency.

Step towards lab-grown blood vessels

The LiU researchers may be one step closer to solving the problem of blood vessel supply. In another article, also published in Advanced Healthcare Materials, the researchers describe a method for making threads from materials consisting of 98 per cent water, known as hydrogels.

“The hydrogel threads become quite elastic, so we can tie knots on them. We also show that they can be formed into mini-tubes, which we can pump fluid through or have blood vessel cells grow in,” says Daniel Aili.

The mini-tubes, or perfusable channels as the researchers also call them, open up new possibilities for the development of blood vessels for organoids, for example.

Source: Linköping University

Podcast: Could Infrared Light Have Deeper Biological Effects than Believed?

Light transmission through the hand from an 850nm LED source. Because the tissues are relatively thin compared with the thorax, it was possible to map the spectrum here against known biological absorbers. The images clearly show that deoxygenated blood is a key absorber. Also, bone cannot be seen and hence is relatively transparent at these longer wavelengths. Source: Jeffery et al., Scientific Reports, 2025.

In this podcast, we explore how some infrared wavelengths of sunlight can penetrate the human body – even through clothing – and have a systemic positive impact on physiological functions. It sounds like something out of science fiction, but a recent article published in Scientific Reports has demonstrated this effect in humans.

In this study, exposing the torsos of human participants to 830–860nm infrared light was found to boost mitochondrial function and ATP production. There were notable improvements in vision, despite the eyes being shielded from the infrared light. If infrared light is indeed beneficial, what does this mean for our current way of life, indoors and illuminated by LED lights – which notably lack infrared light?

Less Is More: Low-Dose Olanzapine Curbs Chemo-Induced Nausea Without the Sedation

A recent clinical trial demonstrates 5mg olanzapine’s safety and efficacy for chemotherapy-induced nausea and vomiting

Researchers from Japan found that a 5mg dose of olanzapine, taken after chemotherapy, significantly reduces nausea and vomiting in breast cancer patients, while minimising sedation and cutting costs. This patient-centred approach could reshape global standards for antiemetic care. Credit: Prof Mitsue Saito from Juntendo University, Japan

Chemotherapy-induced nausea and vomiting can severely impact patients’ quality of life and treatment adherence. In a major clinical trial, researchers from Japan tested whether a low, 5mg dose of olanzapine taken at home after chemotherapy could reduce these side effects without causing heavy sedation. The study found that this approach significantly improved outcomes compared to placebo, offering a safer, more affordable strategy that could reshape supportive cancer care, especially in outpatient and resource-limited settings.

Chemotherapy-induced nausea and vomiting are among the most distressing side effects of anti-cancer treatment, particularly for those receiving highly emetogenic regimens such as anthracycline plus cyclophosphamide combinations. This major side effect compromises a patient's quality of life and willingness to continue therapy. Therefore, there is a crucial need to devise an effective antiemetic management approach for optimising cancer care and patient well-being.

Against this backdrop, a new study led by Professor Mitsue Saito and Dr Hirotoshi Iihara from Japan, made available online on June 17, 2025, and published in Volume 26, Issue 7 of the journal The Lancet Oncology on July 1, 2025, examined whether a 5mg dose of olanzapine taken at home after chemotherapy could reduce nausea and vomiting in patients with breast cancer while minimising the sedative effects associated with the standard 10mg dose.

“While multiple studies have examined 10mg of olanzapine and confirmed its effectiveness for nausea control, at this dose it often causes sedation, raising safety concerns,” explains lead author Prof Saito. “Beyond the commonly observed sedation, olanzapine at the 10 mg dose can cause serious adverse effects, including sedative effects such as daytime sleepiness and loss of consciousness.”

“The study was inspired in part by three patients with breast cancer who attended an antiemetic guideline meeting at MASCC 2015 in Copenhagen. They spoke about the burdensome sedative side effects of olanzapine, a concern that helped shape the trial’s patient-centred design,” says Prof Saito.

This phase 3, double-blind, placebo-controlled trial enrolled 500 female patients with breast cancer in Japan receiving outpatient anthracycline plus cyclophosphamide-based chemotherapy. Participants were randomly assigned to receive either olanzapine 5mg or placebo in combination with standard triplet antiemetic therapy (palonosetron, dexamethasone, and an NK-1 receptor antagonist). The olanzapine 5mg was taken at home after chemotherapy to help avoid sedation during hospital travel or treatment.

“This study uniquely investigates the timing of olanzapine 5mg administration, given within 5 hours post-chemotherapy administration and before the evening meal, to reduce sedation during hospital visits and transportation. This approach takes into account the onset of nausea and vomiting reported in previous studies. Among highly emetogenic chemotherapies, there is a significant difference between cisplatin, which usually requires hospitalisation for treatment, and other chemotherapies such as anthracycline-based regimens that are typically administered on an outpatient basis,” says Dr Iihara. The primary endpoint of the study was to investigate the proportion of patients achieving complete response, defined as no vomiting and no rescue medication use during the overall phase (0–120 hours post-anthracycline plus cyclophosphamide initiation).

The results demonstrated significant improvement, with 58.1% of patients in the olanzapine 5mg group achieving a complete response during the first 5 days after chemotherapy, compared to only 35.5% in the placebo group. Benefits also extended to delayed nausea and vomiting across a 7-day observation period.
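From the reported complete-response rates, readers can derive the absolute benefit and an illustrative number needed to treat (NNT); note that the NNT itself is not reported in the article and is computed here purely as a sketch:

```python
# Illustrative calculation (not reported in the article): absolute risk
# difference and number needed to treat (NNT) from the complete-response rates.
import math

cr_olanzapine = 0.581  # complete response, olanzapine 5mg group
cr_placebo = 0.355     # complete response, placebo group

# Absolute benefit: difference in complete-response rates
absolute_benefit = cr_olanzapine - cr_placebo  # ~22.6 percentage points

# NNT: patients who must be treated for one additional complete response,
# conventionally rounded up to the next whole patient
nnt = math.ceil(1 / absolute_benefit)

print(f"Absolute benefit: {absolute_benefit:.1%}")
print(f"NNT: {nnt}")
```

On these figures, roughly one additional patient achieves a complete response for every five treated with olanzapine 5mg.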

While some patients reported drowsiness, the incidence of severe or very severe concentration impairment was low, occurring in 10% of patients in the olanzapine 5mg group vs 14% in the placebo group. Additionally, no major adverse events and no treatment-related deaths occurred in either group.

The olanzapine 5mg dose offers an important financial and clinical advantage over the commonly used 10mg. By reducing side effects and cost, this strategy may make antiemetic treatment more accessible, particularly in lower-resource settings.

These new findings suggest that an olanzapine 5mg regimen, especially when administered after chemotherapy, can be just as effective, with fewer side effects. Although the study focused on Japanese women with breast cancer, the results are expected to influence international practices and future guideline updates.

In addressing both physical and financial toxicity and putting patients’ voices at the centre of the research, this trial represents more than a treatment tweak. It’s a step toward more humane, equitable cancer care.

Source: Juntendo University

Scientists Grow Novel ‘Whole-brain’ Organoid

Image from Pixabay.

Johns Hopkins University researchers have grown a novel whole-brain organoid, complete with neural tissues and rudimentary blood vessels, in an advance that could usher in a new era of research into neuropsychiatric disorders such as autism.

“We’ve made the next generation of brain organoids,” said senior author Annie Kathuria, an assistant professor in JHU’s Department of Biomedical Engineering who studies brain development and neuropsychiatric disorders. “Most brain organoids that you see in papers are one brain region, like the cortex or the hindbrain or midbrain. We’ve grown a rudimentary whole-brain organoid; we call it the multi-region brain organoid (MRBO).”

The research, published in Advanced Science, marks one of the first times scientists have been able to generate an organoid with tissues from each region of the brain connected and acting in concert. Having a human cell-based model of the brain will open possibilities for studying schizophrenia, autism, and other neurological diseases that affect the whole brain – work that typically is conducted in animal models.

To generate a whole-brain organoid, Kathuria and members of her team first grew neural cells from the separate regions of the brain and rudimentary forms of blood vessels in separate lab dishes. The researchers then stuck the individual parts together with sticky proteins that act as a biological superglue and allowed the tissues to form connections. As the tissues began to grow together, they started producing electrical activity and responding as a network.

Much smaller than a real brain – containing 6 million to 7 million neurons compared with the tens of billions in an adult brain – these organoids provide a unique platform on which to study whole-brain development.

The researchers also observed early blood–brain barrier formation, a layer of cells that surrounds the brain and controls which molecules can pass through.

“We need to study models with human cells if you want to understand neurodevelopmental disorders or neuropsychiatric disorders, but I can’t ask a person to let me take a peek at their brain just to study autism,” Kathuria said. “Whole-brain organoids let us watch disorders develop in real time, see if treatments work, and even tailor therapies to individual patients.”

Using whole-brain organoids to test experimental drugs may also help improve the rate of clinical trial success, researchers said. Roughly 85% to 90% of drugs fail during Phase 1 clinical trials. For neuropsychiatric drugs, the fail rate is closer to 96%. This is because scientists predominantly study animal models during the early stages of drug development. Whole-brain organoids more closely resemble the natural development of a human brain and likely will make better test subjects.

“Diseases such as schizophrenia, autism, and Alzheimer’s affect the whole brain, not just one part of the brain. If you can understand what goes wrong early in development, we may be able to find new targets for drug screening,” Kathuria said. “We can test new drugs or treatments on the organoids and determine whether they’re actually having an impact on the organoids.”

Source: Johns Hopkins Medicine

Study Uncovers Large Burden of Potentially Preventable Hospitalisations for Pneumococcal Pneumonias

This illustration depicts a 3D computer-generated image of a group of Gram-positive, Streptococcus pneumoniae bacteria. The artistic recreation was based upon scanning electron microscopic (SEM) imagery. Credit: CDC on Unsplash

In a recent multicentre prospective study conducted at three hospitals in Tennessee and Georgia, including Vanderbilt University Medical Center, researchers at VUMC found a substantial burden of hospitalisations for community-acquired pneumonia (CAP) among adults. 

Community-acquired pneumonia refers to a case of the disease contracted without prior exposure to a healthcare setting, in contrast to hospital-acquired pneumonia (HAP). 

The study, published in JAMA Network Open, included data from 2018 to 2022 and used a novel serotype-specific urinary test that can identify infections caused by 30 different Streptococcus pneumoniae serotypes. A serotype refers to a distinct strain of microorganism, such as bacteria. 

An important aspect of the study was the identification of noninvasive pneumococcal infections, said Carlos Grijalva, MD, MPH, professor of Health Policy and Biomedical Informatics and the study’s lead author. 

“Standard clinical diagnostic methods such as bacterial cultures of blood are helpful for identifying invasive cases of pneumococcal disease, but the majority of pneumococcal pneumonias are thought to be noninvasive,” Grijalva added. “Using a novel and more sensitive urinary antigen detection method allowed us to identify a number of pneumococcal infections that may have otherwise passed unrecognised.” 

Based on current population estimates, some 114 800 U.S. adults may be hospitalised for pneumococcal pneumonia each year, a figure made up in large part by older adults. And according to the study’s findings, each year sees approximately 340 hospitalisations for community-acquired pneumonia per 100 000 adults, approximately 14% of which had evidence of Streptococcus pneumoniae infection. 
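The national estimate can be roughly reconstructed from the reported incidence, assuming a US adult population of about 241 million (an assumed figure used here for illustration; the study's actual population denominator is not stated in the article):

```python
# Back-of-envelope check relating the article's figures.
# Assumed: US adult population of ~241 million (not stated in the article).
adult_population = 241_000_000

cap_per_100k = 340            # CAP hospitalisations per 100 000 adults per year
pneumococcal_fraction = 0.14  # ~14% with evidence of S. pneumoniae infection

# Incidence of pneumococcal pneumonia hospitalisations per 100 000 adults
pneumococcal_per_100k = cap_per_100k * pneumococcal_fraction  # ~47.6

# Scale the rate up to the assumed national adult population
annual_estimate = pneumococcal_per_100k / 100_000 * adult_population

print(round(annual_estimate))
```

The result lands close to the article's figure of some 114 800 hospitalisations per year, confirming that the two numbers are consistent.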

“Our study results show that Streptococcus pneumoniae remains an important cause of severe community-acquired pneumonia,” said Wesley Self, MD, MPH, professor of Emergency Medicine, Senior Vice President for Clinical Research and the paper’s senior author. 

Many of the serotypes identified by pneumococcal detections corresponded with those covered by a recently licensed adult-specific pneumococcal conjugate vaccine, V116, which includes 21 serotypes but was not commercially available during the study period. 

“Vaccines with coverage of additional pneumococcal serotypes could be quite beneficial in lessening the burden of severe pneumonia on the U.S. population, especially among older adults,” added Self, who holds the Directorship in Emergency Care Research. 

Source: Vanderbilt University Medical Center

Healthy Habits, Better Hair: How Lifestyle Choices Impact Hair Restoration

Dr Kashmal Kalan urges patients to prioritise health before surgery – and offers hope to those recovering from illness

Hair loss is often viewed as a cosmetic concern, but emerging clinical insights confirm what many medical professionals have long understood: overall health is one of the most significant contributors to hair loss, and a crucial factor in whether hair restoration procedures succeed.

According to Dr Kashmal Kalan, Medical Director at Alvi Armani South Africa, chronic conditions such as diabetes, hypertension, and high cholesterol are closely linked to diffuse hair thinning, particularly when left undiagnosed or poorly managed.

“These conditions disrupt blood flow, create oxidative stress, and limit nutrient supply to the hair follicles. That directly affects hair growth and viability, especially in patients with a genetic predisposition to balding.”

While pattern baldness is widely understood, lifestyle factors are often overlooked until the condition becomes advanced. “People are often surprised to learn that their smoking, alcohol use, stress levels, or even recreational drug use may be accelerating their hair loss or interfering with their recovery.”

In fact, undisclosed drug use can compromise not only natural regrowth but also post-surgical outcomes. “We’ve seen poor graft uptake and higher complication rates in these cases. That’s why our pre-surgical assessments are so thorough. We need full transparency to ensure patient safety and the best possible results.”

Hair restoration is a medical procedure, not a cosmetic quick fix – and a patient’s internal health matters just as much as surgical precision. At Alvi Armani South Africa, all patients undergo full blood work and health screening before being approved for surgery.

“This is vital not only for safety, but often for diagnosis. Hair loss can sometimes be the first visible symptom of an underlying condition. Through our screenings, we’ve detected cases of unmanaged diabetes, hypertension, and even early autoimmune markers.” 

Even once cleared for surgery, long-term success requires commitment from both doctor and patient. “The patient’s role is just as important as the surgeon’s. They need to maintain their health so the body can heal and support strong, sustainable regrowth.”

In July, Alvi Armani South Africa announced a partnership with the Cancer Association of South Africa (CANSA), offering free consultations and personalised advice to cancer survivors – many of whom face permanent scarring or delayed hair regrowth after treatment.

“Hair loss after cancer goes far deeper than appearance,” he notes. “It impacts confidence, identity, and how survivors re-enter everyday life. The good news for survivors is that minimally invasive Follicular Unit Extraction (FUE) techniques can provide an effective pathway to emotional and physical restoration – but only when the body is ready.”

For those in earlier stages of hair loss, early intervention is key. “If the cause is lifestyle-related, healthier habits can help. If it’s genetic, medications or non-surgical treatments may stabilise the loss, sometimes delaying or even eliminating the need for surgery. But ultimately, it’s simple: healthy hair starts with a healthy body.

“We can deliver technically flawless procedures, but healing still depends on the patient. When people approach hair restoration with the same seriousness as any other medical treatment, the results – and their overall wellbeing – are far better,” concludes Dr Kalan.

New Discovery Reveals the Spinal Cord’s Role in Bladder Control

Urinary incontinence. Credit: Scientific Animations CC4.0

Urinary incontinence is a devastating condition, leading to significant adverse impacts on patients’ mental health and quality of life. Disorders of urination are also a key feature of many neurological disorders.

A USC research team has now made major progress in understanding how the human spinal cord triggers the bladder emptying process. The discovery could lead to exciting new therapies to help patients regain control of this essential function.

In the pioneering study, a team from USC Viterbi School of Engineering and Keck School of Medicine of USC has harnessed functional ultrasound imaging to observe real-time changes in blood flow dynamics in the human spinal cord during bladder filling and emptying.

The work was published in Nature Communications and was led by Charles Liu, the USC Neurorestoration Center director at Keck School of Medicine of USC and professor of biomedical engineering at USC Viterbi, and Vasileios Christopoulos, assistant professor at the Alfred E. Mann Department of Biomedical Engineering.

The spinal cord regulates many essential human functions, including autonomic processes like bladder, bowel, and sexual function. These processes can break down when the spinal cord is damaged or degenerated due to injury, disease, stroke, or aging. However, the spinal cord’s small size and intricate bony enclosure have made it notoriously challenging to study directly in humans.

Unlike in the brain, routine clinical care does not involve invasive electrodes and biopsies in the spinal cord due to the obvious risks of paralysis.

Furthermore, fMRI, which accounts for most human functional neuroimaging, is not practically feasible for the spinal cord, especially in the thoracic and lumbar regions where much of the critical function localises.

“The spinal cord is a very undiscovered area,” Christopoulos said. “It’s very surprising to me because when I started doing neuroscience, everybody was talking about the brain. And Dr. Liu and I asked, ‘What about the spinal cord?’

“For many, it was just a cable that transfers information from the brain to the peripheral system. The truth was that we didn’t know how to go there—how to study the spinal cord in action, visualize its dynamics and truly grasp its role in physiological functions.”

Functional ultrasound imaging: A new window into the spinal cord

To overcome these barriers, the USC team employed functional ultrasound imaging (fUSI), an emerging neuroimaging technology that is minimally invasive. The fUSI process allowed the team to measure where changes in blood volume occur on the spinal cord during the cycle of urination.

However, fUSI requires a “window” through the bone to image the spinal cord. The researchers found a unique opportunity by working with a group of patients undergoing standard-of-care epidural spinal cord stimulation surgery for chronic low back pain.

“During the implantation of the spinal cord stimulator, the window we create in the bone through which we insert the leads gives us a perfect and safe opportunity to image the spinal cord using fUSI with no risk or discomfort to the study volunteers,” said co-first author Darrin Lee, associate director of the USC Neurorestoration Center, who performed the surgeries.

“While the surgical team was preparing the stimulator, we gently filled and emptied the bladder with saline to simulate a full urination cycle under anaesthesia while the research team gathered the fUSI data,” added Evgeniy Kreydin from the Rancho Los Amigos National Rehabilitation Center and the USC Institute of Urology, who was already working closely with Liu to study the brain of stroke patients during micturition using fMRI.

“This is the first study where we’ve shown that there are areas in the spinal cord where activity is correlated with the pressure inside the bladder,” Christopoulos said.

“Nobody had ever shown a network in the spinal cord correlated with bladder pressure. What this means is I can look at the activity of your spinal cord in these specific areas and tell you your stage of the bladder cycle – how full your bladder is and whether you’re about to urinate.”

Christopoulos said the experiments identified that some spinal cord regions showed positive correlation, meaning their activity increased as bladder pressure rose, while others showed negative (anti-correlation), with activity decreasing as pressure increased. This suggests the involvement of both excitatory and inhibitory spinal cord networks in bladder control.

“It was extremely exciting to take data straight from the fUSI scanner in the OR to the lab, where advanced data science techniques quickly revealed results that have never been seen before, even in animal models, let alone in humans,” said co-first author Kofi Agyeman, a biomedical engineering postdoc.

New hope for patients

Liu has worked for two decades at the intersection of engineering and medicine to develop transformative strategies to restore function to the nervous system. Christopoulos has spent much of his research career developing neuromodulation techniques to help patients regain motor control.

Together, they noted that for patients, retaining control of the autonomic processes that many of us take for granted is more fundamental than even walking.

“If you ask these patients, the most important function they wanted to restore was not their motor or sensory function. It was things like sexual function and bowel and bladder control,” Christopoulos said, noting that urinary dysfunction often leads to poor mental health. “It’s a very dehumanising problem to deal with.”

Worse still, urinary incontinence leads to more frequent urinary tract infections (UTIs) because patients must often be fitted with a catheter. Due to limited sensory function, they may not be able to feel that they have an infection until it is more severe and has spread to the kidneys, resulting in hospitalisation.

This study offers a tangible path toward addressing this critical need for patients suffering from neurogenic lower urinary tract dysfunction. The ability to decode bladder pressure from spinal cord activity provides proof-of-concept for developing personalised spinal cord interfaces that could warn patients about their bladder state, helping them regain control.

Currently, almost all neuromodulation strategies for disorders of micturition are focused on the lower urinary tract, largely because the neural basis of this critical process remains unclear.

“One has to understand a process before one can rationally improve it,” Liu said.

This latest research marks a significant step forward, opening new avenues for precision medicine interventions that combine invasive and noninvasive neuromodulation with pharmacological therapeutics to make neurorestoration of the genitourinary system a clinical reality for millions worldwide.

Source: University of Southern California

Researchers Debunk Concerns over Common Flu Antiviral in Children

Photo by Andrea Piacquadio on Unsplash

For decades, medical professionals debated whether a common antiviral medication used to treat flu in children caused neuropsychiatric events or if the infection itself was the culprit.

Now researchers at Monroe Carell Jr. Children’s Hospital at Vanderbilt have debunked a long-standing theory about oseltamivir, known as Tamiflu.

According to the study, published in JAMA Neurology, oseltamivir treatment during flu episodes was associated with a reduced risk of serious neuropsychiatric events, such as seizures, altered mental status and hallucination.

“Our findings demonstrated what many pediatricians have long suspected, that the flu, not the flu treatment, is associated with neuropsychiatric events,” said principal investigator James Antoon, MD, PhD, MPH, assistant professor of Pediatrics in the Division of Pediatric Hospital Medicine at Monroe Carell. “In fact, oseltamivir treatment seems to prevent neuropsychiatric events rather than cause them.”

Key points:

  • Influenza itself was associated with an increase in neuropsychiatric events compared to children with no influenza, regardless of oseltamivir use.
  • Among children with influenza, those treated with oseltamivir had about a 50% reduction in neuropsychiatric events.
  • Among children without influenza, those who were treated with oseltamivir prophylactically had the same rate of events as the baseline group with no influenza.

“Taken together, these three findings do not support the theory that oseltamivir increases the risk of neuropsychiatric events,” said Antoon. “It’s the influenza.”

The team reviewed the de-identified data from a cohort of children and adolescents ages 5-17 who were enrolled in Tennessee Medicaid between July 1, 2016, and June 30, 2020.

During the four-year period, 692 295 children, with a median age of 11 years, were included in the study cohort. During follow-up, study children experienced 1230 serious neuropsychiatric events (898 neurologic and 332 psychiatric).

The clinical outcomes definition included both neurologic (seizures, encephalitis, altered mental status, ataxia/movement disorders, vision changes, dizziness, headache, sleeping disorders) and psychiatric (suicidal or self-harm behaviours, mood disorders, psychosis/hallucination) events.

“The 2024-2025 influenza season highlighted the severity of influenza-associated neurologic complications, with many centres reporting increased frequency and severity of neurologic events during the most recent season,” said Antoon. “It is important for patients and families to know the true risk-benefit profile of flu treatments, such as oseltamivir, that are recommended by the American Academy of Pediatrics.”

“These flu treatments are safe and effective, especially when used early in the course of clinical disease,” added senior author Carlos Grijalva, MD, MPH, professor of Health Policy and Biomedical Informatics at Vanderbilt University Medical Center.

Investigators hope the findings will provide reassurance to both caregivers and medical professionals about the safety of oseltamivir and its role in preventing flu-associated complications.

Source: Vanderbilt University Medical Center

The Type 2 Diabetes Risk from Potatoes May Hinge on Their Preparation

Photo by Mitchell Luo on Unsplash

French fries were associated with an increased risk of developing type 2 diabetes (T2D), while other forms of potatoes – including baked, boiled, and mashed – were not, according to a new study led by Harvard T.H. Chan School of Public Health. The study also found that swapping any form of potato for whole grains may lower the risk of T2D.

The study was published July 30 in the BMJ. [Regrettably, no mention is made of SA’s beloved slap tjips – Ed.]

According to the researchers, while previous studies hinted at a link between potatoes and T2D, the evidence was inconsistent and often lacked detail on cooking methods and the potential effects of substituting other foods for potatoes. “Our study offers deeper, more comprehensive insights by looking at different types of potatoes, tracking diet over decades, and exploring the effects of swapping potatoes for other foods,” said lead author Seyed Mohammad Mousavi, postdoctoral research fellow in the Department of Nutrition. “We’re shifting the conversation from, ‘Are potatoes good or bad?’ to a more nuanced—and useful—question: How are they prepared, and what might we eat instead?”

The researchers examined the diets and diabetes outcomes of 205,107 men and women enrolled in the Nurses’ Health Study, Nurses’ Health Study II, and Health Professionals Follow-up Study. For more than 30 years, participants regularly responded to dietary questionnaires, detailing the frequency with which they consumed certain foods, including French fries; baked, boiled, or mashed potatoes; and whole grains. They also reported on new health diagnoses, including T2D, and various other health, lifestyle, and demographic factors, which the researchers controlled for. Over the course of the study period, 22,299 participants reported that they developed T2D.

The study found that three weekly servings of French fries were associated with a 20% higher risk of developing T2D. Baked, boiled, and mashed potatoes were not significantly associated with T2D risk. The researchers calculated, however, that eating whole grains – such as whole grain pasta, bread, or farro – in place of baked, boiled, or mashed potatoes could reduce the risk of T2D by 4%. Replacing French fries with whole grains could bring T2D risk down by 19%. Even swapping refined grains for French fries was estimated to lower T2D risk.

The researchers complemented their study with a novel meta-analytic approach to estimate how swapping potatoes for whole grains could affect the risk of T2D, using data from previously published cohort studies. This involved two separate meta-analyses: one based on data from 13 cohorts examining potato intake and the other from 11 cohorts on whole grain intake, each encompassing over 500,000 participants and 43,000 T2D diagnoses across four continents. The results were closely consistent with those of the new study.

“The public health message here is simple and powerful: Small changes in our daily diet can have an important impact on risk of type 2 diabetes. Limiting potatoes – especially limiting French fries – and choosing healthy, whole grain sources of carbohydrate could help lower the risk of type 2 diabetes across the population,” said corresponding author Walter Willett, professor of epidemiology and nutrition. “For policymakers, our findings highlight the need to move beyond broad food categories and pay closer attention to how foods are prepared and what they’re replacing. Not all carbs—or even all potatoes—are created equal, and that distinction is crucial when it comes to shaping effective dietary guidelines.”

Source: Harvard T.H. Chan School of Public Health