New Study Reveals Promising Drug Target for Osteoporosis Treatment

Photo by Mehmet Turgut Kirkgoz on Unsplash

In a recent study published in the Journal of Cellular Physiology, researchers from Tokyo University of Science discovered a new target for the treatment of osteoporosis, a condition responsible for 8.9 million fractures globally each year. They focused on improving a common bone-strengthening drug, teriparatide, which also tends to increase bone resorption. By targeting a newly identified gene, they were able to suppress teriparatide’s bone-resorbing effect.

Induction of parathyroid hormone (PTH) signalling with teriparatide, a synthetic PTH-derived peptide, has demonstrated strong bone-promoting effects in patients with osteoporosis. These effects are mediated by osteogenesis, the process of bone formation involving the differentiation and maturation of bone-forming cells called osteoblasts. However, PTH induction is also associated with the differentiation of macrophages into osteoclasts, which resorb bone. Although bone remodelling by osteoblasts and osteoclasts is crucial for maintaining skeletal health, PTH-induced osteoclast differentiation can decrease treatment efficacy in patients with osteoporosis. The precise molecular mechanisms underlying this dual action of PTH signalling in bone remodelling are not well understood.

To bridge this gap, Professor Tadayoshi Hayata and Ms Chisato Sampei from Tokyo University of Science, along with their colleagues, conducted a series of experiments to identify druggable target genes downstream of PTH signalling in osteoblasts. Explaining the rationale behind their study, corresponding author Prof. Hayata says, “In Japan, it is estimated that 12.8 million people, or one in ten people, suffer from osteoporosis, which can significantly deteriorate their quality of life. Teriparatide is classified as a drug that promotes bone formation, but it also promotes bone resorption, which may limit bone formation. However, the full scope of its pharmacological action remains unknown.”

The researchers treated cultured mouse osteoblast cells and live mice with teriparatide. They then assessed gene expression changes induced by PTH in both the cultured cells and bone cells isolated from the femurs of the treated animals, using advanced RNA-sequencing analysis. Among several upregulated genes, they identified a novel PTH-induced gene, Gprc5a, which encodes an orphan G protein-coupled receptor that has previously been explored as a therapeutic target. However, its precise role in osteoblast differentiation had not been fully understood.

PTH induction is known to activate the cyclic adenosine monophosphate (cAMP) and protein kinase C (PKC) signalling pathways. Interestingly, the team found that, like PTH induction, activation of cAMP and PKC signalling also upregulated Gprc5a, albeit to a lesser extent, underscoring the potential involvement of other molecular pathways. Notably, upregulation of Gprc5a was suppressed upon inhibition of transcription but remained unaffected when protein synthesis was blocked, suggesting that Gprc5a is transcribed early in response to PTH signalling and serves as a direct target gene.

Furthermore, the researchers examined the effect of Gprc5a downregulation on osteoblast proliferation and differentiation. Notably, while PTH induction alone did not affect cell proliferation, Gprc5a knockdown resulted in an increase in the expression of cell-cycle-related genes and osteoblast differentiation markers. These findings suggest that Gprc5a suppresses osteoblast proliferation and differentiation.

Diving deeper into the molecular mechanisms underlying the effects of Gprc5a in PTH-induced osteogenesis, the researchers identified Activin receptor-like kinase 3 (ALK3), a bone morphogenetic protein (BMP) signalling pathway receptor, as an interacting partner of Gprc5a. In line with their speculation, overexpression of Gprc5a indeed led to suppression of BMP signalling via receptors including ALK3.

Overall, these findings reveal that Gprc5a, a novel inducible target gene of PTH, negatively regulates osteoblast proliferation and differentiation, in part by suppressing BMP signalling. Gprc5a can thus be pursued as a novel therapeutic target when devising treatments for osteoporosis. The study sheds light on the complex process of bone remodelling and helps explain the dual bone-promoting and bone-resorbing effects of PTH signalling.

“Our study shows Gprc5a may function as a negative feedback factor for the bone formation-promoting effect of teriparatide. Suppressing Gprc5a function may, therefore, increase the effectiveness of teriparatide in non-responding patients. In the future, we hope that our research will lead to improved quality of life and healthy longevity for people suffering from osteoporosis,” concludes Prof. Hayata.

Source: Tokyo University of Science

How Stress Saps Cognitive Reserves, Increasing Dementia Risk

Photo by Alex Green on Pexels

While mentally stimulating activities and life experiences can improve cognition in memory clinic patients, stress undermines this beneficial relationship. This is according to a new study published in Alzheimer’s & Dementia: The Journal of the Alzheimer’s Association.

Researchers in the late 1980s found that some individuals who showed no apparent symptoms of dementia during their lifetime had brain changes consistent with an advanced stage of Alzheimer’s disease. 

It has since been postulated that so-called cognitive reserve might account for this differential protective effect in individuals. 

Cognitively stimulating and enriching life experiences and behaviours such as higher educational attainment, complex jobs, continued physical and leisure activities, and healthy social interactions help build cognitive reserve. 

Increased risk of dementia

However, high or persistent stress levels are associated with reduced social interactions, impaired ability to engage in leisure and physical activities, and an increased risk of dementia.  

Researchers from Karolinska Institutet have now examined the association between cognitive reserve, cognition, and biomarkers for Alzheimer’s disease in 113 participants from the memory clinic at the Karolinska University Hospital, Huddinge, Sweden. 

They also examined how this association is modified by physiological stress (cortisol levels in saliva) and psychological (perceived) stress. 

Greater cognitive reserve was found to improve cognition, but interestingly, physiological stress appeared to weaken the association.  

“These results might have clinical implications as an expanding body of research suggests that mindfulness exercises and meditation may reduce cortisol levels and improve cognition,” says the study’s lead author Manasa Shanta Yerramalla, researcher at the Department of Neurobiology, Care Sciences and Society. “Different stress management strategies could be a good complement to existing lifestyle interventions in Alzheimer’s prevention.” 

The relatively small sample of participants limits how robust the conclusions can be, but the results are generalisable to similar patient groups.

Link between sleep and cognition 

Moreover, since stress disrupts sleep, which in turn disrupts cognition, the researchers controlled for sleeping medications; they did not, however, consider other aspects of sleep that might impair cognition. 

“We will continue to study the association between stress and sleeping disorders and how it affects the cognitive reserve in memory clinic patients,” says Dr Yerramalla. 

Source: Karolinska Institutet

First Menstrual Periods are Arriving Earlier for Younger Generations

Photo by Marta Branco

The average age at menarche, the first menstrual period, has been decreasing among younger generations in the US, especially those belonging to racial minorities and lower socioeconomic statuses, according to a new study led by researchers at Harvard T.H. Chan School of Public Health. It also found that the average time it takes for the menstrual cycle to become regular is increasing.

The study, published in JAMA Network Open, is the latest publication from the Apple Women’s Health Study, a longitudinal study of menstrual cycles, gynaecological conditions, and overall women’s health conducted by Harvard Chan School, the National Institute of Environmental Health Sciences, and Apple.

“Our findings can lead to a better understanding of menstrual health across the lifespan and how our lived environment impacts this critical vital sign,” said co-principal investigator Shruthi Mahalingaiah, assistant professor of environmental, reproductive, and women’s health at Harvard Chan School.

While previous studies have shown trends towards earlier menarche over the past five decades, data has been limited on how these trends present within different racial groups and socioeconomic statuses. Additionally, few studies have had sufficient data to identify any trends regarding time to menstrual cycle regularity.

The researchers used the Apple Women’s Health Study’s large, diverse dataset to fill this research gap. The 71 341 participants, who enrolled between November 2018 and March 2023, self-reported the age at which they first began menstruating, as well as their race and socioeconomic status. The researchers divided the participants into five brackets by birth year: 1950–1969, 1970–1979, 1980–1989, 1990–1999, and 2000–2005. Age at menarche was defined as early (younger than 11 years), very early (younger than 9), or late (16 and above). A subset of participants (61 932) self-reported the time it took for their menstrual cycle to become regular and were divided into five categories: up to two years, between three and four years, longer than five years, had not become regular, or became regular only with the use of hormones. Another subset (9865) provided their body mass index (BMI) at their age of menarche.

The study found that as birth year increased (meaning younger participants), average age at menarche decreased and time from menarche to menstrual cycle regularity increased. Among participants born from 1950–1969, the average age at menarche was 12.5 years, and the rates of early and very early menarche were 8.6% and 0.6%, respectively. Among participants born from 2000–2005, the average age of menarche was 11.9 years, and the rates of early and very early menarche were 15.5% and 1.4%, respectively. Across the two groups, the percentage of participants who reached menstrual cycle regularity within two years of menarche decreased from 76% to 56%. The researchers observed that these trends were present among all sociodemographic groups but were most pronounced among the participants who identified as Black, Hispanic, Asian, or mixed race, and who rated themselves as belonging to a low socioeconomic status.

The findings showed that BMI at age of menarche could explain part of the trend toward periods starting earlier. Other possible factors that might explain the trend include dietary patterns, psychological stress and adverse childhood experiences, and environmental factors such as endocrine-disrupting chemicals and air pollution.

“Continuing to investigate early menarche and its drivers is critical,” said corresponding author Zifan Wang, postdoctoral research fellow in Harvard Chan School’s Department of Environmental Health. “Early menarche is associated with higher risk of adverse health outcomes, such as cardiovascular disease and cancer. To address these health concerns – which our findings suggest may begin to impact more people, with disproportionate impact on already disadvantaged populations – we need much more investment in menstrual health research.”

The authors noted some limitations to the study, including that it relies heavily on retrospective self-reporting.

Source: Harvard T.H. Chan School of Public Health

Women’s Mental Agility is Better During Their Periods

Photo by Ashley Williams

New research involving female football players has shown that they react more quickly and accurately during their periods, despite them feeling that they perform worse. The study, published in Neuropsychologia, is the first to assess sport-related cognition during the menstrual cycle and is part of a larger research project supported by the FIFA Research Scholarship.

The findings, from University College London, act as a proof-of-principle that specific types of cognition fluctuate throughout the menstrual cycle, which could have implications for injury and other aspects of women’s health.

Previous sports medicine research has shown that women seem to be at greater risk of sport-related injury during the luteal phase, the time between ovulation and menstruation. This is possibly related to the significant hormonal changes that occur throughout the menstrual cycle, but precisely how these changes are linked to an increased likelihood of injury is unknown at present.

In this study, researchers at UCL and ISEH collected reaction time and error data from 241 participants who completed a battery of cognitive tests 14 days apart. Participants also completed a mood scale and a symptom questionnaire twice. Period-tracking apps were used to estimate which phase of their cycle the participants were in when they took the tests.

The tests were designed to mimic mental processes that are typical in team sports. In one test, participants were shown smiling or winking faces and asked to press the space bar only when they saw a smiley face, to test inhibition, attention, reaction time and accuracy. In another, they were asked to identify mirror images in a 3D rotation task, which assesses spatial cognition. A task that asked them to click when two moving balls collide on screen measured spatial timing.

Though participants reported feeling worse during menstruation and perceived that this negatively impacted their performance, their reaction times were faster and they made fewer errors. For example, their timing was on average 10 milliseconds (12%) more accurate in the moving balls task, and they pressed the space bar at the wrong time 25% less often in the inhibition task.

Participants’ reaction times were slower during the luteal phase, which begins after ovulation and lasts 12–14 days, up to the beginning of menstruation. They were on average 10–20 milliseconds slower than in any other phase, but their error rate was unchanged.

Dr Flaminia Ronca, first author of the study from UCL Division of Surgery and Interventional Science and ISEH, said: “Research suggests that female athletes are more likely to sustain certain types of sports injuries during the luteal phase and the assumption has been that this is due to biomechanical changes as a result of hormonal variation. But I wasn’t convinced that physical changes alone could explain this association.

“Given that progesterone has an inhibitory effect on the cerebral cortex and oestrogen stimulates it, making us react slower or faster, we wondered if injuries could be a result of a change in athletes’ timing of movements throughout the cycle.

“What is surprising is that the participants’ performance was better when they were on their period, which challenges what women, and perhaps society more generally, assume about their abilities at this particular time of the month.

“I hope that this will provide the basis for positive conversations between coaches and athletes about perceptions and performance: how we feel doesn’t always reflect how we perform.”

To put the findings in context, the authors say the fluctuation in timing could be the difference between an injury or not. Previous research has shown that a variation of just 10 milliseconds can mean the difference between a concussion and a lesser injury, for example. In the colliding balls task, participants’ timing was on average 12 milliseconds slower during the luteal phase compared to every other phase, a difference of 16%.

Dr Megan Lowery, an author of the study from UCL Surgery & Interventional Science and ISEH, said: “There’s lots of anecdotal evidence from women that they might feel clumsy just before ovulation, for example, which is supported by our findings here. My hope is that if women understand how their brains and bodies change during the month, it will help them to adapt.

“Though there’s a lot more research needed in this area, these findings are an important first step towards understanding how women’s cognition affects their athletic performance at different points during their cycle, which will hopefully facilitate positive conversations between coaches and athletes around performance and wellbeing.”

Professor Paul Burgess, senior author of the study from UCL’s Institute of Cognitive Neuroscience, said: “This study emerged from listening carefully to female soccer players and their coaches. We created bespoke cognitive tests to try to mimic the demands made upon the brain at the points in the game where they were telling us that injuries and problems of timing occur at certain times of the menstrual cycle.

“As suggested by what the soccer players had told us, the data suggested that women who menstruate – whether they are athletes or not – do tend to vary in their performance at certain stages of the cycle. As a neuroscientist, I am amazed that we don’t already know more about this, and hope that our study will help motivate increasing interest in this vital aspect of sports medicine.”

Source: University College London

Scarring after Spinal Cord Injury is More Complex than Previously Thought

Fibrotic scar 14 days after spinal cord injury; red: Col1a1+ perivascular fibroblast-derived cells. Photo: Daniel Holl

New research has found that scar formation after spinal cord injuries is more complex than previously thought. Scientists at Karolinska Institutet have identified two types of perivascular cells as key contributors to the scar tissue that hinders nerve regeneration and functional recovery. These findings, published in Nature Neuroscience, are also relevant for other brain and spinal cord injuries and could lead to targeted therapies for reducing scarring and improving outcomes.

The central nervous system (CNS) has very limited healing abilities. Injuries or autoimmune diseases like multiple sclerosis often lead to permanent functional deficits. 

Regardless of the injury’s cause, the body responds by forming a boundary around the damaged tissue, which eventually becomes permanent scar tissue. 

Two contributing cell types

While scar tissue seals the damaged area, it also prevents functional repair. After spinal cord injuries, scar tissue blocks the regeneration of nerve fibers that connect the brain with the body, resulting in paralysis after severe injuries.

The research team led by Christian Göritz at Karolinska Institutet has made significant progress in understanding how scar tissue forms in the CNS. The group now identified two distinct types of perivascular cells, which line different parts of blood vessels, as the major contributors to fibrotic scar tissue after spinal cord injury. Depending on the lesion’s location, the two identified cell types contribute differently.

“We found that damage to the spinal cord activates perivascular cells close to the damaged area and induces the generation of myofibroblasts, which consequently form persistent scar tissue,” explains first author Daniel Holl, researcher at the Department of Cell and Molecular Biology.

By examining the process of scar formation in detail, the researchers hope to identify specific therapeutic targets to control fibrotic scarring.

Source: Karolinska Institutet

New Ultrasound and Genetics Combination Precisely Targets Neurons in Diseased Regions

McKelvey School of Engineering researchers have developed a noninvasive technology combining a holographic acoustic device with genetic engineering that allows them to precisely target affected neurons in the brain, creating the potential to precisely modulate selected cell types in multiple diseased brain regions. (Credit: Yaoheng Yang)

Brain diseases such as Parkinson’s disease involve damage in more than one region of the brain, requiring technology that can precisely and flexibly address all affected regions simultaneously. Researchers have developed a noninvasive technology combining a holographic acoustic device with genetic engineering that allows them to target affected neurons in the brain, creating the potential to precisely modulate selected cell types in multiple diseased brain regions.

Hong Chen, associate professor of biomedical engineering and neurosurgery at Washington University in St. Louis, and her team created AhSonogenetics, or Airy-beam holographic sonogenetics, a technique that uses a noninvasive wearable ultrasound device to alter genetically selected neurons in the brains of mice. Results of the proof-of-concept study were published in Proceedings of the National Academy of Sciences.

AhSonogenetics brings together several of Chen’s group’s recent advances into one technology. In 2021, she and her team launched Sonogenetics, a method that uses focused ultrasound to deliver a viral construct containing ultrasound-sensitive ion channels to genetically selected neurons in the brain. They use low-intensity focused ultrasound to deliver a small burst of warmth, which opens the ion channels and activates the neurons. Chen’s team was the first to show that sonogenetics could modulate the behaviour of freely moving mice.

In 2022, she and members of her lab designed and 3D-printed a flexible and versatile tool known as an Airy beam-enabled binary acoustic metasurface, which allowed them to manipulate ultrasound beams. She is also developing Sonogenetics 2.0, which combines the advantages of ultrasound and genetic engineering to modulate defined neurons noninvasively and precisely in the brains of humans and animals. AhSonogenetics brings these advances together as a potential method to intervene in neurodegenerative diseases.

“By enabling precise and flexible cell-type-specific neuromodulation without invasive procedures, AhSonogenetics provides a powerful tool for investigating intact neural circuits and offers promising interventions for neurological disorders,” Chen said. 

Sonogenetics gives researchers a way to precisely control brain activity, while Airy-beam technology allows them to bend or steer sound waves to generate arbitrary beam patterns inside the brain with high spatial resolution. Yaoheng (Mack) Yang, a postdoctoral research associate who earned a doctorate in biomedical engineering from McKelvey Engineering in 2022, said the technology gives the researchers three unique advantages.

“Airy beam is the technology that can give us precise targeting of a smaller region than conventional technology, the flexibility to steer to the targeted brain regions, and to target multiple brain regions simultaneously,” Yang said.

Chen and her team, including first authors Zhongtao Hu, a former postdoctoral research associate, and Yang, designed each Airy-beam metasurface individually as the foundation for wearable ultrasound devices that were tailored for different applications and for precise locations in the brain.

Chen’s team tested the technique on a mouse model of Parkinson’s disease. With AhSonogenetics, they were able to stimulate two brain regions simultaneously in a single mouse, eliminating the need for multiple implants or interventions. This stimulation alleviated Parkinson’s-related motor deficits in the mouse model, including slow movements, difficulty walking and freezing behaviours.

The team’s Airy-beam device overcomes some of the limits of sonogenetics, including tailoring the design of the device to target specific brain locations, as well as incorporating the flexibility to adjust target locations in a single brain.

Hu said the device, which costs roughly $50 to make, can be tailored in size to fit various brain sizes, expanding its potential applications. 

“This technology can be used as a research platform to speed neuroscience research because of the capability to flexibly target different brain regions,” Hu said. “The affordability and ease of fabrication lower the barriers to the widespread adoption of our proposed devices by the research community for neuromodulation applications.”

Source: Washington University in St. Louis

Wood May Have Natural Antiviral Properties

Photo by National Cancer Institute on Unsplash

Thinking about getting a new desk for your practice? That might be a good idea. Viruses, including SARS-CoV-2, can get passed from person to person via contaminated surfaces. But can some surfaces reduce the risk of this type of transmission without the help of household disinfectants? As reported in ACS Applied Materials & Interfaces, wood has natural antiviral properties that can reduce the time viruses persist on its surface – and some species of wood are more effective than others at reducing infectivity.

Enveloped viruses, like the coronavirus, can live up to five days on surfaces; nonenveloped viruses, including enteroviruses linked to the common cold, can live for weeks, in some cases even if the surfaces are disinfected. Previous studies have shown that wood has antibacterial and antifungal properties, making it an ideal material for cutting boards. But wood’s ability to inactivate viruses has yet to be explored, which is what Varpu Marjomäki and colleagues set out to study.

The researchers looked at how long enveloped and nonenveloped viruses remained infectious on the surface of six types of wood: Scots pine, silver birch, gray alder, eucalyptus, pedunculate oak and Norway spruce. To determine viral activity, they flushed a wood sample’s surface with a liquid solution at different time points and then placed that solution in a petri dish that contained cultured cells. After incubating the cells with the solution, they measured the number (if any) infected with the virus.

Results from their demonstrations with an enveloped coronavirus showed that pine, spruce, birch and alder needed one hour to completely eliminate the virus’s ability to infect cells, with eucalyptus and oak needing two hours. Pine had the fastest onset of antiviral activity, beginning after five minutes. Spruce came in second, showing a sharp drop in infectivity after 10 minutes.

For a nonenveloped enterovirus, the researchers found that incubation on oak and spruce surfaces resulted in a loss of infectivity within about an hour, with antiviral activity beginning after 7.5 minutes on oak and after 60 minutes on spruce. Pine, birch and eucalyptus reduced the virus’s infectivity only after four hours, and alder showed no antiviral effect.

Based on their study data, the researchers concluded that the chemical composition of a wood’s surface is primarily responsible for its antiviral functionality. While determining the exact chemical mechanisms responsible for viral inactivation will require further study, they say these findings point to wood as a promising potential candidate for sustainable, natural antiviral materials.

Source: American Chemical Society

Popular OTC Supplement Improves Walking in Peripheral Artery Disease

Photo by Miikka Luotio on Unsplash

The over-the-counter supplement nicotinamide riboside, a form of vitamin B3, increased the walking endurance of patients with peripheral artery disease, a chronic leg condition for which there are few effective treatments. 

In a preliminary, randomised, double-blind clinical trial led by Northwestern University and University of Florida scientists, patients who took nicotinamide riboside daily for six months increased their timed walking distance by more than 17.3m, compared to a placebo group. As expected, walking speed declined in the placebo group, because peripheral artery disease causes progressive declines in walking performance. 

“This is a signal that nicotinamide riboside could help these patients,” said Christiaan Leeuwenburgh, PhD, a UF professor of physiology and aging and senior author of the clinical trial report. “We are hoping to conduct a larger follow-up trial to verify our findings.”

Along with other researchers, Leeuwenburgh, whose research specialises in anti-aging treatments, collaborated with Mary M. McDermott, MD, a physician and professor of medicine at Northwestern University and an expert in peripheral artery disease.

The scientists recruited 90 people with an average age of 71 who had peripheral artery disease, or PAD, to test the effects of nicotinamide riboside. The supplement is increasingly popular as an anti-aging treatment (sales exceeded $60 million in 2022 in the US alone) but there has been scant evidence of any benefit in healthy people. Nicotinamide riboside is a precursor for the essential compound NAD, which plays roles in the body related to energy generation, improved blood flow and DNA repair.

Because PAD is associated with problems generating energy within muscle cells, McDermott and Leeuwenburgh thought that nicotinamide riboside, by improving energy generation, could help improve walking in people with the disease.

And indeed that’s what they found. Participants taking the supplement walked an average of 7m more in a six-minute walking test after six months, while those taking a placebo walked 10.3m less. Those who took at least 75% of the pills they were supposed to take performed even better, adding more than 30m to their walking distance, compared to people who took a placebo.

(The researchers also tested if resveratrol, a compound best known for being in red wine, could boost the effects of nicotinamide riboside; they found no additional benefits.)

PAD affects more than 8.5 million Americans over the age of 40. Caused by the buildup of fatty deposits in arteries, and associated with diabetes and smoking, the disease reduces blood flow to the limbs, especially the legs. Walking often becomes painful, and the disease typically causes declines in walking ability over time. Supervised walking exercise is the first-line therapy for PAD, but most people with the condition do not have access to supervised exercise.

In addition to a larger trial focused on patients suffering from PAD, Leeuwenburgh hopes to test the effects of nicotinamide riboside on walking performance in healthy older adults. 

“We need to test it on a healthy older population before we recommend healthy people take it,” he said.

Source: University of Florida

Genetic Study of Coffee’s Mental Health Links has Contradictory Results

Photo by Mike Kenneally on Unsplash

Coffee drinking is a heritable habit, and one that carries a certain amount of genetic baggage. Caffeinated coffee is a psychoactive substance, notes Sandra Sanchez-Roige, PhD, an associate professor at University of California San Diego. She is the corresponding author of a study published in the journal Neuropsychopharmacology that compared coffee-consumption characteristics from a 23andMe database in the United States with the UK Biobank.

Lead author Hayley H. A. Thorpe, PhD, at Western University in Ontario, explained that the team collected genetic data as well as self-reported coffee-consumption numbers to assemble a genome-wide association study (GWAS). The idea was to make connections between the genes that were known to be associated with coffee consumption and the traits or conditions related to health.

“We used this data to identify regions on the genome associated with whether somebody is more or less likely to consume coffee,” Thorpe explained. “And then identify the genes and biology that could underlie coffee intake.”

UC San Diego professor Abraham Palmer, PhD is also a lead researcher on the paper. He said that most people are surprised that there is a genetic influence on coffee consumption. “We had good reason to suspect from earlier papers that there were genes that influence how much coffee someone consumes,” he said. “And so, we weren’t surprised to find that in both of the cohorts we examined there was statistical evidence that this is a heritable trait. In other words, the particular gene variants that you inherit from your parents influence how much coffee you’re likely to consume.”

Sanchez-Roige said the genetic influence on coffee consumption was the first of two questions the researchers wanted to address.

“The second is something that coffee lovers are really keen on learning,” Sanchez-Roige said. “Is drinking coffee good or bad? Is it associated with positive health outcomes or not?”

The answer is not definitive. The group’s genome-wide association study of 130 153 U.S.-based 23andMe research participants was compared with a similar UK Biobank database of 334 649 Britons, revealing consistent positive genetic associations between coffee and harmful health outcomes such as obesity and substance use. A positive genetic association is a connection between a specific gene variant (the genotype) and a specific condition (the phenotype). Conversely, a negative genetic association is an apparent protective quality discouraging the development of a condition. The findings get more complicated when it comes to psychiatric conditions.

“Look at the genetics of anxiety, for instance, or bipolar and depression: In the 23andMe data set, they tend to be positively genetically correlated with coffee intake genetics,” Thorpe said. “But then, in the UK Biobank, you see the opposite pattern, where they’re negatively genetically correlated. This is not what we expected.”

She said there were other instances in which the 23andMe set didn’t align with the UK Biobank, but the greatest disagreement was in psychiatric conditions.

“It’s common to combine similar datasets in this field to increase study power. This information paints a fairly clear picture that combining these two datasets was really not a wise idea. And we didn’t end up doing that,” Thorpe said. She explained that melding the databases might mask effects, with opposing associations cancelling each other out and leading researchers toward incorrect conclusions.

Sanchez-Roige said the researchers have some ideas about how the differences in results arose. To begin with, there was an apples-and-oranges aspect to the surveys. For instance, the 23andMe survey asked, “How many 5-ounce (cup-sized) servings of caffeinated coffee do you consume each day?” Compare that to the UK Biobank’s “How many cups of coffee do you drink each day? (Include decaffeinated coffee)”

Beyond serving size and the caffeinated/decaf divide, the surveys made no accommodation for the various ways coffee is served. “We know that in the U.K., they have generally higher preference for instant coffee, whereas ground coffee is more preferred in the U.S.,” Thorpe said.

“And then there’s the frappuccinos,” Sanchez-Roige added, citing the American trend of taking coffee loaded with sugary additives. Palmer mentioned other caffeinated drinks, notably tea in the context of the UK Biobank, none of which were included in the GWAS, which addressed only coffee. He added that the GWAS demonstrates that the relationship between genotype and phenotype is less straightforward for a behaviour like coffee drinking than for traits less shaped by choice.

“Genetics influences lots of things. For instance, it influences how tall you might be,” he said. “And those kinds of things probably would play out very similarly, whether you lived in the US or the UK. But coffee is a decision that people make.”

Sanchez-Roige pointed out that coffee comes in a variety of forms, from instant to frappuccino, and is consumed amid cultural norms that differ from place to place. A person with a given genotype might end up having quite a different phenotype living in the UK versus the US.

“And that’s really what the data are telling us,” she said. “Because unlike height, where your behaviour doesn’t really have much to do with it, your behaviour and the choices you’re making in your environment play out in various ways. So the interaction between genotype and environment complicates the picture.”

The collaborators stressed the need for more investigation to unravel the relationships between genetics and the environment, focusing not only on coffee/caffeine intake but also other substance-use issues.

Source: University of California San Diego

Anaemia Reduction Efforts to Improve School Attendance May Be Ineffective

Photo by Mary Taylor on Pexels

In low- and middle-income countries, anaemia reduction efforts are often touted as a way to improve educational outcomes and reduce poverty. A new study, published in Communications Medicine, evaluates the relationship between anaemia and school attendance in India, debunking earlier research that could have misguided policy interventions.

Study co-author Santosh Kumar’s research explores the intersection of global health and poverty reduction.

The study investigated whether there was a link between anaemia and school attendance in more than 250 000 adolescents ages 15 to 18. Earlier observational studies have shown a link between anaemia and attendance, even after accounting for variables such as gender and household wealth, according to Kumar. But the new study, which applied more rigorous econometric statistical analysis, did not find such a link, he said.

“Most previous research on this topic has used conventional study designs or focused on small geographical areas, which limits its policy relevance,” said Kumar, associate professor of development and global health economics at the University of Notre Dame. “Earlier estimates may have been distorted by unobserved household factors related to both anaemia and school attendance. So in this study, we focused on the relationship between anaemia and attendance among adolescents who were living in the same household.

“Ultimately,” Kumar said, “we found that the link between anaemia and schooling is more muted than previously suggested by studies that did not consider household-level factors.”
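The within-household design Kumar describes can be illustrated with a toy simulation (this is not the study’s actual data, code, or model): when an unobserved household factor drives both anaemia and attendance, a naive pooled regression finds a spurious link, while comparing adolescents inside the same household removes the household-level confounder.

```python
import numpy as np

rng = np.random.default_rng(0)
n_households, kids = 5000, 2
hh = np.repeat(np.arange(n_households), kids)        # household id per adolescent
hh_effect = rng.normal(0, 2, n_households)           # unobserved household factor

# Anaemia is partly driven by the household factor...
anaemia = (rng.normal(0, 1, n_households * kids)
           + 0.5 * hh_effect[hh] > 0.5).astype(float)
# ...and in this simulation attendance depends ONLY on the household, not on anaemia
attendance = hh_effect[hh] + rng.normal(0, 1, n_households * kids)

def ols_slope(x, y):
    x, y = x - x.mean(), y - y.mean()
    return (x @ y) / (x @ x)

def demean_by_household(v):
    means = np.bincount(hh, weights=v) / np.bincount(hh)
    return v - means[hh]

naive = ols_slope(anaemia, attendance)               # confounded pooled estimate
fe = ols_slope(demean_by_household(anaemia),         # within-household estimate
               demean_by_household(attendance))
print(f"naive estimate: {naive:.3f}, within-household estimate: {fe:.3f}")
```

The naive slope comes out strongly positive even though anaemia has no causal effect here, while the within-household (fixed-effects) slope is close to zero, mirroring the “more muted” link the study reports.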

The findings have important implications for policymakers seeking to improve education in low- and middle-income countries like India, Kumar said. India has widespread school attendance issues and struggles with health conditions such as anaemia caused by iron deficiency, particularly in children and adolescents. The country has pushed to improve educational outcomes, in keeping with the United Nations’ Sustainable Development Goals, Kumar said. But to achieve that, he said, more research is needed to pinpoint an evidence-based intervention.

The latest study builds on an earlier one in which Kumar and fellow researchers helped evaluate the results of an iron fortification school lunch program for students ages 7 and 8 in India. That study showed that fortification reduced anaemia but did not affect students’ performance in school. A forthcoming study, set to launch in summer 2024, will look at iron fortification for children ages 3 to 5. The research hypothesis is that an early-age nutritional intervention among preschoolers would make a significant impact on physical and cognitive development.

“Our findings have implications for policymakers who want to improve educational outcomes and reduce poverty,” Kumar said. “Effective policies are based on evidence. We need more rigorous statistical analysis to examine the causal relationship between anaemia and education.

“This work ties into my larger research agenda, which explores the intersection of global health and poverty reduction. I want to use my academic research to support human dignity by helping to identify evidence-based health policies that will make a tangible difference in people’s lives.”

Source: University of Notre Dame