Scientists Trace the Neural Pathway of the Placebo Effect

Photo by Danilo Alvesd on Unsplash

The placebo effect is very real. This we’ve known for decades, as seen in real-life observations and the best double-blinded randomised clinical trials researchers have devised for many diseases and conditions, especially pain. And yet, how and why the placebo effect occurs has remained a mystery. Now, neuroscientists have discovered a key piece of the placebo effect puzzle, reporting it in Nature.

Researchers at the University of North Carolina School of Medicine – with colleagues from Stanford, the Howard Hughes Medical Institute, and the Allen Institute for Brain Science – discovered a pain control pathway that links the cingulate cortex in the front of the brain, through the pons region of the brainstem, to the cerebellum in the back of the brain.

The researchers, led by Greg Scherrer, PharmD, PhD, associate professor in the UNC Department of Cell Biology and Physiology, the UNC Neuroscience Center, and the UNC Department of Pharmacology, then showed that certain neurons and synapses along this pathway are highly activated when mice expect pain relief and experience pain relief, even when there is no medication involved.

“That neurons in our cerebral cortex communicate with the pons and cerebellum to adjust pain thresholds based on our expectations is both completely unexpected, given our previous understanding of the pain circuitry, and incredibly exciting,” said Scherrer. “Our results do open the possibility of activating this pathway through other therapeutic means, such as drugs or neurostimulation methods to treat pain.”

Scherrer and colleagues said the research provides a new framework for investigating the brain pathways underlying other mind-body interactions and placebo effects beyond the ones involved in pain.

The Placebo Paradox

Shaped by millennia of evolution, our brains can find ways to alleviate the sensation of pain – in some cases quantifiably, as with released chemicals, and less quantifiably through positive thinking and even prayer, which have some documented benefit. And then there is the placebo effect.

In clinical research, the placebo effect is often seen in the “sham” treatment group that receives a fake pill or intervention that is supposed to be inert; no benefit is expected. Except that the brain is so powerful and individuals so desire to feel better that some experience a marked improvement in their symptoms. Some placebo effects are so strong that individuals are convinced they received a real treatment meant to help them.

In fact, it’s thought that some individuals in the “actual” treatment group also derive benefit from the placebo effect, complicating experimental design and driving larger sample sizes. One way to help scientists account for this is to first understand what precisely is happening in the brain of someone experiencing the placebo effect.

Enter the Scherrer lab

The scientific community’s understanding of the biological underpinnings of pain relief through placebo analgesia came from human brain imaging studies, which showed activity in certain brain regions. Those imaging studies did not have enough precision to show what was actually happening in those brain regions. So Scherrer’s team designed a set of meticulous, complementary, and time-consuming experiments to learn in more detail, with single nerve cell precision, what was happening in those regions.

First, the researchers created an assay that generates in mice the expectation of pain relief and then the very real placebo effect of pain relief. Then the researchers used a series of experimental methods to study the intricacies of the anterior cingulate cortex (ACC), which had been previously associated with the pain placebo effect. While mice were experiencing the effect, the scientists used genetic tagging of neurons in the ACC, imaging of calcium in neurons of freely behaving mice, single-cell RNA sequencing techniques, electrophysiological recordings, and optogenetics – the use of light and fluorescent-tagged genes to manipulate cells.

These experiments helped them see and study the intricate neurobiology of the placebo effect down to the brain circuits, neurons, and synapses throughout the brain.

The scientists found that when mice expected pain relief, the rostral anterior cingulate cortex neurons projected their signals to the pontine nucleus, which had no previously established function in pain or pain relief. And they found that expectation of pain relief boosted signals along this pathway.

“There is an extraordinary abundance of opioid receptors here, supporting a role in pain modulation,” Scherrer said. “When we inhibited activity in this pathway, we realised we were disrupting placebo analgesia and decreasing pain thresholds. And then, in the absence of placebo conditioning, when we activated this pathway, we caused pain relief.”

Lastly, the scientists found that Purkinje cells – a distinct class of large branch-like cells of the cerebellum – showed activity patterns similar to those of the ACC neurons during pain relief expectation. Scherrer and first author Chong Chen, MD, PhD, a postdoctoral research associate in the Scherrer lab, said that this is cellular-level evidence for the cerebellum’s role in cognitive pain modulation.

“We all know we need better ways to treat chronic pain, particularly treatments without harmful side effects and addictive properties,” Scherrer said. “We think our findings open the door to targeting this novel neural pain pathway to treat people in a different but potentially more effective way.”

Source: University of North Carolina Health Care

A New Genetic Culprit in Huntington’s Disease

Photo by Sangharsh Lohakare on Unsplash

Researchers in Berlin and Düsseldorf have implicated a new gene in the progression of Huntington’s disease in a brain organoid model. The gene may contribute to brain abnormalities much earlier than previously thought. The study is out now in Nature Communications.

The researchers are the first to implicate the gene CHCHD2 in Huntington’s disease (HD) – an incurable genetic neurodegenerative disorder – and identified the gene as a potentially new therapeutic target. In a brain organoid model of the disease, the researchers found that mutations in the Huntington gene HTT also affect CHCHD2, which is involved in maintaining the normal function of mitochondria.

Six different labs at the Max Delbrück Center participated in the study, led by Dr Jakob Metzger of the “Quantitative Stem Cell Biology” lab at the Max Delbrück Center and the “Stem Cell Metabolism” lab of Professor Alessandro Prigione at Heinrich Heine University Düsseldorf (HHU). Each lab contributed its unique expertise on Huntington’s disease, brain organoids, stem cell research and genome editing. “We were surprised to find that Huntington’s disease can impair early brain development through defects associated with mitochondrial dysfunction,” says Dr Pawel Lisowski, co-lead author in the Metzger lab at the Max Delbrück Center.

Moreover, “the organoid model suggests that HTT mutations damage brain development even before clinical symptoms appear, highlighting the importance of detecting the late-onset neurodegenerative disease early,” adds Selene Lickfett, co-lead author and a doctoral student in the Faculty of Mathematics and Natural Sciences in the lab of Prigione at HHU.

The unusual repetition of three letters

Huntington’s disease is caused when the nucleotides cytosine, adenine and guanine (CAG) are repeated an excessive number of times in the Huntington gene HTT. People with 35 or fewer repeats are generally not at risk of developing the disease, while carrying 36 or more repeats has been associated with disease. The greater the number of repeats, the earlier the disease symptoms are likely to appear, explains Metzger, a senior author of the study. The mutations cause nerve cells in the brain to progressively die. Those affected steadily lose muscle control and develop psychiatric symptoms such as impulsiveness, delusions and hallucinations. Huntington’s disease affects approximately five to 10 in every 100 000 people worldwide. Existing therapies only treat the symptoms of the disease; they don’t slow its progression or cure it.
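The repeat-count rule described above can be sketched as a simple classifier. This is an illustrative simplification of the thresholds quoted in the article only – real clinical classification also recognises intermediate and reduced-penetrance ranges:

```python
def cag_risk_category(repeats: int) -> str:
    """Classify an HTT CAG repeat count using the simplified
    thresholds quoted in the article: 35 or fewer repeats is
    generally not associated with disease risk, 36 or more is.
    (Clinical classification is more nuanced than this.)"""
    if repeats <= 35:
        return "not at risk"
    return "associated with disease"
```

The boundary between 35 and 36 repeats is the key fact the article reports, along with the observation that higher counts correlate with earlier symptom onset.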

The challenge of HTT gene editing

To study how mutations in the HTT gene affect early brain development, Lisowski first used variants of the Cas9 gene editing technology and manipulation of DNA repair pathways to modify healthy induced pluripotent stem cells such that they carry a large number of CAG repeats. This was technically challenging because gene editing tools are not efficient in gene regions that contain sequence repeats, such as the CAG repeats in HTT, says Lisowski.

The genetically modified stem cells were then grown into brain organoids – three-dimensional structures a few millimetres in size that resemble early-stage human brains. When the researchers analysed gene expression profiles of the organoids at different stages of development, they noticed that the CHCHD2 gene was consistently underexpressed, which reduced the metabolism of neuronal cells. CHCHD2 is involved in ensuring the health of mitochondria – the energy producing structures in cells. CHCHD2 has been implicated in Parkinson’s disease, but never before in Huntington’s.

They also found that when they restored the function of the CHCHD2 gene, they could reverse the effect on neuronal cells. “That was surprising,” says Selene Lickfett. “It suggests in principle that this gene could be a target for future therapies.”

Moreover, defects in neural progenitor cells and brain organoids occurred before potentially toxic aggregates of mutated Huntingtin protein had developed, adds Metzger, indicating that disease pathology in the brain may begin long before it is clinically evident.

“The prevalent view is that the disease progresses as a degeneration of mature neurons,” says Prigione. “But if changes in the brain already develop early in life, then therapeutic strategies may have to focus on much earlier time-points.”

Wide-reaching implications

“Our genome editing strategies, in particular the removal of the CAG repeat region in the Huntington gene, showed great promise in reversing some of the observed developmental defects. This suggests a potential gene therapy approach,” says Prigione. Another potential approach could be therapies to increase CHCHD2 gene expression, he adds.

The findings may also have broader applications for other neurodegenerative diseases, Prigione adds. “Early treatments that reverse the mitochondrial phenotypes shown here could be a promising avenue for counteracting age-related diseases like Huntington’s disease.”

Source: Max Delbrück Center for Molecular Medicine in the Helmholtz Association

Alzheimer’s Drug may Slow Cognitive Decline in Dementia with Lewy Bodies

Created with Gencraft. CC4.0

Dementia with Lewy bodies is a type of dementia that is similar to both Alzheimer’s disease and Parkinson’s disease but studies on long-term treatments are lacking. A new study from Karolinska Institutet, published in Alzheimer’s & Dementia: The Journal of the Alzheimer’s Association, highlights the potential cognitive benefits of cholinesterase inhibitor treatment.

Lewy body disease, which includes dementia with Lewy bodies (DLB) and Parkinson’s disease with and without dementia, is the second most common neurodegenerative disorder, following Alzheimer’s disease. 

DLB accounts for approximately 10–15% of dementia cases and is characterised by changes in sleep, behaviour, cognition, movement, and regulation of automatic bodily functions. 

“There are currently no approved treatments for DLB, so doctors often use drugs for Alzheimer’s disease, such as cholinesterase inhibitors and memantine, for symptom relief,” says Hong Xu, assistant professor at the Department of Neurobiology, Care Sciences and Society, Karolinska Institutet and first author of the paper. “However, the effectiveness of these treatments remains uncertain due to inconsistent trial results and limited long-term data.” 

In the current study, researchers have examined the long-term effects of cholinesterase inhibitors (ChEIs) and memantine compared with no treatment for up to ten years in 1,095 patients with DLB.

Slower cognitive decline

They found that ChEIs may slow down cognitive decline over five years compared to memantine or no treatment. ChEIs were also associated with a reduced risk of death in the first year after diagnosis. 

“Our results highlight the potential benefits of ChEIs for patients with DLB and support updating treatment guidelines,” says Maria Eriksdotter, professor at the Department of Neurobiology, Care Sciences and Society, Karolinska Institutet and last author of the paper.  

Due to the study’s observational nature, no conclusions can be drawn about causality. The researchers did not have data on patient lifestyle habits, frailty, blood pressure, and Alzheimer’s disease co-pathology, which may have influenced the findings. Another limitation of the study is that it remains challenging to diagnose DLB accurately. 

Source: Karolinska Institutet

Snake Antivenom Mired by Shortages and Side-effects – Could a New Treatment Boost Our Options?

By Jesse Copelyn

The only effective treatment for severe snakebite envenomation from a potentially deadly snake is antivenom. (Picture: Johan Marais, African Snakebite Institute)

In recent years, shortages of snake antivenom have plagued South Africa and much of the globe. Even when antivenom is available, potentially serious side effects often limit its use. Jesse Copelyn unpacks the fascinating details behind the antivenom products that might save your life and takes a look at a promising experimental treatment.

Every day, somewhere between 220 and 380 people die from snakebite around the world, yet according to Doctors Without Borders (MSF) the problem remains “chronically underfunded and neglected”.

MSF’s senior advisor on neglected tropical diseases, Koert Ritmeijer, tells Spotlight that in 2019 the World Health Organization (WHO) committed to halve the number of snakebite deaths by 2030. But this hasn’t been followed by any significant support from donor countries or philanthropic foundations, he says, with programmes that aim to “increase patients’ access to antivenoms [remaining] very underfunded”.

Antivenom is the primary registered class of treatments for snakebite. Long-running global shortages of the treatment continue to leave patients in poorer parts of the world without the care they need. That’s in part because pharmaceutical companies haven’t always found it profitable to produce. The treatment takes a long time to manufacture and it has to be geared to a specific or small group of snake species. The clientele are often those that are least able to pay, namely Africa and Asia’s rural poor.

With a limited number of suppliers, and thus a lack of market competition, prices remain high, making it difficult for many African governments to import antivenom without donor assistance.

South Africa has long been an exception. It’s the only country in Sub-Saharan Africa that produces its own antivenom – which is internationally recognised and outperforms several comparable products when studied in mice. Additionally, research done in the Western Cape shows that hospital pharmacies have traditionally had stock on hand.

However, last year production delays at the country’s state-run manufacturer, the South African Vaccine Producers (SAVP), led to a nation-wide shortage of antivenom, leaving many snakebite victims with limited options. The locally made SAVP products (previously the SAIMR) have historically been the main source of antivenom in the country.

While the nation-wide stockouts were reportedly resolved within a few months, this appears to have been more than a one-time blip. Even before the national shortage made news headlines last year, neighbouring countries that rely on South African antivenom were struggling to procure enough of the product. For instance, a 2022 study states that supply to Eswatini was becoming “increasingly and disturbingly intermittent” and that a charitable foundation there had “been unable to secure a supply of this antivenom for several months”.

Meanwhile, Johan Marais, CEO of the African Snakebite Institute, told Spotlight that vets in South Africa have been struggling to access sufficient supplies for the last three years (the same SAVP products that are used to treat humans are sometimes used on pet dogs that get bitten). He said stocks in the country were low heading into snakebite season – spring and summer.

Southern Africa has 176 different types of snakes. (Infographic: African Snakebite Institute)

Compounding the problem is that even at the best of times, people in poor rural areas often struggle to access antivenom shortly after being bitten (even though the issue can be extremely time-sensitive). That’s because patients can’t simply get the drug at a local pharmacy or out of their medicine cabinet. Instead, they need to wait until they arrive at an intensive care unit before doctors can assess whether they require the treatment – and if they do, it has to be administered intravenously under carefully monitored conditions.

This is because antivenom comes with a range of side-effects. For instance, research at a KwaZulu-Natal hospital found that over three in five patients had some adverse reaction to the antidote, with nearly half of all patients going into anaphylactic shock, a severe allergic reaction that causes a person’s blood pressure to drop and makes it difficult to breathe. Consequently, health workers need to be on standby with adrenaline.

People who get bitten in rural areas far away from large healthcare facilities are thus often left in a precarious position despite being most at risk.

But why do antivenom products have so many side-effects in the first place? And why are they so difficult and time-consuming to manufacture? The answer has to do with the archaic way that antivenom is made.

A life-saving drug made of ‘horse junk’

In South Africa, the SAVP, which is a subsidiary of the National Health Laboratory Services (NHLS), makes antivenom by injecting small amounts of snake venom into a horse, so that the animal’s immune system can learn to recognise and combat the toxins. This is done repeatedly over a period of nine months until the horse becomes hyperimmunised, meaning its body produces massive numbers of antibodies which target the venom.

NHLS spokesperson, Mzi Gcukumana, says that once this happens, the horse’s “plasma is collected” (this is the liquid part of blood that contains the antibodies), and is then “carefully filtered in a sterile environment”.

The result is an antivenom product which targets the snake species that was used on the horse in the first place. At present, the SAVP makes three antivenoms: one for boomslang bites, another which is used for the saw-scaled viper, and a third, which treats bites from 10 different snake species found across the country, including the puff adder and rinkhals (which are some of the most common culprits of snakebite in South Africa). This multi-species product, known as a polyvalent antivenom, is made by injecting the horse with venom from different snake species.

There’s nothing unusual about the SAVP’s method – it’s the way everyone has made commercial snake antivenoms since the late 19th century (in some countries sheep are used instead of horses) and while it’s effective, it is also well-understood why it often induces serious side-effects.

Dr Kurt Wibmer, a scientist who is researching a new snakebite treatment, explains that with antivenom “only 10 to 20 percent of the medicine you’re getting is specific to the venom, and the other 80% is junk horse protein that your body doesn’t [need].” The result is that by injecting antivenom “you’re putting a bunch of foreign substances into your body, that the body then recognises as ‘not you’, and it develops an immune response to that [which can sometimes be extreme]”.

To add to these problems, antivenom is expensive. According to price lists shared with Spotlight by staff at the Tygerberg Poison Information Centre, the SAVP’s polyvalent product is currently priced at R2 400 per vial, while the boomslang product is sold for R7 700. And since most of each vial is “junk horse protein”, snakebite victims require multiple vials to get enough of the active ingredient – this could be six vials, or it could be over 20.
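Those figures imply a wide cost range per patient. A back-of-the-envelope calculation, using the per-vial price quoted above (the vial counts are the article's reported range, not a dosing guideline):

```python
POLYVALENT_PRICE_RAND = 2400  # per-vial price quoted by the Tygerberg Poison Information Centre

def course_cost_rand(vials: int) -> int:
    """Total cost in rand for a course of the given number of vials."""
    return vials * POLYVALENT_PRICE_RAND

low_end = course_cost_rand(6)    # R14 400 for a six-vial course
high_end = course_cost_rand(20)  # R48 000 for a 20-vial course
```

So even before the vial count exceeds 20, a single treatment course runs well into the tens of thousands of rand.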

With a high price tag, a laborious production process, and a host of side-effects that prevent health workers from saving snakebite victims at primary healthcare facilities, new treatments are badly needed – either to replace traditional antivenom or to complement it. Fortunately, many are being designed to address exactly these problems.

Over a hundred years later, anti-inflammatory drug may expand treatment options

One promising treatment in development is a synthetic anti-inflammatory medicine called Varespladib. This drug was originally developed as a treatment for conditions like acute coronary syndrome. After these efforts were abandoned, scientists discovered that the product may be able to play a role in treating snakebite victims.

That’s because Varespladib works by inhibiting an enzyme called sPLA2. sPLA2 is a core component of the venom of roughly 95% of vipers and elapids (two prominent venomous snake families). In fact, the enzyme plays a key role in many of the most harmful effects of this venom, including its ability to damage body tissue, paralyse victims, and cause heavy bleeding.

It is early days. Until now, the only evidence that suggests that Varespladib can block these effects is from studies done on mice and in petri dishes. However, a phase 2 clinical trial on humans, which has yet to be published, was just completed in the United States and India. In this research, snakebite victims who arrived at hospitals were randomly split into two groups: one got the Varespladib alongside standard treatment (like antivenom), while the other group got a placebo, plus ordinary treatment.

The results from the trial are currently under peer-review, though preliminary findings have been presented at conferences. Dr Matthew R Lewin, a co-author of the study and founder of a public benefit company that is developing the drug, told Spotlight that it looked like Varespladib may help snakebite patients if used immediately (and when used alongside traditional treatment): “In the [trial] we took patients as long as 10 hours [after they had been bitten]. Up to 5 hours, we saw promising outcomes [from those who got Varespladib]… after 5 hours, the benefit was not apparent with respect to the primary outcome of the study”.

Time will tell whether these results will be confirmed not only in the peer-review process, but also in larger clinical trials (the present study only aimed to enrol 94 participants). If successful, Varespladib could represent an important advance.

That’s because safety trials show that unlike antivenom, the synthetic drug does not appear to cause any major side-effects. It can also be taken in pill form, rather than being injected. This means that while snakebite victims would still have to wait until they got to a hospital to take antivenom, they would at least have something which they could take right away. While the study only looked at the effects of varespladib in combination with antivenom, Lewin suggests that “it is reasonable to expect that there will be a range of salutary [health-giving] effects” from varespladib alone, since it blocks sPLA2. He notes, however, that more research is needed.

And Varespladib isn’t the only new treatment in development. Others include a chelating agent that targets a metal-based component of snake venom, though the evidence for this so far comes only from studies in mice.

Nonetheless, since our primary treatment option for snakebite remains similar to what it was over a century ago, researchers are hopeful that we might finally begin to take a few steps forward.

Note: This is part 1 of a two-part Spotlight series on snakebite treatment in South Africa. In part 2 we will, among other things, look at promising advances that may help reduce antivenom shortages.

Republished from Spotlight under a Creative Commons licence.


Even When Informed, Participants Found Placebos Reduced Anxiety

Even when participants knew they had placebos, their COVID anxiety reduced

Photo from Pixabay CC0

A study out of Michigan State University found that non-deceptive placebos, or placebos given with people fully knowing they are placebos, effectively manage stress – even when the placebos are administered remotely. 

Researchers recruited participants experiencing prolonged stress from the COVID pandemic for a two-week randomised controlled trial. Half of the participants were randomly assigned to a non-deceptive placebo group and the other half to the control group that took no pills. The participants interacted with a researcher online through four virtual sessions on Zoom. Those in the non-deceptive placebo group received information on the placebo effect and were sent placebo pills in the mail along with instructions on taking the pills. 

The study, which appears in Applied Psychology: Health and Well-Being, found that the non-deceptive group showed a significant decrease in stress, anxiety and depression in just two weeks compared to the no-treatment control group. Participants also reported that the non-deceptive placebos were easy to use, not burdensome and appropriate for the situation.

“Exposure to long-term stress can impair a person’s ability to manage emotions and cause significant mental health problems, so we’re excited to see that an intervention that takes minimal effort can still lead to significant benefits,” said Jason Moser, co-author of the study and professor in MSU’s Department of Psychology. “This minimal burden makes non-deceptive placebos an attractive intervention for those with significant stress, anxiety and depression.”

The researchers are particularly hopeful about the potential for healthcare providers to administer non-deceptive placebos remotely.

“This ability to administer non-deceptive placebos remotely increases scalability potential dramatically,” said Darwin Guevarra, co-author of the study and postdoctoral scholar at the University of California, San Francisco. “Remotely administered non-deceptive placebos have the potential to help individuals struggling with mental health concerns who otherwise would not have access to traditional mental health services.”

Source: Michigan State University

Chronic Cough may be a Hereditary Condition

Photo by Towfiqu barbhuiya on Unsplash

Chronic cough is among the most common reasons for seeking medical care, with middle-aged women the group most affected. A pair of new studies published in ERJ Open Research and PLOS ONE suggest that this may be a hereditary condition.

“More than 10% of the population has a chronic cough, which has been shown to entail several negative consequences: reduced quality of life, reduced ability to work and voice problems. At present, we have insufficient knowledge about what causes coughing and how best to treat it,” notes Össur Ingi Emilsson, Docent in Lung, Allergy and Sleep Research at the Department of Medical Sciences at Uppsala University.

The two studies from the department have investigated both how cough is currently managed in Swedish healthcare, and whether chronic cough can be hereditary.

The PLOS ONE study, based on data from the Swedish healthcare register, showed that 1–2% of the entire Swedish population sought care for chronic cough between 2016 and 2018, usually in primary care. Of those who sought care, the majority appear to have had a long-standing cough. The prevalence is highest among women between the ages of 40 and 60, with around 21 000 women seeking treatment for cough in these three years.

“Women generally seem to have a slightly more sensitive cough reflex, so the threshold for abnormal coughing is lower in women than in men. For me, it was unexpected that only one to two percent of patients seek help for a troublesome cough when over ten percent are affected. This can be partly explained by the lack of effective treatments. There also appeared to be some differences in care between different parts of the country, suggesting that better guidelines are needed for investigating and treating chronic cough,” continues Emilsson.

The other study, in ERJ Open Research, has provided a clue as to why some individuals develop chronic cough. Cough appears to be a hereditary phenomenon. In a large population study in northern Europe of 7155 parents and their 8176 adult children aged 20 years and over, it was found that if one parent has had chronic dry cough, their offspring were over 50% more likely to have chronic dry cough. This link was independent of confounding factors such as asthma, biological sex and smoking.

“A similar relationship was seen for productive cough, but in those cases smoking had a greater impact on prevalence. These results suggest that there is a genetic link to chronic cough,” adds Emilsson.

The research team has already begun a treatment study into chronic cough. Based on these new findings, the group is now moving forward with studies on genetic variants in collaboration with the Icelandic company deCODE genetics, which analyses the human genome. The aim is to identify which genetic variants are linked to chronic cough.

“This could provide a better understanding of the occurrence of chronic cough, which may ultimately result in better treatments for this difficult-to-treat condition,” explains Emilsson.

Source: Uppsala University

More Protein and Fibre While Dropping Calories is Key for Weight Loss

Photo by Andres Ayrton on Pexels

Participants on a self-directed dietary education program who had the greatest success at losing weight across a 25-month period consumed greater amounts of protein and fibre, found a study published in Obesity Science and Practice. Personalisation and flexibility also were key in creating plans that dieters could adhere to over time. 

At the one-year mark, successful dieters (41% of participants) had lost 12.9% of their body weight, compared with the remainder of the study sample, who lost slightly more than 2% of their starting weight. 

The dieters were participants in the Individualised Diet Improvement Program, which uses data visualisation tools and intensive dietary education sessions to increase dieters’ knowledge of key nutrients, enabling them to create a personalised, safe and effective weight-loss plan, said Manabu T. Nakamura, a professor of nutrition at the University of Illinois Urbana-Champaign and the leader of the research.

“Flexibility and personalisation are key in creating programs that optimise dieters’ success at losing weight and keeping it off,” Nakamura said. “Sustainable dietary change, which varies from person to person, must be achieved to maintain a healthy weight. The iDip approach allows participants to experiment with various dietary iterations, and the knowledge and skills they develop while losing weight serve as the foundation for sustainable maintenance.”

The pillars of iDip are increasing protein and fibre consumption along with consuming 1500 calories or less daily. 

Based on the dietary guidelines issued by the Institute of Medicine, the iDip team created a one-of-a-kind, two-dimensional quantitative data visualisation tool that plots foods’ protein and fibre densities per calorie and provides a target range for each meal. Starting with foods they habitually ate, the dieters created an individualised plan, increasing their protein intake to about 80g and their fibre intake to about 20g daily.
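The protein-and-fibre-per-calorie axes that the tool plots can be illustrated with a short sketch. The function and the derived target densities below are inferred from the figures reported here (~80g protein, ~20g fibre, 1500 calories daily), not taken from the iDip tool itself:

```python
def density_per_kcal(grams: float, calories: float) -> float:
    """Grams of a nutrient per calorie consumed - the kind of
    quantity plotted on each axis of the visualisation tool."""
    return grams / calories

# Daily targets reported in the article, spread over a
# 1500-calorie budget:
protein_density = density_per_kcal(80, 1500)  # ~0.053 g protein per kcal
fibre_density = density_per_kcal(20, 1500)    # ~0.013 g fibre per kcal
```

Plotting individual foods in this two-dimensional space makes it easy to see which habitual foods already sit near the target range and which fall short.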

In tracking the participants’ eating habits and their weights with Wi-Fi enabled scales, the team found strong inverse correlations between the percentages of fibre and protein eaten and dieters’ weight loss.    

“The research strongly suggests that increasing protein and fibre intake while simultaneously reducing calories is required to optimise the safety and efficacy of weight loss diets,” said first author and U. of I. alumna Mindy H. Lee, who was a graduate student and registered dietitian-nutritionist for the iDip program at the time of the study.

Nakamura said the preservation of lean mass is very important while losing weight, especially when using weight-loss drugs.

 “Recently, the popularity of injectable weight loss medications has been increasing,” Nakamura said. “However, using these medications when food intake is strongly limited will cause serious side effects of muscle and bone loss unless protein intake is increased during weight loss.”

A total of 22 people who enrolled in the program completed it, including nine men and 13 women. Most of the dieters were between the ages of 30 and 64. Participants reported they had made two or more prior attempts to lose weight. They also had a variety of comorbidities – 54% had high cholesterol, 50% had skeletal problems and 36% had hypertension and/or sleep apnoea. Additionally, the dieters reported diagnoses of diabetes, nonalcoholic fatty liver disease, cancer and depression, according to the study.

The seven dieters who reported they had been diagnosed with depression lost significantly less weight, about 2.4% of their starting weight, compared with those without depression, who lost 8.39% of their initial weight. Weight loss did not differ significantly among participants with other comorbidities, between younger and older participants, or between men and women.

Body composition analysis indicated that dieters maintained their lean body mass, losing an average of 7.1kg of fat mass and minimal muscle mass at the six-month interval. Among those who lost greater than 5% of their starting weight, 78% of the weight they lost was fat, according to the study.

Overall, the participants reduced their fat mass from an average of 42.6kg at the beginning of the program to 35.7kg at the 15-month mark. Likewise, the dieters reduced their waists by about 7cm at six months and by a total of 9cm at 15 months, the team found. 

In tracking dieters’ protein and fibre intake, the team found a strong correlation between protein and fibre consumption and weight loss at three months and 12 months.

“The strong correlation suggests that participants who were able to develop sustainable dietary changes within the first three months kept losing weight in the subsequent months, whereas those who had difficulty implementing sustainable dietary patterns early on rarely succeeded in changing their diet in the later months,” Nakamura said.

The team hypothesised that this correlation could also have been associated with some dieters’ early weight loss success, which may have bolstered their motivation and adherence to their program.

Source: University of Illinois at Urbana-Champaign

Study Shows that Probiotics in Pregnancy Benefit Mothers and Offspring

Photo by SHVETS production on Pexels

Giving probiotics to pregnant mice can enhance both the immune system and behaviour of the mothers and their offspring, according to a new study led by The Ohio State University Wexner Medical Center and College of Medicine.

“These results suggest that certain probiotics given to mothers during pregnancy can improve their offsprings’ behaviour and may affect the metabolism of common amino acids in our diets. Probiotics may also help counteract the negative effects of prenatal stress,” said study senior author Tamar Gur, MD, PhD, at OSU. 

Study findings are published online in the journal Brain, Behavior, and Immunity.

Many studies have attested to the benefits of probiotics, which are considered safe to take during pregnancy. Researchers led by first author Jeffrey Galley, PhD, found that a specific probiotic, Bifidobacterium dentium, may change how the body processes certain amino acids, such as tryptophan. During pregnancy, tryptophan helps control inflammation and brain development.

“We have strong evidence this specific probiotic helped reduce stress-related problems in both mothers and their offspring, including helping the babies gain weight and improving their social behaviour,” said Gur, who also is an associate professor of psychiatry, neuroscience and obstetrics and gynaecology at Ohio State. 

Gur’s research team has studied how prenatal stress can lead to abnormal brain development and behavioural changes in offspring. So far, they’ve found that stress is linked to changes in brain inflammation and amino acid metabolism, as well as long-term reductions in social behaviour and abnormal microbiomes in offspring.

This study enhances their understanding of how gut microbes and probiotics can influence amino acid metabolism and help with behaviour and immune issues related to prenatal stress. The study also highlights the many benefits of this specific probiotic, even without the presence of stress.

“Now, we aim to understand the mechanisms behind these changes and explore ways to prevent or treat these effects,” Gur said. “Since prenatal stress is common in many pregnancies, we want to develop methods to reduce its negative effects.”

Source: Ohio State University Wexner Medical Center

Over Half of Iron Deficiency Cases Unresolved After Three Years

Photo by National Cancer Institute on Unsplash

Over half of people with iron deficiency were found to still have low iron levels three years after diagnosis, and even patients whose condition was effectively treated within that timeframe faced longer-than-expected delays, pointing to substantial gaps in the recognition and efficient treatment of the condition, according to a study published in Blood Advances.

Iron deficiency is common, affecting up to 40% of adolescents and young women. Previous work reported that up to 70% of cases go undiagnosed in high-risk populations, such as people with bleeding disorders or malabsorption issues, and women who menstruate.

“Iron deficiency is probably a bigger problem than we realise. I’ve seen a lot of cases where people don’t have anaemia, but they are walking around with very little to no iron in their body and it can have a big impact on how people feel in their day-to-day life,” said Jacob Cogan, MD, assistant professor of medicine at the University of Minnesota and the study’s lead author. “Iron deficiency can be challenging to diagnose, but it’s easy to treat. Our findings underscore the need for a more coordinated effort to recognise and treat iron deficiency to help improve quality of life.”

If untreated, low iron stores can lead to mood changes, fatigue, hair loss, exercise intolerance, and eventually anaemia. The condition is generally first treated with oral iron supplementation, and if low iron levels persist after a few months or the patient reports side effects, intravenous (IV) iron is started.

For this study, the researchers retrospectively analysed medical records from one of Minnesota’s largest health system databases and identified 13 084 adults with a laboratory diagnosis of iron deficiency (defined as a ferritin value of 25ng/mL or less, with or without anaemia) between 2010 and 2020 who had available follow-up data for three years.

In the study, iron deficiency was defined as a ferritin value of 25ng/mL or less. Patients had to have at least two ferritin values – one initial value and at least one more within the three-year study period. Adequate treatment and resolution was defined as a subsequent ferritin value of at least 50ng/mL. Most patients received some form of treatment, with rates consistent across sexes.
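Stated as a simple rule, the study’s thresholds (ferritin at or below 25ng/mL for deficiency, at or above 50ng/mL for resolution) reduce follow-up classification to a single check, sketched below. This is an illustration of the thresholds only, not the authors’ actual analysis code.

```python
DEFICIENT_MAX = 25.0  # ng/mL: at or below this value, a patient is iron deficient
RESOLVED_MIN = 50.0   # ng/mL: at or above this value, deficiency is considered resolved

def classify_followup(ferritin_values):
    """Given follow-up ferritin values (ng/mL) after an initial diagnosis,
    return 'resolved' if any subsequent value reaches the resolution
    threshold, otherwise 'persistent'."""
    return "resolved" if any(v >= RESOLVED_MIN for v in ferritin_values) else "persistent"

print(classify_followup([18.0, 31.0, 55.0]))  # ferritin eventually recovers
print(classify_followup([12.0, 20.0, 24.5]))  # deficiency persists
```

Note the gap between the two cut-offs: a patient whose ferritin hovers between 25 and 50ng/mL is no longer deficient by the diagnostic criterion but does not count as resolved, which is one reason consistent follow-up testing matters.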

Of the 13 084 patients included in the study, 5485 (42%) had normal iron levels within three years of diagnosis, while 7599 (58%) had persisting iron deficiency based on low ferritin levels. Only 7% of patients had their iron levels return to normal within the first year of diagnosis.

Factors associated with a higher likelihood of getting iron levels back to normal included older age (age 60 and up), male sex, Medicare insurance, and treatment with IV iron alone. Additionally, compared with patients who were still iron deficient, those whose condition was resolved had more follow-up blood work to check ferritin values (six vs four ferritin tests). Of note, younger patients, females, and Black individuals were most likely to remain iron deficient or experience longer lags in getting their iron stores back to a healthy level.

Even among patients whose iron levels were restored to normal during the study period, it took nearly two years (the median time to resolution was 1.9 years), which researchers say is longer than expected and signals missed opportunities to manage the condition more effectively. While there was no data to assess whether iron deficiency with anaemia was more apt to be treated, Dr Cogan says it is reasonable to think this might be the case, as iron deficiency without anaemia is harder to recognise.

“Two years is too long and well beyond the timeframe within which iron deficiency should be able to be sufficiently treated and resolved [with oral or IV treatments],” said Dr Cogan. “The numbers are pretty striking and suggest a need to put systems in place to better identify patients and treat them more efficiently.”

As with trends showing persisting iron deficiency, Dr Cogan attributes the delays in resolution to the diagnosis either being missed or not treated to resolution. He added that there is a clear need for education about non-anaemic iron deficiency and who is at high risk, more universal agreement on the best ferritin cut-off for diagnosis, and efforts to create an iron deficiency clinic or pathway to “assess and treat patients more efficiently and get people feeling better faster.”

The study was limited by its reliance on EMR data and retrospective nature, which prevented researchers from determining why ferritin tests were ordered for patients or the cause of their iron deficiency.

Source: American Society of Hematology

Study Shows Fewer Kidney Stones with Higher Doses of Thiazides

Human kidney. Credit: Scientific Animations CC0

Higher thiazide doses are associated with greater reductions in urine calcium, which in turn correlate with fewer symptomatic kidney stone events, according to a Vanderbilt University Medical Center study out now in JAMA Network Open.  

Thiazide diuretics, commonly prescribed to prevent kidney stone recurrence, are drugs that act directly on the kidneys to promote diuresis by inhibiting the sodium/chloride cotransporter located in the distal convoluted tubule of a nephron. Thiazides are also used as a common treatment for high blood pressure and to clear fluid from the body in conditions such as heart failure. 

First author Ryan Hsi, MD, FACS, associate professor in the Department of Urology at VUMC, said the study data help explain the findings of the multicentre Hydrochlorothiazide for Kidney Stone Recurrence Prevention (NOSTONE) trial, which reported that hydrochlorothiazide did not reduce recurrence of kidney stone events.  

“In light of our research, the calcium reductions in that study were modest and likely insufficient to affect recurrence risk,” Hsi said.   

“What this means for patients is that thiazides remain an important option in the toolkit for preventing kidney stone recurrence. It may be beneficial to monitor calcium excretion while on thiazide therapy to adjust dose and diet to attain an adequate reduction in urine calcium.” 

A total of 634 participants were studied, revealing significant associations between higher thiazide doses and urine calcium reductions greater than those achieved in the NOSTONE trial, in which participants took different doses of hydrochlorothiazide.

For next steps, the researchers are interested in understanding which subtypes of thiazides and their dosing work best, and how best to optimise medication adherence, since these therapies are often administered long term.

Source: Vanderbilt University Medical Center