Author: ModernMedia

In Vitro Experiment Explains Why Humans Have Full Colour Vision and Dogs Don’t

Photo by Victor Freitas on Pexels

With human retinas grown in a petri dish, researchers discovered how retinoic acid, a metabolite of vitamin A, generates the specialised cells that enable people to see millions of colours, an ability that dogs, cats, and most other mammals do not have.

“These retinal organoids allowed us for the first time to study this very human-specific trait,” said author Robert Johnston, an associate professor of biology. “It’s a huge question about what makes us human, what makes us different.”

The findings, published in PLOS Biology, increase understanding of colour blindness, age-related vision loss, and other diseases linked to photoreceptor cells. They also demonstrate how genes instruct the human retina to make specific colour-sensing cells, a process scientists thought was controlled by thyroid hormones.

By tweaking the cellular properties of the organoids, the research team found that a vitamin A1 metabolite, retinoic acid, determines whether a cone will specialise in sensing red or green light.

Only humans with normal vision and closely related primates develop the red sensor.

For decades, it was thought that red cones formed through a coin-toss mechanism in which the cells haphazardly commit to sensing green or red wavelengths – and research from Johnston’s team recently hinted that the process could be controlled by thyroid hormone levels.

Instead, the new research suggests red cones materialise through a specific sequence of events orchestrated by retinoic acid within the eye.

The team found that high levels of retinoic acid in early development of the organoids correlated with higher ratios of green cones. Conversely, low levels of the acid changed the retina’s genetic instructions and generated red cones later in development.

“There still might be some randomness to it, but our big finding is that you make retinoic acid early in development,” Johnston said.

“This timing really matters for learning and understanding how these cone cells are made.”

Green and red cone cells are remarkably similar except for a protein called opsin, which detects light and tells the brain what colours people see.

Different opsins determine whether a cone will become a green or a red sensor, though the genes of each sensor remain 96% identical.

With a breakthrough technique that spotted those subtle genetic differences in the organoids, the team tracked cone ratio changes over 200 days.

“Because we can control in organoids the population of green and red cells, we can kind of push the pool to be more green or more red,” said author Sarah Hadyniak, who conducted the research as a doctoral student in Johnston’s lab and is now at Duke University.

“That has implications for figuring out exactly how retinoic acid is acting on genes.”

The researchers also mapped the widely varying ratios of these cells in the retinas of 700 adults.

Seeing how the green and red cone proportions changed in humans was one of the most surprising findings of the new research, Hadyniak said. Scientists still don’t fully understand how the ratio of green and red cones can vary so greatly without affecting someone’s vision.

If these types of cells determined the length of a human arm, the different ratios would produce “amazingly different” arm lengths, Johnston said.

To build understanding of diseases like macular degeneration, which causes loss of light-sensing cells near the center of the retina, the researchers are working with other Johns Hopkins labs.

The goal is to deepen their understanding of how cones and other cells link to the nervous system.

“The future hope is to help people with these vision problems,” Johnston said.

“It’s going to be a little while before that happens, but just knowing that we can make these different cell types is very, very promising.”

Source: Johns Hopkins University

In Type 2 Diabetics, Toxic Lipids and a Beneficial One Surge at Certain Times

Credit: Cell Reports Medicine (2023).

While sugar is most frequently blamed in the development of type 2 diabetes, a better understanding of the role of fats is also essential. By analysing the blood profiles of dozens of people suffering from diabetes or pre-diabetes, or who have had their pancreas partially removed, researchers at the University of Geneva (UNIGE) and Geneva University Hospitals (HUG) have made two major discoveries.

Firstly, the lipid composition of blood and adipose tissues fluctuates during the day, and is altered in a time-of-day-dependent manner in diabetics, who have higher levels of toxic lipids. In addition, one type of lipid, lysoPI, is capable of boosting insulin secretion when the beta cells that normally produce it fail. These results, published in the journals Cell Reports Medicine and Diabetes, may have important implications for the treatment of diabetic patients.

The role of lipids in the physiological and pathological processes of human metabolism is gradually becoming clearer, particularly in type 2 diabetes, one of the most widespread serious metabolic disorders. Thanks to cutting-edge tools, in particular mass spectrometry, researchers are now able to simultaneously measure the levels of several hundred different types of lipids, each with its own specific characteristics and beneficial or harmful effects on our metabolism.

‘‘Identifying which lipids are most present in type 2 diabetics could provide a basis for a wide range of interventions: early detection, prevention, potential therapeutic targets or personalised recommendations – the possibilities are immense,’’ says Charna Dibner, a professor in the Department of Surgery and a specialist in circadian rhythms in metabolic disorders. ‘‘This is why we carried out a detailed analysis of the blood profiles of patients recruited in four European countries and confirmed some of our results on a mouse model of the disease.’’

Dibner led the studies along with Pierre Maechler, a professor in the Department of Cell Physiology and Metabolism, at the UNIGE Faculty of Medicine, and members of the Diabetes Faculty Centre.

Chronobiology to better identify diabetes

The team carried out a ‘‘lipidomic’’ analysis of two groups of patients in order to establish the profile, over a 24-hour cycle, of multiple lipids present in the blood and adipose tissues. ‘‘The differences between the lipid profiles of type 2 diabetics and people without diabetes are particularly pronounced in the early morning, when there is an increase in certain toxic lipids,’’ explains Dibner. ‘‘Why? We don’t know yet. But this could be a marker of the severity of diabetes and paves the way for personalised care according to each patient’s specific chronotype.”

And implications go beyond diabetes: if samples are taken at very different times of the day, the results can be distorted and give contradictory results. ‘‘It’s the same thing in the clinic: an examination carried out in the morning or evening, or a treatment taken at different times, can have an impact on diagnosis and even on the effectiveness of treatments.’’

A crutch for beta cells

Charna Dibner and Pierre Maechler extended their lipidomic analyses to include not only people with type 2 diabetes but also a mouse model of pre-diabetes and patients who had lost around half their insulin-producing beta cells after surgery. ‘‘We discovered that a type of lipid, lysoPIs, increases when there is a sharp decrease in functional β cells, even before the onset of clinical symptoms of diabetes.’’

The scientists then administered lysoPI to diabetic mice and observed an increase in insulin production. ‘‘The same phenomenon occurred in vitro, on pancreatic cells from diabetic patients,’’ adds Pierre Maechler. ‘‘The lysoPIs therefore have the capacity to reinforce insulin secretion by acting as a crutch when the number of beta cells decreases or when these cells malfunction. Notably, certain foods, such as legumes, naturally contain lysoPI precursors.’’

By bringing to light the unsuspected role of lysoPIs, researchers will be able to explore new avenues opened by their discoveries. The development of dietary supplements or even molecules specific to lysoPI receptors could be an interesting strategy for controlling diabetes, as could taking better account of the chronobiological profiles of patients. Diabetes is a complex disease that calls for much more personalised management than is currently the case.

Source: University of Geneva

Mobile Phone Use Linked to Lower Sperm Count and Concentration

Photo by Ketut Subiyanto on Pexels

While various environmental and lifestyle factors have been proposed to explain the decline in semen quality observed over the last fifty years, the role of mobile phones has yet to be demonstrated. In a major cross-sectional study, researchers in Switzerland showed that frequent use of mobile phones is associated with a lower sperm concentration and total sperm count, although causation cannot be established. No association was seen between mobile phone use and low sperm motility and morphology. Read the results in Fertility & Sterility.

Semen quality is determined by the assessment of parameters such as sperm concentration, total sperm count, sperm motility and sperm morphology. According to values established by the World Health Organization (WHO), a man will most probably take more than a year to conceive a child if his sperm concentration is below 15 million/mL, and the odds of pregnancy decrease if the sperm concentration is below 40 million/mL.

Many studies have shown that semen quality has decreased over the last fifty years. Sperm count is reported to have dropped from an average of 99 million sperm/mL to 47 million/mL. This phenomenon is thought to be the result of a combination of environmental factors (endocrine disruptors, pesticides, radiation) and lifestyle habits (diet, alcohol, stress, smoking).

Assessing the impact of mobile phones

Is the mobile phone also to blame? After conducting the first national study (2019) on the semen quality of young men in Switzerland, a team from the University of Geneva (UNIGE) has published the largest cross-sectional study on this topic. It is based on data from 2886 Swiss men aged 18 to 22, recruited between 2005 and 2018 at six military conscription centres.

In collaboration with the Swiss Tropical and Public Health Institute (Swiss TPH), scientists studied the association between semen parameters of 2886 men and their use of mobile phones. ‘‘Men completed a detailed questionnaire related to their lifestyle habits, their general health status and more specifically the frequency at which they used their phones, as well as where they placed it when not in use,’’ explains Serge Nef, full professor in the Department of Genetic Medicine and Development at the UNIGE Faculty of Medicine and at the SCAHT – Swiss Centre for Applied Human Toxicology, who co-directed the study.

These data revealed an association between frequent use and lower sperm concentration. The median sperm concentration was significantly higher in the group of men who did not use their phone more than once a week (56.5 million/mL) compared with men who used their phone more than 20 times a day (44.5 million/mL). This difference corresponds to a 21% decrease in sperm concentration for frequent users (>20 times/day) compared to rare users (once a week or less).
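The 21% figure follows directly from the two reported medians. As a quick back-of-the-envelope check of the arithmetic (not code from the study):

```python
# Median sperm concentrations reported in the study (million/mL)
rare_users = 56.5      # phone used no more than once a week
frequent_users = 44.5  # phone used more than 20 times a day

# Relative decrease for frequent vs rare users
decrease_pct = (rare_users - frequent_users) / rare_users * 100
print(f"{decrease_pct:.0f}% decrease")  # prints "21% decrease"
```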

Is 4G less harmful than 2G?

This inverse association was found to be more pronounced in the first study period (2005-2007) and gradually decreased with time (2008-2011 and 2012-2018). ‘‘This trend corresponds to the transition from 2G to 3G, and then from 3G to 4G, that has led to a reduction in the transmitting power of phones,’’ explains Martin Röösli, associate professor at Swiss TPH.

‘‘Previous studies evaluating the relationship between the use of mobile phones and semen quality were performed on a relatively small number of individuals, rarely considering lifestyle information, and have been subject to selection bias, as they were recruited in fertility clinics. This has led to inconclusive results,’’ explains Rita Rahban, senior researcher and teaching assistant in the Department of Genetic Medicine and Development in the Faculty of Medicine at the UNIGE and at the SCAHT, first author and co-leader of the study.

It doesn’t matter where you put your phone

Data analysis also seems to show that the position of the phone – for example, in a trouser pocket – was not associated with lower semen parameters. ‘‘However, the number of people in this cohort indicating that they did not carry their phone close to their body was too small to draw a really robust conclusion on this specific point,’’ adds Rita Rahban.

This study, like most epidemiologic studies investigating the effects of mobile phone use on semen quality, relied on self-reported data, which is a limitation. By doing so, the frequency of use reported by the individual was assumed to be an accurate estimate of exposure to electromagnetic radiation. To address this limitation, a study funded by the Federal Office for the Environment (FOEN) was launched in 2023. Its aim is to directly and accurately measure exposure to electromagnetic waves, as well as the types of use – calls, web navigation, sending messages – and to assess their impact on male reproductive health and fertility potential. The data will be collected using an application that each future participant will download to their mobile phone. The research team is actively recruiting participants for this study.

The aim is also to better describe the mechanism of action behind these observations. ‘‘Do the microwaves emitted by mobile phones have a direct or indirect effect? Do they cause a significant increase in temperature in the testes? Do they affect the hormonal regulation of sperm production? This all remains to be discovered,’’ concludes Rita Rahban.

Source: University of Geneva

Removing Largest Serving Sizes of Wine Decreases Alcohol Consumption, Study Finds

When pubs, bars and restaurants in England removed their largest size of wine sold by the glass, consumers drank less alcohol

Photo from Pixabay CC0

Alcohol consumption is the fifth largest contributor to premature death and disease globally. Many cues in physical and economic environments influence alcohol consumption across populations. One proposed intervention to reduce excessive alcohol consumption is shrinking the serving sizes of alcoholic drinks sold by the glass, but there has been no real-world evidence of the effectiveness of this.

In the new study, researchers asked 21 licensed premises in England to remove from their menus their largest serving of wine by the glass – usually 250mL – for four weeks. The researchers then tracked the total volume of wine, beer and cider sold by each establishment.

Over the course of the four weeks, the total volume of wine sold by the licensed premises decreased by 7.6%, and there was no overall increase in beer and cider sales. There was an increase in the sales of smaller servings of wine by the glass – generally 125mL and 175mL – but no impact on sales of wine by the bottle or beer or cider sales.

“This suggests that this is a promising intervention for decreasing alcohol consumption across populations, which merits consideration as part of alcohol licensing regulations,” the authors say.

Senior author Theresa Marteau adds, “Removing the largest serving size of wine by the glass in 21 licensed premises reduced the volume of wine sold, in keeping with the wealth of research showing smaller serving sizes reduce how much we eat. This could become a novel intervention to improve population health by reducing how much we drink.”

Reduced Blood Lead Levels Tied to Lower Blood Pressure

Credit: Pixabay CC0

Researchers found that small declines in blood lead levels were associated with long-term cardiovascular health improvements in American Indian adults. Participants who had the greatest reductions in blood lead levels saw their systolic blood pressure fall by about 7mmHg, comparable to the effects of antihypertensives.

The findings, reported by researchers at Columbia University Mailman School of Public Health, NIEHS, and NHLBI, are published in the Journal of the American Heart Association.

“This is a huge win for public health,” said senior author Anne E. Nigra, PhD, assistant professor of environmental health sciences at Columbia Mailman School of Public Health.

“We saw that even small decreases in a person’s blood lead levels can have meaningful health outcomes.”

Nigra and her co-authors, including Wil Lieberman-Cribbin, MPH, also at Columbia Mailman School, credit these improvements in large part to public health and policy changes that have occurred over the last few decades.

In addition to seeing improvements in systolic blood pressure, the investigators found that reductions in blood lead levels were associated with reductions in a marker associated with hypertrophic cardiomyopathy and heart failure.

To conduct this research, investigators partnered with 285 American Indian adults through an extension of the Strong Heart Study, the largest study following cardiovascular health outcomes and risk factors among American Indian adults.

The researchers looked at blood lead levels and blood pressure readings over time in participants living in one of four tribal communities. Lead was first measured in blood collected during the 1997–1999 study visit and again in blood collected during a follow-up visit between 2006 and 2009.

During this time, participants’ blood pressure was taken and they participated in medical exams, including echocardiograms to assess their heart’s structure and function. Multiple factors were controlled for, including social variables, cardiovascular disease risks, and medical history.

At the start of the study, the average blood lead level was 2.04µg/dL. Throughout the study, the average blood lead level fell by 0.67µg/dL, or 33%.

The most significant changes were seen in participants with average starting blood lead levels of 3.21 µg/dL who experienced reductions of about 1.78 µg/dL, or 55%; these reductions were linked to a 7mmHg fall in systolic blood pressure.
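Both percentage reductions quoted in this study can be reproduced from the reported averages. As a quick arithmetic sanity check (not code from the study):

```python
# Cohort-wide blood lead figures reported in the study (µg/dL)
baseline_avg = 2.04   # average level at the start of the study
drop_avg = 0.67       # average decline over the study period
print(round(drop_avg / baseline_avg * 100))  # prints 33 (the reported 33%)

# Subgroup with the largest changes
baseline_sub = 3.21
drop_sub = 1.78
print(round(drop_sub / baseline_sub * 100))  # prints 55 (the reported 55%)
```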

“This is a sign that whatever is happening in these communities to reduce blood lead levels is working,” said Mona Puggal, MPH, an epidemiologist in the Division of Cardiovascular Sciences at the National Heart, Lung, and Blood Institute (NHLBI). “The reductions in blood pressure are also comparable to improvements you would see with lifestyle changes, such as getting 30 minutes of daily exercise, reducing salt intake, or losing weight.”

The reductions in blood lead levels observed in the study are similar to those seen in the general US population following policies and efforts implemented within the past 50 years to reduce lead exposure through paint, gasoline, water, plumbing, and canned items.

Source: Columbia University’s Mailman School of Public Health


Amnesia from Head Injury Reversed in Early Mouse Study

Photo by Olga Guryanova on Unsplash

A mouse-based study investigating memory loss in people who experience repeated head impacts, such as athletes, suggests the condition could potentially be reversed. The research finds that amnesia and poor memory following head injury are due to inadequate reactivation of neurons involved in forming memories.

The study, conducted by researchers at Georgetown University Medical Center in collaboration with Trinity College Dublin, Ireland, is reported in the Journal of Neuroscience.

Importantly for diagnostic and treatment purposes, the researchers found that the memory loss attributed to head injury was not a permanent pathological event driven by a neurodegenerative disease.

Indeed, the researchers could reverse the amnesia to allow the mice to recall the lost memory, potentially allowing cognitive impairment caused by head impact to be clinically reversed.

The Georgetown investigators had previously found that the brain adapts to repeated head impacts by changing the way the synapses in the brain operate, which can cause trouble in memory storage and retrieval.

In their new study, investigators were able to trigger mice to remember memories that had been forgotten due to head impacts.

“Our research gives us hope that we can design treatments to return the head-impact brain to its normal condition and recover cognitive function in humans that have poor memory caused by repeated head impacts,” says the study’s senior investigator, Mark Burns, PhD, a professor and Vice-Chair in Georgetown’s Department of Neuroscience and director of the Laboratory for Brain Injury and Dementia.

In the new study, the scientists gave two groups of mice a new memory by training them in a test they had never seen before. One group was exposed to a high frequency of mild head impacts for one week (similar to contact sport exposure in people), and one group served as controls that didn’t receive the impacts. The impacted mice were unable to recall the new memory a week later.

“Most research in this area has been in human brains with chronic traumatic encephalopathy (CTE), which is a degenerative brain disease found in people with a history of repetitive head impact,” said Burns.

“By contrast, our goal was to understand how the brain changes in response to the low-level head impacts that many young football players regularly experience.”

Researchers have found that, on average, college football players receive 21 head impacts per week with defensive ends receiving 41 head impacts per week.

The number of head impacts given to mice in this study was designed to mimic a week of exposure for a college football player, and each single head impact by itself was extraordinarily mild.

Using genetically modified mice allowed the researchers to see the neurons involved in learning new memories, and they found that these memory neurons (the “memory engram”) were equally present in both the control mice and the experimental mice.

To understand the physiology underlying these memory changes, study first author Daniel P. Chapman, PhD, said, “We are good at associating memories with places, and that’s because being in a place, or seeing a photo of a place, causes a reactivation of our memory engrams. This is why we examined the engram neurons to look for the specific signature of an activated neuron. When the mice see the room where they first learned the memory, the control mice are able to activate their memory engram, but the head impact mice were not. This is what was causing the amnesia.”

The researchers were able to reverse the amnesia to allow the mice to remember the lost memory using lasers to activate the engram cells.

“We used an invasive technique to reverse memory loss in our mice, and unfortunately this is not translatable to humans,” Burns adds.

“We are currently studying a number of non-invasive techniques to try to communicate to the brain that it is no longer in danger, and to open a window of plasticity that can reset the brain to its former state.”

Source: Georgetown University Medical Center

  1. Daniel P. Chapman, Sarah D. Power, Stefano Vicini, Tomás J. Ryan, Mark P. Burns. Amnesia after repeated head impact is caused by impaired synaptic plasticity in the memory engram. The Journal of Neuroscience, 2024; e1560232024. DOI: 10.1523/JNEUROSCI.1560-23.2024

Scientists May Have Found out How Rapid-acting Antidepressants Work

Photo by Marek Piwnicki

Rapid-acting antidepressants, including ketamine, scopolamine and psilocybin, have been found to have immediate and lasting positive effects on mood in patients with major depressive disorder, but how these effects arise is unknown. New research led by the University of Bristol and published in Science Translational Medicine explored their neuropsychological effects and found that all three of these drugs can modulate affective biases associated with learning and memory.

Negative affective biases are a core feature of major depressive disorder. Affective biases occur when emotions alter how the brain processes information and negative affective biases are thought to contribute to the development and continuation of depressed mood.

The research team used an affective bias test, based on an associative learning task, to investigate the effects of rapid-acting antidepressants (RAADs) in rats.

They found that all the treatments were able to reduce negative affective biases associated with past experiences but there were additional characteristics of the dissociative anaesthetic, ketamine, and the serotonergic psychedelic, investigational COMP360 psilocybin (Compass Pathways’ proprietary formulation of synthetic psilocybin), which could explain why the effects of a single treatment can be long-lasting.

The findings suggest that these sustained effects are due to adaptive changes in the brain circuits which control affective biases, and these can influence how past experiences are remembered.

The effects at low doses were very specific to affective bias modulation and were localised to the prefrontal cortex of the brain, a region known to play an important role in mood.

Emma Robinson, Professor of Psychopharmacology in the School of Physiology, Pharmacology & Neuroscience at Bristol, and lead author, said: “Using a behavioural task we showed that drugs that are believed to have rapid and sustained benefits in depressed patients, specifically modulate affective biases associated with past experiences, something which we think is really important for understanding why they can improve a patient’s mood so quickly.

“We also found differences in how ketamine, scopolamine and COMP360 psilocybin interact with these neuropsychological mechanisms which may explain why the effects of a single treatment in human patients can be long-lasting, days (ketamine) to months (psilocybin).

“By using an animal model, we have been able to investigate these important interactions with learning and memory processes and neural plasticity and propose a two-stage model that may explain the effects we observe.”

In the task, each animal learnt to associate a specific digging material with a food reward under either treatment or control conditions.

The treatment condition is designed to generate a change in the animal’s affective state and a choice test is used to quantify the affective bias this generates.

Acute treatment with the RAADs ketamine, scopolamine, or psilocybin prevented the retrieval of the negative affective bias induced in this model.

However, the most exciting finding was at 24 hours after treatment when low, but not high, doses of ketamine and psilocybin led to a re-learning effect where the negatively biased memory was retrieved with a more positive affective valence.

Only psilocybin, and not ketamine or scopolamine, also positively biased new experiences.

Exploring in more detail the re-learning effects of ketamine in the studies, the researchers found they were protein synthesis-dependent, localised to the medial prefrontal cortex and could be modulated by cue-reactivation, consistent with their predictions of experience-dependent neural plasticity.

The study’s findings propose a neuropsychological mechanism that may explain both the immediate and sustained effects of RAADs, potentially linking their effects on neural plasticity with mood.

Source: University of Bristol

How Calorie Restriction Slows Aging in the Brain

Photo by Pixabay

Restricting calories is known to improve health and increase lifespan, but much of how it does so remains a mystery, especially in regard to how it protects the brain. Now, scientists from the Buck Institute for Research on Aging have uncovered a role for a gene called OXR1 that is necessary for the lifespan extension seen with dietary restriction and is essential for healthy brain aging.

“When people restrict the amount of food that they eat, they typically think it might affect their digestive tract or fat buildup, but not necessarily about how it affects the brain,” said Kenneth Wilson, PhD, Buck postdoc and first author of the study, published in Nature Communications. “As it turns out, this is a gene that is important in the brain.”

The team additionally demonstrated a detailed cellular mechanism of how dietary restriction can delay aging and slow the progression of neurodegenerative diseases. The work, done in fruit flies and human cells, also identifies potential therapeutic targets to slow aging and age-related neurodegenerative diseases.

“We found a neuron-specific response that mediates the neuroprotection of dietary restriction,” said Buck Professor Pankaj Kapahi, PhD, co-senior author of the study. “Strategies such as intermittent fasting or caloric restriction, which limit nutrients, may enhance levels of this gene to mediate its protective effects.”

“The gene is an important brain resilience factor protecting against aging and neurological diseases,” said Buck Professor Lisa Ellerby, PhD, co-senior author of the study.

Understanding variability in response to dietary restriction

Members of the team have previously shown mechanisms that improve lifespan and healthspan with dietary restriction, but it was not clear why there is so much variability in response to reduced calories across individuals and different tissues. This project was started to understand why different people respond to diets in different ways.

The team began by scanning about 200 strains of flies with different genetic backgrounds. The flies were raised with two different diets, either with a normal diet or with dietary restriction, which was only 10% of normal nutrition. Researchers identified five genes which had specific variants that significantly affected longevity under dietary restriction. Of those, two had counterparts in human genetics.

The team chose one gene to explore thoroughly, called “mustard” (mtd) in fruit flies and “Oxidation Resistance 1” (OXR1) in humans and mice. The gene protects cells from oxidative damage, but the mechanism for how this gene functions was unclear. The loss of OXR1 in humans results in severe neurological defects and premature death. In mice, extra OXR1 improves survival in a model of amyotrophic lateral sclerosis (ALS).

The link between brain aging, neurodegeneration and lifespan

To figure out how a gene that is active in neurons affects overall lifespan, the team did a series of in-depth tests. They found that OXR1 affects a complex called the retromer, which is a set of proteins necessary for recycling cellular proteins and lipids. “The retromer is an important mechanism in neurons because it determines the fate of all proteins that are brought into the cell,” said Wilson. Retromer dysfunction has been associated with age-related neurodegenerative diseases that dietary restriction protects against, specifically Alzheimer’s and Parkinson’s diseases.

Overall, their results told the story of how dietary restriction slows brain aging by the action of mtd/OXR1 in maintaining the retromer. “This work shows that the retromer pathway, which is involved in reusing cellular proteins, has a key role in protecting neurons when nutrients are limited,” said Kapahi. The team found that mtd/OXR1 preserves retromer function and is necessary for neuronal function, healthy brain aging, and lifespan extension seen with dietary restriction.

“Diet is influencing this gene. By eating less, you are actually enhancing this mechanism of proteins being sorted properly in your cells, because your cells are enhancing the expression of OXR1,” said Wilson.

The team also found that boosting mtd in flies caused them to live longer, leading researchers to speculate that in humans excess expression of OXR1 might help extend lifespan. “Our next step is to identify specific compounds that increase the levels of OXR1 during aging to delay brain aging,” said Ellerby.

“Hopefully from this we can get more of an idea of why our brains degenerate in the first place,” said Wilson.

“Diet impacts all the processes in your body,” he said. “I think this work supports efforts to follow a healthy diet, because what you eat is going to affect more than you know.”

Source: Buck Institute for Research on Aging

Researchers Urge Caution in Co-prescribing Potency Drugs and Nitrates

Photo by Freestocksorg on Pexels

Co-prescribing potency drugs such as Viagra and organic nitrates for angina is associated with a 35–40% increased mortality risk and about 70% higher risk of heart attack and heart failure. This is according to a Swedish registry study published in the Journal of the American College of Cardiology. The Swedish researchers are now urging caution.

Drugs for erectile dysfunction or impotence containing phosphodiesterase type 5 (PDE5) inhibitors are contraindicated in combination with the organic nitrates used to treat angina. Because the two types of drugs enhance each other’s blood-pressure-lowering effect, they can cause serious side effects, including death, if taken together.

But many people who treat angina with organic nitrates use the medication as emergency relief for a sudden onset of angina. The medication is quickly absorbed by the body, exerts its effect, and then breaks down quickly again. It is not usually a permanent treatment, although maintenance treatment is possible. 

Does not necessarily indicate an increased risk

Potency drugs are also taken as needed, which theoretically makes it possible to separate the two treatments in time to avoid side effects. If patients are aware of these factors, co-prescribing does not necessarily mean an increased risk.

Previous studies have shown that an increasing number of men who treat their angina with organic nitrates are also prescribed potency drugs. However, there is no evidence that side effects have increased. 

The picture is not entirely clear, as phosphodiesterase type 5 inhibitors have also been shown to reduce the risk of death and heart failure in men with cardiovascular disease but without angina.

“There is an increasing demand for medication for erectile dysfunction from men with cardiovascular disease. And even if these drugs are beneficial for most men with cardiovascular disease, those who are also treated with nitrates need to weigh the benefits of the drug against the cardiovascular risks,” says first author Ylva Trolle Lagerros, Associate Professor at the Department of Medicine at Karolinska Institutet.

To find out what the actual risk of concurrent prescribing is, the researchers used Swedish health registers between 2005 and 2013. They found nearly 61,500 men who had been prescribed organic nitrates, of whom just over 5,700 had also been prescribed one of the potency drugs in question. A clear majority of those who had a prescription for both medications used nitrates as an emergency treatment only.

Adjusted for differences

The men who received the drugs were on average nine years younger and significantly healthier than those who did not receive them. The researchers therefore had to adjust for these and other differences.

The adjusted results show that co-prescribing potency drugs containing phosphodiesterase type 5 inhibitors and organic nitrates is associated with a 35–40% increased risk of death. In addition, the researchers show an approximately 70% increased risk of heart attack and heart failure. This suggests that separating the two treatments in time does not fully work in practice.

“We want to point out the importance of careful and patient-centered consideration before prescribing this type of potency medication to men treated with nitrates,” says Ylva Trolle Lagerros.

Source: Karolinska Institutet

Study Finds Screen Time for Toddlers is a Bad Idea

Photo by Helena Lopes on Unsplash

Babies and toddlers exposed to television or video viewing may be more likely to exhibit atypical sensory behaviours, such as being disengaged and disinterested in activities, seeking more intense stimulation in an environment, or being overwhelmed by sensations like loud sounds or bright lights, according to data from researchers at Drexel’s College of Medicine published in the journal JAMA Pediatrics.

According to the researchers, children exposed to greater TV viewing by their second birthday were more likely to develop atypical sensory processing behaviours, such as “sensation seeking” and “sensation avoiding,” as well as “low registration” – being less sensitive or slower to respond to stimuli, such as their name being called, by 33 months old.

Sensory processing skills reflect the body’s ability to respond efficiently and appropriately to information and stimuli received by its sensory systems, such as what the toddler hears, sees, touches, and tastes.

The team pulled 2011–2014 data on television or DVD viewing by babies and toddlers at 12, 18, and 24 months from the National Children’s Study, a nationwide cohort of 1,471 children (50% male).

Sensory processing outcomes were assessed at 33 months using the Infant/Toddler Sensory Profile (ITSP), a questionnaire completed by parents/caregivers designed to give insights into how children process what they see, hear, smell, and so on.

ITSP subscales examine children’s patterns of low registration; sensation seeking, such as excessively touching or smelling objects; sensory sensitivity, such as being overly upset or irritated by lights and noise; and sensation avoiding, such as actively trying to control their environment to avoid things like having their teeth brushed. Children score in “typical,” “high,” or “low” groups based on how often they display various sensory-related behaviours. Scores were considered “typical” if they were within one standard deviation of the ITSP norm average.
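The “typical” band described above is a one-standard-deviation rule around the normative mean. A minimal sketch of that classification (the norm mean and SD values used below are hypothetical placeholders, not the published ITSP norms):

```python
def classify_itsp_score(score: float, norm_mean: float, norm_sd: float) -> str:
    """Classify a subscale score relative to normative data.

    Scores within one standard deviation of the norm mean are 'typical';
    scores above that band are 'high', and below it 'low'.
    """
    z = (score - norm_mean) / norm_sd
    if z > 1.0:
        return "high"
    if z < -1.0:
        return "low"
    return "typical"

# Hypothetical norm values for illustration only.
print(classify_itsp_score(52.0, norm_mean=50.0, norm_sd=5.0))  # within 1 SD -> "typical"
print(classify_itsp_score(62.0, norm_mean=50.0, norm_sd=5.0))  # > 1 SD above -> "high"
```

Note that in the actual instrument the direction of scoring differs by subscale, so this is only an illustration of the one-SD banding, not of how any particular ITSP subscale is keyed.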

Measurements of screen exposure at 12 months were based on caregiver responses to the question: “Does your child watch TV and/or DVDs? (yes/no),” and at 18 and 24 months on the question: “Over the past 30 days, on average, how many hours per day did your child watch TV and/or DVDs?”

The findings suggest:

  • At 12 months, any screen exposure compared to no screen viewing was associated with a 105% greater likelihood of exhibiting “high” sensory behaviours instead of “typical” sensory behaviours related to low registration at 33 months.
  • At 18 months, each additional hour of daily screen time was associated with a 23% increased odds of exhibiting “high” sensory behaviours related to later sensation avoiding and low registration.
  • At 24 months, each additional hour of daily screen time was associated with a 20% increased odds of “high” sensation seeking, sensory sensitivity, and sensation avoiding at 33 months.
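The per-hour figures above are odds ratios, which compound multiplicatively across hours and are not the same as a change in probability. A short sketch of how to read them (the 1.20 odds ratio is taken from the 24-month bullet; the 10% baseline probability is an assumption for illustration):

```python
def scaled_odds_ratio(or_per_hour: float, hours: float) -> float:
    """Odds ratios for a per-unit exposure compound multiplicatively."""
    return or_per_hour ** hours

def updated_probability(baseline_prob: float, odds_ratio: float) -> float:
    """Convert a baseline probability to the probability implied by an odds ratio."""
    odds = baseline_prob / (1 - baseline_prob)
    new_odds = odds * odds_ratio
    return new_odds / (1 + new_odds)

# Each extra daily hour at 24 months carried ~20% higher odds (OR = 1.20),
# so three extra hours correspond to an OR of about 1.20 ** 3.
print(round(scaled_odds_ratio(1.20, 3), 3))  # -> 1.728

# With an assumed 10% baseline probability, an OR of 1.20 raises the
# probability to roughly 11.8%, not to 12% (odds != probability).
print(round(updated_probability(0.10, 1.20), 3))  # -> 0.118
```

This is why a “20% increased odds” headline understates or overstates the probability change depending on the baseline rate, which the study bullets do not report.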

The researchers adjusted for age, whether the child was born prematurely, caregiver education, race/ethnicity and other factors, such as how often the child engages in play or walks with the caregiver.

The findings add to a growing list of concerning health and developmental outcomes linked to screen time in infants and toddlers, including language delay, autism spectrum disorder, behavioural issues, sleep struggles, attention problems and problem-solving delays.

“This association could have important implications for attention deficit hyperactivity disorder and autism, as atypical sensory processing is much more prevalent in these populations,” said lead author Karen Heffler, MD, an associate professor of Psychiatry in Drexel’s College of Medicine. “Repetitive behaviour, such as that seen in autism spectrum disorder, is highly correlated with atypical sensory processing. Future work may determine whether early life screen time could fuel the sensory brain hyperconnectivity seen in autism spectrum disorders, such as heightened brain responses to sensory stimulation.”

Atypical sensory processing in kids with autism spectrum disorder (ASD) and ADHD manifests in a range of detrimental behaviours. In children with ASD, greater sensation seeking or sensation avoiding, heightened sensory sensitivity and low registration have been associated with irritability, hyperactivity, eating and sleeping struggles, as well as social problems. In kids with ADHD, atypical sensory processing is linked to trouble with executive function, anxiety and lower quality of life.

“Considering this link between high screen time and a growing list of developmental and behavioural problems, it may be beneficial for toddlers exhibiting these symptoms to undergo a period of screen time reduction, along with sensory processing practices delivered by occupational therapists,” said Heffler.

The American Academy of Pediatrics (AAP) discourages screen time for babies under 18–24 months. Live video chat is considered by the AAP to be okay, as there may be benefit from the interaction that takes place. The AAP recommends limiting digital media use for children ages two to five years to typically no more than one hour per day.

“Parent training and education are key to minimising, or hopefully even avoiding, screen time in children younger than two years,” said senior author David Bennett, PhD, a professor of Psychiatry in Drexel’s College of Medicine.

Source: Drexel’s College of Medicine