Year: 2022

Non-toxic Liquid Metal Breaks Down Aluminium Medical Implants

Photo by Louise Reed on Unsplash

By taking advantage of a phenomenon that is usually an engineering headache, MIT researchers have designed a liquid metal to safely disintegrate metal medical implants and drug depots when they are not needed anymore.

In their work published in Advanced Materials, the researchers showed that aluminium biomedical devices can be disintegrated by exposing them to a liquid metal known as eutectic gallium-indium (EGaIn). In practice, this might work by painting the liquid onto staples used to hold skin together, for example, or by administering EGaIn microparticles to patients.

According to the researchers, disintegrating metal devices in this way could eliminate the need for surgical or endoscopic removal procedures.

“It’s a really dramatic phenomenon that can be applied to several settings,” says senior author Giovanni Traverso, assistant professor of mechanical engineering at MIT and a gastroenterologist at Brigham and Women’s Hospital. “What this enables, potentially, is the ability to have systems that don’t require an intervention such as an endoscopy or surgical procedure for removal of devices.”

Breaking down metals

For several years, Traverso’s lab has been working on ingestible devices that could remain in the digestive tract for days or weeks, releasing drugs on a specific schedule.

Most of those devices are made from polymers, but recently the researchers have been exploring the possibility of using metals, which are stronger and more durable. However, one of the challenges of delivering metal devices is finding a way to remove them once they’re no longer needed.

Right now, removing the staples can actually induce more tissue damage

Vivian Feig, MIT postdoc & first author

To create devices that could be broken down on demand inside the body, the MIT team turned to liquid metal embrittlement. This process has been well studied as a source of failure in metal structures, including those made from zinc and stainless steel, and it is why liquid metals such as mercury are not allowed on aircraft.

“It’s known that certain combinations of liquid metals can actually get into the grain boundaries of solid metals and cause them to dramatically weaken and fail,” says first author Vivian Feig, an MIT postdoc. “We wanted to see if we could harness that known failure mechanism in a productive way to build these biomedical devices.”

One room-temperature liquid metal that can induce embrittlement is gallium. For this study, the researchers used eutectic gallium-indium, an alloy of gallium that scientists have explored for a variety of applications in biomedicine as well as energy and flexible electronics.

For the devices themselves, the researchers chose to use aluminium, which is known to be susceptible to embrittlement when exposed to gallium.

Gallium weakens solid metals such as aluminium in two ways. First, it can diffuse through the grain boundaries of the metal – border lines between the crystals that make up the metal – causing pieces of the metal to break off. The MIT team showed that they could harness this phenomenon by designing metals with different types of grain structures, allowing the metals to break into small pieces or to fracture at a given point.

Gallium also prevents aluminium from forming a protective oxide layer on its surface, which increases the metal’s exposure to water and enhances its degradation.

The MIT team showed that after they painted gallium-indium onto aluminium devices, the metals would disintegrate within minutes. The researchers also created nanoparticles and microparticles of gallium-indium and showed that these particles, suspended in fluid, could also break down aluminium structures.

On-demand disintegration

While the researchers began this effort as a way to create devices that could be broken down in the gastrointestinal tract, they soon realised that it could also be applied to other biomedical devices such as staples and stents.

To demonstrate GI applications, the researchers designed a star-shaped device, with arms attached to a central elastomer by a hollow aluminium tube. Drugs can be carried in the arms, and the shape of the device helps it be retained in the GI tract for an extended period of time. In a study in animals, the researchers showed that this kind of device could be broken down in the GI tract upon treatment with gallium-indium.

The researchers then created aluminium staples and showed that they could be used to hold tissue together, then dissolved with a coating of gallium-indium.

“Right now, removing the staples can actually induce more tissue damage,” Feig says. “We showed that with our gallium formulation we can just paint it on the staples and get them to disintegrate on-demand instead.”

The researchers also showed that an aluminium stent they designed could be implanted in oesophageal tissue, then broken down by gallium-indium.

Currently, oesophageal stents are either left in the body permanently or endoscopically removed when no longer needed. Such stents are often made from metals such as nitinol, an alloy of nickel and titanium. The researchers are now working to see if they could create dissolvable devices from nitinol and other metals.

“An exciting thing to explore from a materials science perspective is: Can we take other metals that are more commonly used in the clinic and modify them so that they can become actively triggerable as well?” Feig says.

Initial toxicity studies in rodents showed that gallium-indium was non-toxic even at high doses. However, more study would be needed to ensure it would be safe to administer to patients, the researchers say.

Source: MIT

High-fat Diet can Cause Pain Sensitivity without Obesity or Diabetes

Woman holding her wrist in pain

A new study using a mouse model suggests that short-term exposure to a high-fat diet may be linked to pain sensations, such as from a light touch, even without a prior injury or a pre-existing condition like obesity or diabetes. The finding may also help explain, in part, the severity of the opioid crisis.

The study, published in Scientific Reports, compared the effects of eight weeks of different diets on two cohorts of mice. One group received normal chow, while the other was fed a high-fat diet in a way that did not precipitate the development of obesity or high blood sugar, both of which are conditions that can result in diabetic neuropathy and other types of pain.

The researchers found that the high-fat diet induced hyperalgesic priming – a neurological change that represents the transition from acute to chronic pain – and allodynia, which is pain resulting from stimuli that do not normally provoke pain.

“This study indicates you don’t need obesity to trigger pain; you don’t need diabetes; you don’t need a pathology or injury at all,” said Dr Michael Burton, assistant professor of neuroscience and corresponding author of the article. “Eating a high-fat diet for a short period of time is enough – a diet similar to what almost all of us in the US eat at some point.”

The study also compared obese, diabetic mice with those that just experienced dietary changes.

“It became clear, surprisingly, that you don’t need an underlying pathology or obesity. You just needed the diet,” Burton said. “This is the first study to demonstrate the influential role of a short exposure to a high-fat diet to allodynia or chronic pain.”

Diet itself caused markers of neuronal injury.

Western diets are rich in fats – in particular saturated fats, which have proved to be responsible for an epidemic of obesity, diabetes and associated conditions. Individuals who consume high amounts of saturated fats – like butter, cheese and red meat – have high amounts of free fatty acids circulating in their bloodstream that in turn induce systemic inflammation.

Recently, scientists have shown that these high-fat diets also increase existing mechanical pain sensitivity in the absence of obesity, and that they can also aggravate pre-existing conditions or impede injury recovery. To date, no studies have explained how high-fat diets alone can be a sensitising factor in inducing pain from nonpainful stimuli, such as a light touch on the skin, Burton said.

“We’ve seen in the past that, in models of diabetes or obesity, only a subsection of the people or animals experience allodynia, and if they do, it varies across a spectrum, and it isn’t clear why,” Burton said. “We hypothesized that there had to be other precipitating factors.”

The researchers examined blood levels of fatty acids in the mice. They found that a fatty acid called palmitic acid, the most common saturated fatty acid in animals, binds to a particular receptor on nerve cells, a process that results in inflammation and mimics injury to the neurons.

“The metabolites from the diet are causing inflammation before we see pathology develop,” Burton said. “Diet itself caused markers of neuronal injury.

The mechanism behind this transition is important because it is the presence of chronic pain – from whatever source – that is fuelling the opioid epidemic

“Now that we see that it’s the sensory neurons that are affected, how is it happening? We discovered that if you take away the receptor that the palmitic acid binds to, you don’t see that sensitising effect on those neurons. That suggests there’s a way to block it pharmacologically.”

Burton said the next step will be to focus on the neurons themselves – how they are activated and how injuries to them can be reversed. It is part of a larger effort to understand better the transition from acute to chronic pain.

“The mechanism behind this transition is important because it is the presence of chronic pain – from whatever source – that is fuelling the opioid epidemic,” he said. “If we figure out a way to prevent that transition from acute to chronic, it could do a lot of good.”

Burton said he hopes his research encourages health care professionals to consider the role diet plays in influencing pain.

“The biggest reason we do research like this is because we want to understand our physiology completely,” he said. “Now, when a patient goes to a clinician, they treat a symptom, based off of an underlying disease or condition. Maybe we need to pay more attention to how the patient got there: Does the patient have diabetes-induced or obesity-induced inflammation; has a terrible diet sensitised them to pain more than they realized? That would be a paradigm shift.”

Source: University of Texas at Dallas

Cannabis Worsens Peripheral Artery Bypass Outcomes

Photo by RODNAE Productions from Pexels

Cannabis use may negatively impact outcomes in peripheral artery bypass (PAB) surgery, suggests a study published in Annals of Vascular Surgery.

Researchers analysed more than 11 000 available cases to review patient cannabis use and postoperative outcomes for lower extremity bypass after 30 days and one year. The minimally invasive PAB procedure uses a vein or synthetic tube to divert blood around a narrowed or blocked artery in a leg.

Results reveal that patients who used cannabis prior to lower extremity bypass had decreased patency, meaning the graft had a higher chance of becoming blocked or occluded, and were 1.25 times more likely to require amputation one year after surgery. Cannabis users were also 1.56 times more likely to use opioids after discharge.

“The findings show a need for screening for cannabis use and open conversations between patients and clinicians to help inform preoperative risk assessment and decision-making for lower extremity bypass,” said senior author Peter Henke, MD, FACS, FAHA.

“While its exact mechanisms are unclear, cannabis and its active compounds play a role in platelet function and microcirculation that may lead to decreased rates of limb salvage after lower extremity bypass,” Henke said.

Around 43% of individuals in the United States and Canada have used cannabis. Previous studies suggest cannabis use has effects on the cardiovascular system, including increased risk of heart attack and stroke. The study did not find any association with stroke or heart attack after lower extremity bypass.

While future study is needed to further understand cannabis’ full effect on outcomes, researchers note, the findings will help clinicians counsel patients who are undergoing vascular surgery.

“While past studies on the effects of cannabis use on pain response suggested an increase in pain tolerance after smoking cannabis, our studies and other contemporary findings show the opposite,” said Drew Braet, MD, first author and integrated vascular surgery resident at U-M Health. “Given the increase in cannabis use and abuse in conjunction with the opioid epidemic, the results suggest a need for a better understanding of pain management for cannabis users who are having vascular surgery.”

Source: Michigan Medicine – University of Michigan

Delivery Method Affects Babies’ Vaccine Responses

Photo by Christian Bowen on Unsplash

The method by which a baby was delivered is associated with how its immune system will respond to pneumococcal and meningococcal vaccinations, according to a study in Nature Communications. After vaccination, babies born naturally had higher antibody levels than those born via Caesarean section.

Experts say the findings could help to inform conversations about C-sections between expectant mothers and their doctors, and shape the design of more tailored vaccination programmes.

Researchers studied the relationship between gut microbes and antibody levels after vaccination in a cohort of 120 babies, who were vaccinated at 8 and 12 weeks against lung infections and meningitis.

The researchers tracked the development of the gut microbiome in the child’s first year of life and their immune response to the vaccines by testing saliva samples at 12 and 18 months.

Research was carried out by a team from the University of Edinburgh, Spaarne Hospital and University Medical Centre in Utrecht and the National Institute for Public Health and the Environment in The Netherlands.

In the 101 babies tested for antibodies as a result of the vaccine that protects against lung infections, the investigators found double the antibody levels in babies delivered naturally compared with those delivered by C-section.

Among naturally delivered babies, breastfeeding was linked with antibody levels 3.5 times higher than those of formula-fed children.

Levels of antibodies as a result of the vaccine that protects against meningitis were tested in 66 babies. Experts found the levels of antibodies were 1.7 times higher for naturally delivered babies, regardless of breastfeeding, compared with those delivered via C-section.

The gut microbiome is seeded at birth, developing rapidly over the first few months of life, and is influenced mostly by delivery mode, breastfeeding, and antibiotic use.

The team found a clear relationship between microbes in the gut of those babies and levels of antibodies.

For example, among a host of bacteria in the gut, high levels of two in particular – Bifidobacterium and E. coli – were associated with a high antibody response to the vaccine that protects against lung infections.

High levels of E. coli were also linked with a high antibody response to the vaccine that protects against meningitis.

The baby acquires Bifidobacterium and E. coli through natural birth, and human milk provides the sugars these bacteria need to thrive.

The team concludes that the babies’ microbiome in early life contributes to the immune system’s response to the vaccines and sets the level of protection against certain infections in childhood.

Vaccination schedules could also be adjusted based on mode of delivery or an analysis of the baby’s microbiome in the future, experts say.

Dr Emma de Koff, first author and microbiology trainee at the Amsterdam University Medical Center, said: “We expected to find a link between the gut microbiome and the babies’ vaccine responses, however we never thought to find the strongest effects in the first weeks of life.”

Professor Debby Bogaert, study lead and Chair of Paediatric Medicine at the University of Edinburgh, said: “I think it is especially interesting that we identified several beneficial microbes to be the link between mode of delivery and vaccine responses. In the future, we may be able to supplement those bacteria to children born by C-section shortly after birth through, for example, mother-to-baby ‘faecal transplants’ or the use of specifically designed probiotics.”

European Medicines Agency Moves to Minimise JAK Inhibitor Risks

Photo by Myriam Zilles on Unsplash

The European Medicines Agency’s safety committee (PRAC) has recommended measures to minimise the risk of serious side effects associated with Janus kinase (JAK) inhibitors used to treat several chronic inflammatory disorders. These side effects include cardiovascular conditions, blood clots, cancer and serious infections.

The Committee recommended that these medicines should be used in the following patients only if no suitable treatment alternatives are available: those aged 65 years or above, those at increased risk of major cardiovascular problems (such as heart attack or stroke), those who smoke or have done so for a long time in the past and those at increased risk of cancer.

The Committee also recommended using JAK inhibitors with caution in patients with risk factors for blood clots in the lungs and in deep veins (venous thromboembolism, VTE) other than those listed above. Further, the doses should be reduced in some patient groups who may be at risk of VTE, cancer or major cardiovascular problems.

The recommendations follow a review of available data, including the final results from a clinical trial of the JAK inhibitor tofacitinib and preliminary findings from an observational study involving baricitinib, another JAK inhibitor. During the review, the PRAC sought advice from an expert group of rheumatologists, dermatologists, gastroenterologists and patient representatives.

The review confirmed tofacitinib increases the risk of major cardiovascular problems, cancer, VTE, serious infections and death due to any cause when compared with TNF-alpha inhibitors. The PRAC has now concluded that these safety findings apply to all approved uses of JAK inhibitors in chronic inflammatory disorders (rheumatoid arthritis, psoriatic arthritis, juvenile idiopathic arthritis, axial spondyloarthritis, ulcerative colitis, atopic dermatitis and alopecia areata).

The product information for JAK inhibitors used to treat chronic inflammatory disorders will be updated with the new recommendations and warnings. In addition, the educational material for patients and healthcare professionals will be revised accordingly. Patients who have questions about their treatment or their risk of serious side effects should contact their doctor.

More about the medicines

The Janus kinase inhibitors subject to this review are abrocitinib, filgotinib, baricitinib, upadacitinib and tofacitinib. These medicines are used to treat several chronic inflammatory disorders (rheumatoid arthritis, psoriatic arthritis, juvenile idiopathic arthritis, axial spondyloarthritis, ulcerative colitis, atopic dermatitis and alopecia areata). The active substances in these medicines work by blocking the action of enzymes known as Janus kinases. These enzymes play an important role in the process of inflammation that occurs in these disorders. By blocking the enzymes’ action, the medicines help reduce the inflammation and other symptoms of these disorders.

Some JAK inhibitors are used to treat myeloproliferative disorders; the review did not include these medicines. The review also did not cover the use of baricitinib in the short-term treatment of COVID, which is under assessment by EMA.

Source: European Medicines Agency

Hormonal Contraceptives’ Impacts on the Adolescent Brain

Photo by Reproductive Health Supplies Coalition on Unsplash

Hormonal contraceptives are safe and highly effective at preventing pregnancy, but their impact on the developing bodies of teenage girls, especially their brains, is not well understood.

New research in young rats links the synthetic hormones found in birth control pills, patches and injections with disordered signal transmission between cells in the prefrontal cortex, which is still developing during adolescence. Compared to control rats, the animals receiving hormonal contraceptives also produced higher levels of the stress hormone corticosterone, which is similar to cortisol in humans.

The Ohio State University scientists began investigating the prefrontal cortex, where mood is regulated, because some previous research has associated early adolescent use of hormonal contraceptives with adulthood depression risk. But the most important thing, the researchers said, is learning how birth control affects the developing brain so individuals can weigh the risks and benefits of their reproductive health choices.

“Birth control has had a major positive impact for women’s health and autonomy – so it’s not that we’re suggesting adolescents should not take hormonal contraceptives,” said senior study author Benedetta Leuner, associate professor of psychology at Ohio State.

“What we need is to be informed about what synthetic hormones are doing in the brain so we can make informed decisions – and if there are any risks, then that’s something that needs to be monitored. Then if you decide to use hormonal birth control, you would pay more attention to warning signs if you knew of any possible mood-related side effects.”

The research poster was presented at Neuroscience 2022, the annual meeting of the Society for Neuroscience.

An estimated 2 in 5 teenage girls in the US have sexual intercourse between the ages of 15 and 19, and the vast majority use a contraceptive, mostly condoms. Of those using birth control, almost 5% use hormonal contraceptives, also known as long-acting reversible contraceptives. These products are also prescribed to treat acne and heavy periods.

Despite their popularity, “there isn’t a lot known about how hormonal birth control influences the teen brain and behaviour,” said co-author Kathryn Lenz, associate professor of psychology at Ohio State. “Adolescence is a crucially under-investigated period of dramatic brain change and dramatic hormonal change that we really haven’t understood.”

The researchers gave a combination of synthetic estrogen and progesterone typically found in hormonal contraceptives to female rats for three weeks beginning about a month after they were born, an age equivalent to early adolescence in humans. Researchers confirmed the drugs disrupted the animals’ reproductive cycling — these birth control products work by stopping ovaries from producing hormones at levels necessary to generate eggs and making the uterine lining inhospitable for an egg to implant.

Blood samples showed the treated rats were producing more corticosterone than untreated animals, a sign that they were stressed. And after being subjected to and recovering from an experimental stressor, the treated rats’ corticosterone level remained high. Their adrenal glands were also larger, suggesting their stress hormone production was consistently higher than that of control animals.

An analysis of gene activation markers in the animals’ prefrontal cortex showed a decrease in excitatory synapses in that region of treated rats’ brains compared to controls, but no change to inhibitory synapses — a phenomenon that could set up an imbalance of normal signaling patterns and result in altered behavior. The loss of only excitatory synapses in the prefrontal cortex has been linked to exposure to chronic stress and depression in previous research.

“What this means for the function of particular circuits, we don’t know yet. But this gives us a clue of where to look next in terms of what the functional outcomes might be,” Lenz said.

The researchers are moving forward with additional studies targeting hormonal contraceptive effects on the brain between puberty and late adolescence – a tricky time to study the developing brain because it is undergoing constant change, Leuner said. The reasons behind the drugs’ effects are an open question, as well.

“These are synthetic hormones, so are they affecting the brain because of their synthetic properties, or are they affecting the brain because they’re blocking the naturally produced hormones?” she said. “It’s a difficult question to answer, but an important one.”

Source: Ohio State University

Scientists Discover that Leprosy has an Organ Regeneration Secret

Photo by Aldo Hernandez on Unsplash

Researchers say that leprosy may hold the key to safe and effective organ regeneration, after discovering that leprosy can double the size of livers in armadillos by stimulating normal, healthy growth.

Their findings, published in the journal Cell Reports, reveal a previously unknown interaction of the leprosy bacterium with its host – in this study, the armadillo, the only animal besides humans in which the disease is known to manifest.

The researchers found that leprosy appears to rewind the developmental clock of liver cells, effectively reprogramming them to be in an ‘adolescent’ state.

Regenerative medicine aims for ‘grown to order’ organs to replace those damaged by disease or age, but organ development is an extremely complex process which takes place in vivo and so far only limited progress has been made using in vitro models. The liver, a highly resilient organ, stops regenerating once it reaches its original size, making it difficult to study regeneration pathways.

Leprosy, also referred to as Hansen disease, is a chronic granulomatous infection generally caused by Mycobacterium leprae and Mycobacterium lepromatosis, both of which primarily affect the skin and peripheral nerves. It also has the ability to convert body tissues from one type to another.

Researchers infected four cloned armadillos with the bacteria and observed the growth of their livers. The bacteria enlarged the liver, basically giving themselves more room, and did so in a way that left the livers perfectly functional and healthy.

The researchers suggest that, as with other body tissues, the bacteria-induced partial reprogramming also works in adult liver in vivo, turning hepatocytes into liver progenitor-like cells leading to proliferation and subsequent re-differentiation in the microenvironment created by the bacteria.

Prof Anura Rambukkana, from the University of Edinburgh’s Centre for Regenerative Medicine, described the discovery as “completely unexpected”.

“It is kind of mind-blowing,” Prof Rambukkana told the BBC. “How do they do that? There is no cell therapy that can do that.”

HIV Uses Immune Response as a Way to Hide

HIV infecting a T-cell. Credit: NIH

An immune response that likely evolved to help fight infections appears to be the mechanism that drives human immunodeficiency virus (HIV) into a latent state, lurking in cells only to erupt anew, according to research published in the journal Nature Microbiology. The findings help explain why HIV is particularly stealthy, but could also apply to other viral infections.

“HIV has proven to be incurable because of a small number of latently HIV-infected T-cells that are untouched by both antiviral drugs and the immune response,” said senior author Bryan R. Cullen, PhD, professor at Duke University School of Medicine.

“These cells, which are very long lived, can spontaneously emerge from latency and start producing HIV even years after infection, thus necessitating the life-long use of antiretrovirals,” Cullen said. “The origin of these latently infected cells has remained unknown despite considerable effort.”

The findings offer important insights, pointing to a protein complex called SMC5/6, which is involved in a host cell’s chromosome function and repair.

HIV enters the body, infects the immune system’s CD4+ T-cells, then makes a genome-length DNA molecule that it integrates into a host cell chromosome where it is then copied to generate viral RNAs and proteins.

If this so-called DNA provirus is prevented from integrating into the host cell DNA, for example by a drug that blocks this process, then it fails to make any viral RNAs and proteins and becomes inert. In contrast, DNA proviruses that are able to integrate are normally able to drive a productive HIV infection.

Cullen and his team found that, in a small number of infected cells, the SMC5/6 protein complex initiates a process that silences the DNA provirus before it integrates into a host cell chromosome. These proviruses remain inert even after integration and result in latent infections, lying low until prompted to erupt into an active infection.

“Our research suggests that latency results not from any intrinsic properties of the infecting HIV but rather from an unfortunate side effect of a cellular innate immune response that probably evolved to silence invasive foreign DNA,” Cullen said.

The researchers found that a molecule that shuts down SMC5/6’s silencing action showed promising results as a potential therapeutic strategy as it inhibited the establishment of latent HIV infections. Reactivated proviruses are vulnerable to natural immune system responses and anti-retroviral drugs.

“Although antiretroviral therapies can reduce the viral load in AIDS patients to below the level of detection, these drugs fail to eradicate HIV-1,” Cullen said. “While there has been considerable effort expended on trying to develop therapies that can activate latent HIV-1 and help antiretroviral therapies clear the body of infectious virus, this effort has so far failed to identify drugs that are both effective and non-toxic. Our study represents a potentially important step toward achieving this goal.”

“Clearly, understanding the mechanism that results in HIV-1 latency may provide insights into how latent HIV-1 proviruses can be reactivated and then destroyed,” Cullen said.

Source: Duke University Medical Center

Bariatric Surgery Slashes Risk of Cardiovascular Events

Obesity
Image source: Pixabay CC0

A study of adults with nonalcoholic fatty liver disease (NAFLD) and morbid obesity has shown that those who underwent bariatric surgery subsequently suffered far fewer major cardiovascular events.

Reporting their results in JAMA Network Open, the researchers found that these patients (BMI > 40) who underwent bariatric surgery had a 49% lower risk of developing adverse cardiovascular events.

“The findings provide evidence in support of bariatric surgery as an effective therapeutic tool to lower elevated risk of cardiovascular disease for select individuals with obesity and NAFLD,” said Vinod K. Rustgi, professor at Rutgers Robert Wood Johnson Medical School. “These findings are tremendously impactful for many reasons.”

NAFLD, and a more advanced form known as NASH, are rapidly increasing causes of liver disease which occur because of excessive fat storage in the liver. As such, they are common in obesity and type 2 diabetes.

In the study, researchers analysed outcomes data, using a medical insurance database, from 2007 to 2017. Of 230 million covered individuals, 86 964 adults between the ages of 18 and 64 who had obesity and NAFLD were identified. Of those, 68% were female, 35% underwent bariatric surgery and 65% received nonsurgical care.

Bariatric surgery patients experienced a 49% decrease in the risk of developing major cardiovascular events such as heart attacks, heart failure or ischemic strokes. They were also far less likely to experience angina, atherosclerotic events or arterial blood clots.

The association between bariatric surgery and risk reduction of developing cardiovascular disease has not been studied to this level of detail before, the researchers said.

There is growing evidence that bariatric surgery, because of the weight reduction it brings about in patients, offers definitive health benefits. A study conducted by Rustgi and colleagues, published in the journal Gastroenterology in March 2021, showed that bariatric surgery can also significantly reduce the risk of cancer, especially obesity-related, in obese individuals with NAFLD. Importantly, these cancers included colorectal, pancreatic, endometrial, thyroid cancer, multiple myeloma and hepatocellular carcinoma.

“Although bariatric surgery is a more aggressive approach than lifestyle modifications, it may be associated with other benefits, such as improved quality of life and decreased long-term health care burden,” Rustgi said.

Source: Rutgers University

Phthalates in Everyday Products Do Cause Uterine Fibroids

Photo by Sora Shimazaki on Pexels

For the first time, a study published in Proceedings of the National Academy of Sciences (PNAS) has shown a causal association between environmental phthalates and the increased growth of uterine fibroids, the most common tumours among women.

Manufacturers use environmental phthalates in numerous industrial and consumer products, and they’ve also been detected in medical supplies and food. Although they are known to be toxic, they are not currently banned in the US.

“These toxic pollutants are everywhere, including food packaging, hair and makeup products, and more, and their usage is not banned,” said corresponding study author Dr Serdar Bulun at Northwestern University. “These are more than simply environmental pollutants. They can cause specific harm to human tissues.”

Up to 80% of all women may develop a fibroid tumour during their lifetime, Bulun said. One-quarter of these women become symptomatic with excessive and uncontrolled uterine bleeding, anaemia, miscarriages, infertility and large abdominal tumours necessitating technically difficult surgeries.

The new study found that women with high exposure to certain phthalates, such as DEHP (used as a plasticiser to increase the durability of products such as shower curtains, car upholstery, lunchboxes, shoes and more) and its metabolites, have a high risk of developing symptomatic fibroids.

Prior epidemiological studies have consistently indicated an association between phthalate exposure and uterine fibroid growth, but this study explains the mechanism behind that link. The scientists discovered that exposure to DEHP may activate a hormonal pathway that causes an environmentally responsive receptor (AHR) to bind to DNA, resulting in increased growth of fibroid tumours.

“Interestingly, AHR was cloned in the early ’90s as the receptor for dioxin, the key toxin in Agent Orange,” Bulun said. “The use of Agent Orange during the Vietnam War caused significant reproductive abnormalities in the exposed populations, and dioxin and AHR were thought to be responsible for this.”

This new study, Bulun said, provides further evidence to support these theories.

Source: Northwestern University