Neurofeedback Could Help Alleviate ‘Chemo Brain’

Using neurofeedback to restore normal functioning in the brains of cancer patients could potentially alleviate the mental fogginess that many report after treatment, according to a new pilot study published in the Journal of Complementary and Integrative Medicine.

The UCLA study is one of the first to indicate that neurofeedback, or electroencephalogram (EEG) biofeedback, could help address cognitive deficits in cancer patients experiencing ‘chemo brain’, a constellation of symptoms that can include problems with memory, concentration and organisation, as well as other symptoms like trouble sleeping and emotional difficulties. In previous studies, neurofeedback, in which brain waves are trained to operate in optimal frequency patterns, has been shown to help improve cognitive function in patients with impairments associated with attention-deficit/hyperactivity disorder, stroke and seizures, as well as to help regulate brain activity in patients with substance use and post-traumatic stress disorders.

“The history of neurofeedback shows that it’s helpful for a whole range of disorders and symptoms. This study was an opportunity for seeing whether neurofeedback is something that could be helpful with chemo brain,” said study leader Stephen Sideroff, a UCLA professor who has used neurofeedback training with patients for over 20 years.

The study by Sideroff and UCLA colleagues David Wellisch and Valerie Yarema included nine female breast cancer patients between the ages of 21 and 65 who had completed chemotherapy at least one year earlier and complained of debilitating symptoms of chemo brain, which brought significant disruptions to their work and personal lives. A clinical nurse practitioner conducted a brief mental status interview with each patient to confirm that they had persistent difficulties with concentration, memory, organisation and confusion. The patients selected for the study did not have a current breast cancer diagnosis or a present or recent diagnosis of major depressive disorder or other mental illness, and were not using cognition-altering medications that might confound the results.

Before the neurofeedback training sessions began, the study participants received neurocognitive and psychological tests, as well as a quantitative EEG to measure brain wave frequencies that could be compared to normative data. The pre-training quantitative EEGs showed that each study participant had abnormal brain wave activity compared to healthy adult brains.

The study participants received a series of 18 neurofeedback sessions, scheduled for 30 minutes each over a six-week period. During these sessions, sensors were placed on the scalp and earlobe to monitor brain wave frequencies. Patients were shown a monitor displaying these frequencies in bar graphs, and they were told their goal was to increase or decrease the amplitude of specific frequency ranges to turn each bar green. They received audio and visual feedback when they successfully shifted these amplitudes.

Quantitative EEGs taken after completion of the 18 neurofeedback sessions found that brain wave frequencies had significantly normalised in seven of the nine study participants, and significantly improved in the other two.

Neurocognitive tests also showed substantial improvements in the study participants’ information processing, executive set shifting and sustained visual attention. All participants improved in everyday functioning and overall psychological well-being.

Study limitations include the small sample size, the lack of a control group, and the fact that some participants were unable to complete the sessions in the allotted period.

“Our results are more impressive given we were not able to have subjects stick to the schedule,” Prof Sideroff said.

Prof Sideroff said the study results were strong enough to support further research into whether neurofeedback is an effective approach for addressing chemo brain and determining the ideal protocols for conducting neurofeedback training sessions.

Source: University of California – Los Angeles

Increased Risk of Hip Fractures for Women on Vegetarian Diets

Source: Unsplash

Middle-aged women on vegetarian diets have a significantly higher risk of hip fractures than those on diets that include fish or meat, according to a long-term study published in BMC Medicine. This risk remained even after accounting for differences in nutrient intake and body mass index.

Hip fractures greatly impact quality of life and health outcomes, and carry a significant financial burden, with an estimated average of $44 000 spent in the 12 months following a hip fracture. The growing trend towards meat-free diets has prompted concern over their impact on hip fracture rates.

While increased intake of vegetable proteins has been associated with lower hip fracture risk, vegetarian diets have also been characterised by lower dietary intakes of nutrients that boost bone mineral density (BMD) and which are more abundant in animal products. Examples include total protein, calcium, vitamin D, vitamin B12, and ω-3 fatty acids, though the relationship with BMD is complex.

The researchers drew on data from the United Kingdom Women’s Cohort Study (UKWCS), and included 26 318 participants aged 35–69 who were classified as regular meat-eaters (≥ 5 servings/week), occasional meat-eaters (< 5 servings/week), pescatarians (eating fish but no meat) or vegetarians.

On average, vegetarians and pescatarians had a lower BMI (23.3 for both) than regular meat-eaters (25.2). At recruitment, regular meat-eaters had the highest prevalence of CVD, cancer, or diabetes (10.2%), and vegetarians the lowest (5.8%). A higher proportion of vegetarians reported never drinking alcohol. Regular meat-eaters reported the highest absolute dietary intakes of protein, vitamin D, and vitamin B12, whilst vegetarians reported the lowest. Calcium intakes were similar across the diet groups.

Before adjustments, compared with regular meat-eaters, vegetarians (hazard ratio 1.40) but not occasional meat-eaters (1.03) or pescatarians (1.04) had a greater hip fracture risk. Adjustment for confounders slightly attenuated these associations in the adjusted model, but the higher risk in vegetarians remained and was statistically significant: vegetarians 1.33; occasional meat-eaters 1.00; pescatarians 0.97.

However, even after adjustment for factors such as reported differences in nutrient intake and lower BMI, which is a known risk factor in hip fractures, the relative risk difference remained. This suggests that other, as yet unknown, factors related to the diets may be involved.

Study Gives Water Fluoridation a Green Thumbs Up

Teeth and toothbrush
Photo by Diana Polekhina on Unsplash

For the first time, researchers have demonstrated the low environmental footprint of water fluoridation compared to other preventive measures for tooth decay while still retaining effectiveness. The study is published in the British Dental Journal.

Water fluoridation is regarded as one of the most significant public health interventions of the twentieth century. But as the climate crisis worsens, the contribution of healthcare, including disease prevention, to the crisis must be considered.

Influenced by this urgency, Trinity College Dublin researchers collaborating with University College London quantified the environmental impact of water fluoridation for an individual five-year-old child over a one-year period and compared this to the traditional use of fluoride varnish and toothbrushing programmes, which take place in selected schools across the UK and internationally.

Over 35% of the world’s population has access to water fluoridation, with studies showing significant reductions in dental caries. Whilst data on the clinical effectiveness and cost of water fluoridation are available, until now there have been no data regarding its environmental impact.

To quantify this impact, the research team performed a Life Cycle Assessment by carefully measuring the combined travel, the weight and amounts of all products, and the processes involved in all three preventive programmes (toothbrushing, fluoride varnish programmes and water fluoridation). Data were entered into specialised environmental software and the team used the Ecoinvent database, enabling them to calculate environmental outputs, including the carbon footprint, the amount of water used for each product and the amount of land use.

The results of the study, led by Brett Duane, Associate Professor in Dental Public Health at Trinity College, concluded that water fluoridation had the lowest environmental impact in all categories studied, and had the lowest disability-adjusted life years impact when compared to all other community-level caries prevention programmes. The study also found that water fluoridation gives the greatest return on investment.

Considering the balance between clinical effectiveness, cost effectiveness and environmental sustainability, researchers believe that water fluoridation should be the preventive intervention of choice.

This research strengthens the case internationally for water fluoridation programmes to reduce dental decay, especially in the most vulnerable populations.

Assoc Prof Duane said: “As the climate crisis starts to worsen, we need to find ways of preventing disease to reduce the environmental impact of our health systems. This research clearly demonstrates the low carbon impact of water fluoridation as an effective prevention tool.”

Source: Trinity College Dublin

Returning to Sport after COVID Infection

Rugby players
Photo by Olga Guryanova

A first-of-its-kind study published in Scientific Reports has investigated how the immune systems of elite student-athletes responded to the COVID virus. Unlike older adults with comorbidities, American football players diagnosed with COVID saw their immune systems return to baseline after the CDC-recommended isolation period.

“When COVID really started moving out of control, we met with Neil Johannsen, an exercise physiologist at LSU, and the athletic trainers Derek Calvert and Jack Marucci, and we discussed what we could do to make sure our athletes remained healthy. We especially wanted to make sure that athletes were not at risk for secondary infections when they came back from isolation,” said Guillaume Spielmann, associate professor in LSU’s School of Kinesiology.

Isolation effective after COVID infection

“When the idea started for the research, we discussed why not turn something negative into a positive, and assist with the research to find some answers. If we can do things to understand the virus better, let’s do it,” said Jack Marucci, LSU’s Director of Athletic Training. “The student-athletes were willing to be a part of it.”

During that time at the start of the COVID pandemic, the CDC had recommended 14 days of isolation.

“There was a lot unknown during this time. We are looking at a population that are extremely close to each other during plays and during games. We wanted to make sure that since they are literally face-to-face with other players, that their salivary defences, their oral defences were pretty much intact and that that part of their immune system was not affected by the disease; that there were no long-lasting effects of the disease,” Assoc Prof Spielmann said.

Saliva samples were collected from 29 student-athletes in 2020, before a COVID vaccine was available. Fourteen were COVID positive and 15 had no history of infection. Of the 14, only six reported mild symptoms from the virus; the other eight were asymptomatic throughout the isolation period.

“Salivary immunity is extremely important to ensure that people don’t contract secondary infections, so when athletes are coming back we need to make sure they are as healthy as can be. We found that the isolation period was sufficient to restore the athletes’ salivary immunity to the level seen in non-infected players,” Assoc Prof Spielmann said.

Safely return to play after COVID

These findings suggested the student-athletes could safely return to practice and play football without a risk of secondary infection; that their immune system wasn’t at risk when playing the close contact sport.

“I was worried a bit about long-haulers and other more significant outcomes like the concerns for the development of myocarditis. Engaging in athletic activities at an elite level can be stressful on the body and you would want to arm yourselves with the best scientific information to help understand potential outcomes. This data helped to validate some of these decisions that were made. Providing a safe environment for your student-athletes is paramount and this helped that process along,” said Shelly Mullenix, LSU’s Senior Associate Athletics Director for Health & Wellness.

Three graduate students also participated in the research.

“This kind of access is unique in Division I sports. You typically don’t have access to football players, so the fact that we have access is hugely instrumental as well,” Assoc Prof Spielmann said. “LSU is a great place for this field.”

“I think this COVID research is something that we are really proud to be a part of and contribute to finding answers to such a devastating virus,” Marucci said.

Assoc Prof Spielmann, an immunologist, researches the impact of stress on the immune system of elite and tactical athletes, including astronauts and fire fighters. But this study isn’t the first for Spielmann and LSU Athletics. They have worked together to study psychological and physiological health, along with performance measures in other student-athletes and sports teams. A new study will take a closer look at female athletes’ mental, physiological and immune resilience to stress.

Source: Louisiana State University

A New Guideline for Pollen Food Syndrome

Photo by Daria Shevtsova on Pexels

Pollen Food Syndrome (PFS) – also known as oral allergy syndrome or pollen food allergy syndrome – causes affected individuals to experience an allergic reaction when consuming raw plant foods, and triggers can vary depending on an individual’s pollen sensitisation, which in turn is affected by geographical location. A guideline, published in Clinical & Experimental Allergy, has been developed for the diagnosis and management of PFS.

The guideline was drafted by the British Society of Allergy & Clinical Immunology Standards of Care Committee. A correct diagnosis of PFS helps avoid misdiagnosis as a primary peanut or tree nut allergy, or confusion with another plant food allergy to non-specific lipid transfer proteins. The characteristic foods involved, and rapid-onset oropharyngeal symptoms, mean PFS can often be diagnosed from the clinical history alone. Management focuses on avoiding known trigger foods, which may appear simple, but can be difficult when coupled with a pre-existing food allergy, or for individuals following a vegetarian/vegan diet.

“More studies on the effect of PFS on health-related quality of life are needed to dispel the myth that because it usually manifests with mild symptoms, PFS is easily managed, and does not adversely affect the individual,” the authors wrote. “The number of foods and concern about new food triggers means dietary restrictions are often overly strict, so more research on novel treatments of PFS, including food immunotherapy, needs to be undertaken.”

Source: Wiley

A Simple Low-cost Method to Identify Aortic Valve Stenosis

Source: CC0

In a study published in the Journal of Applied Physics, researchers describe a method to identify aortic valve stenosis using complex network analysis that is accurate, simple to use, and low-cost.

Aortic valve stenosis occurs when the aortic valve narrows, constricting blood flow from the heart through the artery and to the entire body. In severe cases, it can lead to heart failure. Identifying the condition can be difficult in remote areas because it requires sophisticated technology, and diagnoses at early stages are challenging to obtain.

“Many rural health centres don’t have the necessary technology for analysing diseases like this,” said author M.S. Swapna, of the University of Nova Gorica and the University of Kerala. “For our technique, we just need a stethoscope and a computer.”

The diagnostic tool works based on the sounds produced by the heart. The organ creates a “lub” noise as it closes the mitral and tricuspid valves, pauses as ventricular relaxation occurs and the blood fills in, then makes a second noise, “dub,” as the aortic and pulmonary valves close.

Swapna and her team used heart sound data, collected over 10 minutes, to form a graph. The signal was split into sections, with each section represented by a node on the graph. If the sound in one portion of the data was similar to that in another section, a line was drawn between the two nodes.

In a healthy heart, the graph showed two distinct clusters of points, with many nodes unconnected. In contrast, a heart with aortic stenosis contained many more correlations and edges.
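The construction described above can be sketched in a few lines of code. This is a minimal illustration rather than the authors' implementation: the segment count, the use of Pearson correlation as the similarity measure, and the 0.6 edge threshold are all assumptions made for the example.

```python
import numpy as np

def build_similarity_graph(signal, n_segments=20, threshold=0.6):
    """Split a 1-D sound signal into equal segments (the nodes) and
    connect two nodes whose segments are strongly correlated."""
    segments = np.array_split(np.asarray(signal, dtype=float), n_segments)
    length = min(len(s) for s in segments)
    segments = [s[:length] for s in segments]  # equalise segment lengths
    edges = []
    for i in range(n_segments):
        for j in range(i + 1, n_segments):
            # Pearson correlation serves as the similarity measure here
            r = np.corrcoef(segments[i], segments[j])[0, 1]
            if abs(r) >= threshold:
                edges.append((i, j))
    return edges

# A periodic, "heartbeat-like" signal yields a densely connected graph,
# while uncorrelated noise leaves the nodes largely unconnected.
t = np.linspace(0, 20, 4000)
periodic_edges = build_similarity_graph(np.sin(2 * np.pi * 2 * t))
noise_edges = build_similarity_graph(np.random.default_rng(0).standard_normal(4000))
print(len(periodic_edges), len(noise_edges))
```

In the study, features of the resulting graphs were then passed to a machine-learning classifier; the cluster structure the researchers describe (two distinct clusters for a healthy heart, many extra edges in stenosis) can be read directly off the edge list.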

“In the case of aortic stenosis, there is no separation between the ‘lub’ and ‘dub’ sound signals,” explained Swapna.

The researchers used machine learning to examine the graphs and identify those with and without disease, achieving a classification accuracy of 100%. Their method takes the correlation of each point into consideration, making it more accurate than others that consider only the strength of the signal, and it does so in less than 10 minutes. As such, it could be useful for early-stage diagnoses.

So far, the method has only been tested with data, not in a clinical setting. The authors are developing a mobile application that could be accessed worldwide. Their technique could also be used to diagnose other conditions.

“The proposed method can be extended to any type of heart sound signals, lung sound signals, or cough sound signals,” said Swapna.

Source: American Institute of Physics

Scientists Test A Potential New Therapy for Preeclampsia

Pregnant with ultrasound image
Source: Pixabay

Researchers have proposed a new therapy for preeclampsia that corrects the defects identified in placental cells, and restores placental and foetal weight, which they report in the journal Redox Biology. The treatment, tested in two rodent models, successfully lowers blood pressure in the mother and resolves the characteristic preeclampsia symptoms of proteinuria and cardiovascular abnormalities.

Preeclampsia is a placental dysfunction that affects approximately 2 to 8% of pregnant women worldwide. It can have potentially serious complications for mother and child, and longer-term consequences for the mother. Preeclampsia symptoms are primarily arterial hypertension, proteinuria, abnormal coagulation in the placenta, cardiovascular abnormalities in the mother and foetal growth restriction. Treatments for preeclampsia are limited and mostly involve aspirin as a preventative measure, reducing the procoagulant state in the placenta and partly relieving pressure on the vascular network.

Preeclampsia is characterised by a defective placenta caused by trophoblast dysfunction. Trophoblasts are placental cells that help organise and manage the vascular network which provides the essential resources for foetal growth. At the molecular level, preeclampsia is characterised by an uncontrolled increase in oxidative stress, with excessive production of various reactive species including reactive oxygen and nitrogen species. There is a genetic component: the first gene to be identified as being implicated in the genetic forms of preeclampsia was the STOX1 transcription factor, which controls the expression of thousands of genes, especially those involved in the production of nitric oxide (NO).

In a transgenic mouse model, high accumulation of STOX1 in the placenta induced a preeclampsia-like syndrome. In preeclampsia, NO, a powerful vasodilator that promotes blood flow to the placenta, is mobilised to produce potentially toxic molecules (nitrosative stress) and its levels become insufficient in the placental vascular network, affecting trophoblast function and the vascular network and destabilising other reactive species. This creates a vicious circle and causes uncontrollable oxidative/nitrosative stress with multiple complications, also affecting maternal blood vessel cells, with potentially fatal consequences.

NO is produced by a family of enzymes known as nitric oxide synthases (NOSs). Finding a way of restoring NO production in the placenta via NOSs could represent an effective new therapy to treat preeclampsia. A years-long collaboration gave rise to a potential solution. The scientists’ research was based on trophoblasts overexpressing STOX1 and on two rodent models of preeclampsia, one mimicking early-onset forms via placental overexpression of STOX1 and the other mimicking late-onset forms by partial occlusion of the lower abdominal aorta.

The research revealed a cascade of events that ultimately led the scientists to propose a new therapy. Treating trophoblasts with BH4 (tetrahydrobiopterin, a cofactor that stabilises the NOS enzyme producing NO) corrected the defects identified in these cells, restoring production of NO rather than potentially toxic molecules. More importantly, administering BH4 to the two preclinical rodent models restored placental and foetal weight. Finally, in the early-onset STOX1 preclinical model with significant arterial hypertension and proteinuria, the BH4 treatment corrected blood pressure, excess protein in urine, and cardiovascular abnormalities in the mother. The results even suggest that the treatment may be effective in addressing the long-term effects of preeclampsia on mothers (vascular abnormalities in the brain, kidneys, heart and liver).

This research is the first step towards the development of a therapy for preeclampsia. Genetic analyses of placentas treated with BH4 showed that it corrects the expression of several genes disrupted by excess STOX1, and does so differently from the changes induced by aspirin in the placenta. The scientists therefore propose that a treatment combining BH4 and aspirin could be the ultimate therapeutic solution for many cases of preeclampsia. This hypothesis needs to be validated in clinical trials.

Source: Institut Pasteur

Array of Autoimmune Disorders Linked to Cardiovascular Disease

Source: Wikimedia Commons CC0

A new epidemiological study published in The Lancet shows that patients with autoimmune disease have a substantially higher risk (between 1.4 and 3.6 times depending on which autoimmune condition) of developing cardiovascular disease (CVD) than people without an autoimmune disorder. This excess risk is comparable to that of type 2 diabetes, a well-known risk factor for cardiovascular disease.

Although earlier research has suggested associations between various autoimmune disorders and a higher risk of cardiovascular disease, these studies were often too small, and limited to selected autoimmune or cardiovascular conditions, to provide conclusive evidence on the necessity of CVD prevention among patients with autoimmune disease.

At the annual congress of the European Society of Cardiology, researchers presented the outcome of a thorough epidemiological investigation into possible links between 19 of the most common autoimmune disorders and CVD. The research shows for the first time that cardiovascular risks affect autoimmune disease as a group of disorders, rather than selected disorders individually.

The whole cardiovascular disease spectrum

In the study, the authors show that the group of 19 autoimmune disorders they studied accounts for about 6% of cardiovascular events. Importantly, excess cardiovascular risk was visible across the whole cardiovascular disease spectrum, beyond classical coronary heart disease, including infection-related heart disorders, heart inflammation, and thromboembolic and degenerative heart disorders, suggesting the implications of autoimmunity for cardiovascular health are likely to be much broader than originally thought. Furthermore, the excess risk was not explained by traditional cardiovascular risk factors such as age, sex or smoking. Another noteworthy finding was that the excess risk is particularly high among patients with autoimmune disorders under 55 years, suggesting that autoimmune disease is particularly important in causing premature cardiovascular disease, with the potential to result in a disproportionate loss of life years and disability.

The study was based on UK electronic health records, with data from about one-fifth of the current UK population. The researchers assembled a cohort of patients newly diagnosed with any of the nineteen autoimmune disorders. They then looked at the incidence of twelve cardiovascular outcomes in the following years – an unprecedented granularity made possible by the very large size of the dataset – and compared it to a matched control group. The risk of developing CVD for patients with one or more autoimmune disorders was on average 1.56 times higher than in those without autoimmune disease. The excess risk also rose with the number of different autoimmune disorders in individual patients. Among the disorders with the highest excess risk were systemic sclerosis, Addison’s disease, lupus and type I diabetes.

Need for targeted prevention measures

The results show that action is needed, said Nathalie Conrad, lead author of the study. “We see that the excess risk is comparable to that of type 2 diabetes. But although we have specific measures targeted at diabetes patients to lower their risk of developing cardiovascular disease (in terms of prevention and follow-up), we don’t have any similar measures for patients with autoimmune disorders.” Conrad also noted that the European Society of Cardiology guidelines on the prevention of cardiovascular diseases do not yet mention autoimmunity as a cardiovascular risk factor, only specific disorders such as lupus, nor do they list any specific prevention measures for patients with autoimmune disease.

Conrad hopes the study will raise awareness among patients with autoimmune disease and clinicians involved in the care of these patients, which will include many different specialties such as cardiologists, rheumatologists, or general practitioners. “We need to develop targeted prevention measures for these patients. And we need to do further research that helps us understand why patients with an autoimmune disorder develop more cardiovascular diseases than others, and how we can prevent this from happening.”

The underlying mechanisms are still poorly understood. Conrad said: “The general hypothesis is that chronic and systemic inflammation, which is a common denominator in autoimmune disorders, can trigger all sorts of cardiovascular disease. Effects of autoimmune disease on connective tissues, small vessels, and cardiomyocytes, and possibly some of the treatments commonly used to treat autoimmunity are also likely to contribute to patients’ cardiovascular risk. This really needs to be investigated thoroughly.”

Source: KU Leuven

Early Diabetes, Hypertension Accelerate Development of Glaucoma

Credit: National Eye Institute

Developing Type 2 diabetes or hypertension earlier in life is linked to the earlier development of the most common form of glaucoma, according to a study published in Clinical Ophthalmology. These findings could lead to better screening protocols for primary open-angle glaucoma (POAG), the leading cause of irreversible blindness worldwide, which makes up nearly 90% of all cases of glaucoma.

“Currently, we lack the tools to cure glaucoma, but with enough advance notice, we can preserve patients’ vision. Early detection of glaucoma is the key to better control of intraocular pressure and preventing blindness,” said study leader Karanjit Kooner, MD, PhD, MBA, Associate Professor of Ophthalmology at UTSW.

Tens of millions of people have POAG around the globe. Because this disease has few symptoms in its earliest stages, Dr Kooner explained, patients are frequently diagnosed in its later stages when vision has already been permanently damaged. Although researchers have identified several risk factors for POAG – including Type 2 diabetes, hypertension, migraines, and obstructive sleep apnoea – how they might influence the onset of POAG is not well understood.

To answer this question, Dr Kooner and his colleagues collected data from the medical records of 389 of his patients with POAG. The researchers found no link between migraines and/or obstructive sleep apnoea and the age of POAG onset. However, the researchers found that the age of Type 2 diabetes and/or hypertension diagnosis was significantly linked with the onset of POAG – the earlier patients presented with either or both of these conditions, the earlier they tended to develop POAG.

Dr Kooner noted that both Type 2 diabetes and hypertension are diseases that affect blood vessels of both the optic nerve and retina, thus potentially causing changes that predispose patients to POAG, another condition with a vascular root. If these connections hold up in future research, he said, Type 2 diabetes and hypertension could be added to the list of factors that can trigger POAG screening — including a family history of POAG, elevated intraocular pressure, and Black race — and lead to earlier diagnosis of POAG, preserving patients’ vision and quality of life.

Source: UT Southwestern Medical Center

Toss Out Hospital Sinks Colonised by MDRO, Evidence Suggests

Methicillin-resistant Staphylococcus aureus (MRSA) bacteria. Credit: CDC

An outbreak of a pandrug-resistant nosocomial pathogen was interrupted by not using hospital sinks during COVID, according to Basma Mnif, Professor of Microbiology at Habib Bourguiba University Hospital of Sfax, Tunisia. In her presentation at the 14th SAFHE Southern African Healthcare Conference, she said that infection control methods to eradicate the pathogen failed and that other research indicated it was necessary to replace the sinks entirely.

Multidrug-resistant organisms (MDRO) are a growing threat in hospitals, especially to critically ill patients.

From 2017 to 2021, 90 critically ill ICU patients in a Tunisian hospital were infected with pandrug-resistant Proteus mirabilis strains. This is the first known long-term outbreak of pandrug-resistant P. mirabilis strains.

P. mirabilis is an uncommon nosocomial pathogen causing opportunistic infections. P. mirabilis survives well in the natural environment and is increasingly implicated in nosocomial outbreaks worldwide.

The all-cause mortality rate among infected patients was 47%, with patients ranging in age from 16 to 78 years. The average length of stay before infection was 23.56 days.

An outbreak was recognised in April 2017, and infection control measures were taken to contain it. The outbreak was suppressed but recurred in July and December. Analysis revealed overlapping ICU stays of infected patients, suggesting horizontal, intra-ICU transmission. Laboratory analysis of phenotypes revealed two clones, A and B, both carrying drug resistance genes; a third clone, Clone C, appeared in 2018 and proved resistant to all known antibiotics.

During the COVID pandemic in 2020, hospital sinks were not used and enhanced infection prevention interventions were deployed. This period coincided with a complete absence of P. mirabilis infections. The outbreak resumed in 2021, with the same three clones causing infections in patients.

“The outbreak intermission during COVID could be related to the enhanced protection measures implemented during this period,” Prof Mnif noted, “but we think that the sinks are in fact the reservoirs of these MDRO, and must in fact be removed and replaced, and the chemical disinfection that we had performed was not sufficient to control the outbreak.”

The outbreak highlighted the need for proper infection control protocols. Hospital wastewater is a major source of outbreaks, Prof Mnif pointed out. A study found that “over the past 20 years, there have been 32 reports of carbapenem-resistant organisms in the hospital water environment.”

She said when it came to replacing the sinks, hospitals should “respect FGI guidelines, especially in having sufficient depths of the sink, deep enough to prevent splashing.” Having sufficient pressure and splash reduction measures such as splash guards are also important, Prof Mnif added.

Although there are CDC guidelines to help prevent colonisation, there is no clear eradication strategy for when a sink is colonised. There is likely genetic interchange between organisms in biofilms, something which needs to be investigated further, along with possible means of eradication.