An estimated quarter of adults and two-thirds of children have strong fears around needles, according to the US Centers for Disease Control and Prevention. Yet public health depends on people’s willingness to receive vaccines, typically administered by injection.
Darcy Dunn-Lawless, a doctoral student at the University of Oxford’s Institute of Biomedical Engineering, is investigating the potential of painless, needle-free vaccine delivery by ultrasound.
He will share the recent advancements in this promising technique as part of Acoustics 2023 Sydney, running Dec. 4-8 at the International Convention Centre Sydney.
“Our method relies on an acoustic effect called ‘cavitation,’ which is the formation and popping of bubbles in response to a sound wave,” said Dunn-Lawless.
“We aim to harness the concentrated bursts of mechanical energy produced by these bubble collapses in three main ways. First, to clear passages through the outer layer of dead skin cells and allow vaccine molecules to pass through. Second, to act as a pump that drives the drug molecules into these passages. Lastly, to open up the membranes surrounding the cells themselves, since some types of vaccine must get inside a cell to function.”
Though initial in vivo tests showed that the cavitation approach delivered 700 times fewer vaccine molecules than conventional injection, it produced a higher immune response.
The researchers theorize this could be due to the immune-rich skin targeted by ultrasonic delivery, in contrast to the muscle that receives a conventional jab.
The result is a more efficient vaccine that could help reduce costs and increase efficacy with little risk of side effects.
“In my opinion, the main potential side effect is universal to all physical techniques in medicine: If you apply too much energy to the body, you can damage tissue,” Dunn-Lawless said.
“Exposure to excessive cavitation can cause mechanical damage to cells and structures. However, there is good evidence that such damage can be avoided by limiting exposure, so a key part of my research is to try and fully identify where this safety threshold lies for vaccine delivery.”
Dunn-Lawless works as part of a larger team under the supervision of Dr Mike Gray, Professor Bob Carlisle, and Professor Constantin Coussios within Oxford’s Biomedical Ultrasonics, Biotherapy and Biopharmaceuticals Laboratory (BUBBL). Their cavitation approach may be particularly conducive to DNA vaccines, which are currently difficult to deliver. With cavitation able to help crack open the membranes blocking therapeutic access to the cell nucleus, the other advantages of DNA vaccines – such as a focused immune response, low infection risk, and shelf stability – can be better utilised.
Air filtration systems do not reduce the risk of picking up viral infections, according to new research from the University of East Anglia. A new study published in Preventive Medicine reveals that technologies designed to make social interactions safer in indoor spaces are not effective in the real world. The team studied technologies including air filtration, germicidal lights and ionisers.
They looked at all the available evidence but found little to support hopes that these technologies can make air safe from respiratory or gastrointestinal infections.
Prof Paul Hunter said: “Air cleaners are designed to filter pollutants or contaminants out of the air that passes through them.
“When the Covid pandemic hit, many large companies and governments – including the NHS, the British military, and New York City and regional German governments – investigated installing this type of technology in a bid to reduce airborne virus particles in buildings and small spaces.
“But air treatment technologies can be expensive. So it’s reasonable to weigh up the benefits against costs, and to understand the current capabilities of such technologies.”
The research team studied evidence about whether air cleaning technologies make people safe from catching airborne respiratory or gastrointestinal infections. They analysed evidence from 32 studies of microbial infections or symptoms in people who were or were not exposed to air treatment technologies, all conducted in real-world settings such as schools or care homes. So far, none of the studies of air treatment begun during the Covid era have been published.
Lead researcher Dr Julii Brainard said: “The kinds of technologies that we considered included filtration, germicidal lights, ionisers and any other way of safely removing viruses or deactivating them in breathable air.
“In short, we found no strong evidence that air treatment technologies are likely to protect people in real world settings.
“There is a lot of existing evidence that environmental and surface contamination can be reduced by several air treatment strategies, especially germicidal lights and high-efficiency particulate air (HEPA) filtration. But the combined evidence was that these technologies don’t stop or reduce illness.
“There was some weak evidence that the air treatment methods reduced likelihood of infection, but this evidence seems biased and imbalanced. We strongly suspect that there were some relevant studies with very minor or no effect but these were never published.
“Our findings are disappointing – but it is vital that public health decision makers have a full picture. Hopefully those studies that have been done during Covid will be published soon and we can make a more informed judgement about what the value of air treatment may have been during the pandemic.”
Cancer treatment is growing more complex, but so too are the possibilities. After all, the better a tumour’s biology and genetic features are understood, the more treatment approaches there are. To be able to offer patients personalised therapies tailored to their disease, laborious and time-consuming analysis and interpretation of various data is required. In one of many artificial intelligence (AI) projects at Charité – Universitätsmedizin Berlin and Humboldt-Universität zu Berlin, researchers studied whether generative AI tools such as ChatGPT can help with this step.
The crucial factor in tumour growth is an imbalance between growth-promoting and growth-inhibiting factors, which can result, for example, from changes in oncogenes.
Precision oncology, a specialised field of personalised medicine, leverages this knowledge by using specific treatments such as low-molecular weight inhibitors and antibodies to target and disable hyperactive oncogenes.
The first step in identifying which genetic mutations are potential targets for treatment is to analyse the genetic makeup of the tumour tissue. This determines the molecular variants of the tumour DNA that are needed for precision diagnosis and treatment. Doctors then use this information to craft individual treatment recommendations. In especially complex cases, this requires knowledge from various fields of medicine.
At Charité, this is when the “molecular tumour board” (MTB) meets: Experts from the fields of pathology, molecular pathology, oncology, human genetics, and bioinformatics work together to analyse which treatments seem most promising based on the latest studies.
It is a very involved process, ultimately culminating in a personalised treatment recommendation.
Can artificial intelligence help with treatment decisions?
Dr Damian Rieke, a doctor at Charité, and his colleagues wondered whether AI might be able to help at this juncture.
In a study just recently published in the journal JAMA Network Open, they worked with other researchers to examine the possibilities and limitations of large language models such as ChatGPT in automatically scanning scientific literature with an eye to selecting personalised treatments.
AI ‘not even close’
“We prompted the models to identify personalised treatment options for fictitious cancer patients and then compared the results with the recommendations made by experts,” Rieke explains.
His conclusion: “AI models were able to identify personalised treatment options in principle – but they weren’t even close to the abilities of human experts.”
The team created ten molecular tumour profiles of fictitious patients for the experiment.
A human physician specialist and four large language models were then tasked with identifying a personalised treatment option.
These results were presented to the members of the MTB for assessment, without them knowing which recommendation came from which source.
Improved AI models hold promise for future uses
Dr Manuela Benary, a bioinformatics specialist, reported: “There were some surprisingly good treatment options identified by AI in isolated cases, but large language models perform much worse than human experts.”
Beyond that, data protection, privacy, and reproducibility pose particular challenges in relation to the use of artificial intelligence with real-world patients, she notes.
Still, Rieke is fundamentally optimistic about the potential uses of AI in medicine: “In the study, we also showed that the performance of AI models is continuing to improve as the models advance. This could mean that AI can provide more support for even complex diagnostic and treatment processes in the future – as long as humans are the ones to check the results generated by AI and have the final say about treatment.”
With frequent and long stints at their computers, the average gamer is a sedentary night owl, often compromising on sleep – especially quality sleep – and being exposed to too much blue light. The topic has been explored in University of Cape Town (UCT) PhD candidate Chadley Kemp’s doctoral thesis, a meaty study of over 70 000 words.
Kemp’s research into habitual gaming activities is supervised by Associate Professor Dale Rae, a sleep researcher and senior lecturer at the Health Through Physical Activity, Lifestyle and Sport Research Centre (HPALS) in the Faculty of Health Sciences.
This work is founded on Kemp’s 2018 research underpinning a master’s in medical science at UCT’s former Department of Exercise Science and Sports Medicine in the Sports Science Institute of South Africa. This was upgraded to a PhD in 2020.
His research (he is an esports and video game enthusiast) explores adult esports players’ sleep, health status, light exposure patterns and physical activity.
“We know that sleep affects mental functioning in general, but we weren’t sure about the extent to which this applied to esports players,” said Kemp.
Framework for healthier gameplay
Kemp’s goal is to produce objective data that will guide the development of a framework aimed at promoting healthier gameplay standards and encouraging policy reform within the esports industry.
The tests they used to assess neurocognitive performance were intended to serve as proxies for certain aspects of esports performance because they tested specific mental skills important to gaming, he added.
“We gathered it would be a useful addition to compel gamers to adopt better sleep and lifestyle behaviour changes if it meant … that their health would improve, and they would benefit from better in-game performance – and get an edge over their competitors!”
Kemp’s focus is not on professional gamers, but on what he calls “the missing middle” of the esports community: amateur and semi-competitive gamers.
“This group doesn’t have the same infrastructure and support as their professional counterparts,” he explained. “But what makes them particularly interesting is the fact that they have to balance their gaming commitments with holding down a job, studies, or juggling family or household commitments.”
Esports are burgeoning across the globe – and not only among competitive gamers but among audiences too. Writing in the South African Journal of Sports Medicine, Kemp and his co-authors noted that competitive gaming alone attracts 532 million fans globally, according to statistics released in 2022.
However, his study wasn’t motivated by an influx of gamers presenting with sleep difficulties at Associate Professor Rae’s sleep consultancy, Sleep Science. Rather, it stemmed from a broader observation and concern within the local esports community about gamers’ poor-quality and short-duration sleep, high levels of sedentarism, and excessive exposure to artificial or electronic light at night.
Based on these conversations, and supported by anecdotal evidence from within the esports industry, Kemp said he and Rae determined that sleep curtailment had seemingly become a “rite of passage” among gamers. Most gaming takes place at night because of gamers’ daytime commitments.
There was a significant knowledge gap to fill: the literature on the topic was sparse, much of it focused on the implications of gaming in children and adolescents, and most studies were survey-based and did not target esports players or those regularly engaged in gaming. Kemp is particularly interested in adult esports players as a demographic because of the greater health risks posed by age and unhealthy lifestyle factors, such as smoking and alcohol consumption.
Because he needed a tool to measure sleep and physical activity concurrently, he validated the Actiwatch, a research-grade device, for this purpose. The device also measures light exposure. For his sample group, Kemp recruited eligible esports players and measured variables of interest: clinical measures (anthropometry, blood pressure, blood markers), self-report data (questionnaires on sleep, chronotype, daytime sleepiness and gaming addiction), and cognitive performance.
“We also included non-gamers in our study, so we could compare our gamers against people who were not gamers. In total, we had 59 male participants (31 gamers; 28 non-gamers). (The females volunteering to participate did not meet the study’s inclusion criteria.) For a week, these individuals wore the Actiwatch to track their sleep, physical activity, and light exposure.”
The key findings of his research make for interesting reading:
Esports players have comparable sleep duration to non-gamers (the control group) but tend to go to sleep later. They hit the middle of their sleep cycle around 04:08, compared to 03:01 for the control group.
A much larger percentage of esports players (45.2%) showed night-oriented habits (evening chronotypes), i.e. they are more active and alert at night. This is in contrast to only 7.1% of the control group showing similar evening tendencies.
They nap more during the day, but their night sleep duration is similar to that of the control group.
There was no significant difference in risks related to heart diseases or metabolic diseases between the two groups, which Kemp speculates might be related to their young age. But most of the health markers were slightly raised, which could point to worse cardiometabolic health in future.
Esports players smoke more.
Esports players performed better in brain-based tasks, showing better attention and accuracy, and making fewer mistakes.
Esports players are less active than the control group. They sit more (11.2 vs 9.1 hours a day) and are less physically active, whether it’s moderate- or vigorous-intensity activity.
Esports players have specific active and inactive hours. They are less active in the early morning and certain evening hours but are more active around midnight.
Esports players are exposed to dimmer light for a more significant part of their day, and their exposure to bright light happens later at night.
This work is important for several reasons, said Kemp. A key takeaway from the research revolves around chronotypes.
“Esports players seem to have sleep patterns that align with being night owls and this may be influenced both by their natural tendencies and their gaming habits. It’s also possible that a genetic disposition and exposure to artificial light from screens collectively contributes to these sleep patterns.
“The combined effect is thought to create a cycle where their preference for evening activities leads to more gaming, which in turn reinforces the night owl tendencies. This impacts on their sleep quality and quantity.”
He added: “Perhaps more obviously, gaming is a massively popular phenomenon that transcends age, sex, and geography. It’s a dominant form of entertainment and its competitive arm, esports, is progressing towards acceptance as a genuine form of sporting competition.”
From the neurocognitive side, it’s clear that gaming can sharpen several cognitive abilities, such as attention and problem-solving.
“However, the catch is, if you’re not getting enough sleep, these enhanced skills could take a hit,” said Kemp. “Gamers might see slower reactions, flawed decision-making, and even a drop in their in-game stamina. So, while gaming certainly has its merits and can even boost certain mental skills, it doesn’t come without health considerations.”
Kemp’s research is aimed at ensuring that anyone engaged with gaming or esports does so in a healthy way.
“The purpose is to create a stepping stone towards health regulation in gaming and esports,” he said. “By creating awareness and providing evidence-based recommendations to prevent chronic health problems caused by unhealthy gaming behaviour, it supports decision making by individuals, governments, and policy makers. It’s valuable to anyone involved in or impacted by gaming.”
Kemp’s guidelines for gamers:
Get between seven and nine hours’ sleep a night and keep a regular sleep schedule (on weekends too).
Set fixed waking and sleep times to establish a more robust sleep–wake cycle.
For better sleep, ensure your bedroom is dark, quiet, and cool (16-18°C is optimal).
Limit the amount of light exposure in the hours before bedtime (including light from phones, laptops, TVs, etc).
Limit caffeine to the morning and afternoon. This means no energy drinks during those night-time gaming sessions.
A researcher at the University of Kentucky has helped solve a 60-year-old mystery about one of the body’s most vital organs: the heart. Specifically, its tiniest structures: the complicated bundles of filament molecules inside its cells.
Kenneth S. Campbell, PhD, the director of translational research in the Division of Cardiovascular Medicine in the UK College of Medicine, helped map out an important part of the heart on a molecular level. The study was published online in the journal Nature.
Each cardiac cell contains thousands of smaller structures, called sarcomeres – the building blocks of muscle. Within each block are hundreds of myosin filaments. To put this microscopic level into perspective, if the heart is a continent, Campbell and fellow researchers are looking at single strands of hair.
“Each filament has roughly 2000 molecules arranged in a really complicated structure that scientists have been trying to understand for decades,” said Campbell. “We knew quite a lot about the individual molecules and people thought the myosins could be arranged in groups of six that were called crowns, but not much beyond that.”
Campbell explained the most interesting discovery in the paper is that there are three different types of crowns. The interactions between them are shown in the second photo below.
“We think this means that heart muscle can be controlled more precisely than we had realised. We were also excited to see how myosin binding protein-C, another protein that is linked to genetic heart disease, sits within the structure. It gives us a new level of information about how the molecules are arranged in the heart,” said Campbell.
Working with researchers at the University of Massachusetts Chan Medical School, the group produced single-particle 3D reconstructions of the cardiac thick filaments. The pictures provide a new framework for interpreting structural, physiological and clinical observations.
“We’re interested in therapies for different kinds of heart failure and myopathies, where the heart muscles don’t work very well,” said Campbell. “Our research is one of many projects underway at the university to help come up with better therapies for heart disease.”
The research team collected heart samples from the Gill Cardiovascular Biorepository, of which Campbell is the director. Samples are donated for research purposes from patients who receive cardiovascular care at UK.
“We started the Gill Cardiovascular Biorepository in 2008. With the help of a surgeon at UK HealthCare, we started collecting samples of myocardium from organ donors and from patients who were getting cardiac transplants,” said Campbell. “Now we’ve built a huge resource with roughly 15 000 samples from nearly 500 people.”
Snoozing, or using intermittent alarms to get in a few more minutes of sleep in the morning, may have benefits for some people, according to research published in the Journal of Sleep Research.
In a study of 1732 adults who described their waking habits, 69% of participants reported using the snooze function or setting multiple alarms at least “sometimes.” In those who snoozed, the average time spent snoozing per morning was 22 minutes, ranging from 1 to 180 minutes. Snoozers tended to be younger than non-snoozers and were more likely to be evening types. Morning drowsiness and shorter sleep were also more common in those who snoozed.
In a second study of 31 habitual snoozers, 30 minutes of snoozing improved or did not affect performance on cognitive tests directly upon rising compared with waking up abruptly. Snoozing resulted in about 6 minutes of lost sleep, but it prevented awakening from slow-wave sleep. There were no clear effects of snoozing on stress hormone levels, morning sleepiness, mood, or overnight sleep structure.
“The findings indicate that there is no reason to stop snoozing in the morning if you enjoy it, at least not for snooze times around 30 minutes. In fact, it may even help those with morning drowsiness to be slightly more awake once they get up,” said corresponding author Tina Sundelin, PhD, of Stockholm University.
University of Bristol researchers have created a robotic hand that could carry out Clinical Breast Examinations (CBE). The device is able to apply very specific forces over a range similar to forces used by human examiners and can detect lumps using sensor technology at larger depths than before.
This could revolutionise how women monitor their breast health by giving them access to safe electronic CBEs, located in easily accessible places, such as pharmacies and health centres, which provide accurate results. The technology is described in the journal Sensors.
Precision, repeatability and accuracy are of paramount importance in these tactile medical examinations to ensure favourable patient outcomes. A range of automatic and semi-automatic devices have been proposed to aid with optimising this task, particularly for difficult-to-detect and hard-to-reach situations such as during minimally invasive surgery.
The research team included a mix of postgraduate and undergraduate researchers, supervised by Dr Antonia Tzemanaki from Bristol Robotics Laboratory. Lead author George Jenkinson explained: “There are conflicting ideas about how useful carrying out Clinical Breast Examinations (CBEs) is for the health outcomes of the population.
“It’s generally agreed upon that if it is well performed, then it can be a very useful and low risk diagnostic technique.
“There have been a few attempts in the past to use technology to improve the standard to which healthcare professionals can perform a CBE by having a robot or electronic device physically palpate breast tissue. But the last decade or so of technological advances in manipulation and sensor technology mean that we are now in a better position to do this.
“The first question that we want to answer as part of this is whether a specialised manipulator can be demonstrated to have the dexterity necessary to palpate a realistic breast size and shape.”
The team created their manipulator using 3D printing and other Computerised Numerical Control techniques and employed a combination of laboratory experiments and simulated experiments on a fake (silicone) breast and its digital twin, both modelled on a volunteer at the Simulation and Modelling in Medicine and Surgery research group at Imperial College London.
The simulations allowed the team to perform thousands of palpations and test lots of hypothetical scenarios such as calculating the difference in efficiency when using two, three, or four sensors at the same time. In the lab, they were able to carry out the experiments on the silicone breast to demonstrate the simulations were accurate and to experimentally discover the forces for the real equipment.
George added: “We hope that the research can contribute to and complement the arsenal of techniques used to diagnose breast cancer, and to generate a large amount of data associated with it that may be useful in trying to identify large scale trends that could help diagnose breast cancer early.
“One advantage that some doctors have mentioned anecdotally is that this could provide a low-risk way to objectively record health data. This could be used, for example, to compare successive examinations more easily, or as part of the information packet sent to a specialist if a patient is referred for further examination.”
As a next step, the team will combine CBE techniques learned from professionals with AI, and fully equip the manipulator with sensors to determine the effectiveness of the whole system at identifying potential cancer risks.
The ultimate goal is for the device and sensors to detect lumps more accurately and at greater depths than is possible with human touch alone. It could also be combined with other existing techniques, such as ultrasound examination.
“So far we have laid all of the groundwork,” said George. “We have shown that our robotic system has the dexterity necessary to carry out a clinical breast examination – we hope that in the future this could be a real help in diagnosing cancers early.”
In the early 1900s, Japanese scientist Kikunae Ikeda first proposed umami as a basic taste in addition to sweet, sour, salty and bitter. About eight decades later, the scientific community officially agreed with him. Now, scientists led by researchers at the USC Dornsife College of Letters, Arts and Sciences have evidence of a sixth basic taste, which fans of salt licorice will recognise.
In research published in Nature Communications, USC Dornsife neuroscientist Emily Liman and her team found that the tongue responds to ammonium chloride through the same protein receptor that signals sour taste.
Salt licorice has been a popular sweet in northern European countries since at least the early 20th century, and also appears on South African shelves. The treat counts among its ingredients salmiak salt, or ammonium chloride.
Scientists have for decades recognised that the tongue responds strongly to ammonium chloride. However, despite extensive research, the specific tongue receptors that react to it remained elusive.
Liman and the research team thought they might have an answer. In recent years, they uncovered the protein responsible for detecting sour taste. That protein, called OTOP1, sits within cell membranes and forms a channel for hydrogen ions moving into the cell.
Hydrogen ions are the key component of acids, and as foodies everywhere know, the tongue senses acid as sour, such as the citric acid in lemon juice. Hydrogen ions from these acidic substances move into taste receptor cells through the OTOP1 channel.
Because ammonium chloride can affect the concentration of acid – that is, hydrogen ions – within a cell, the team wondered if it could somehow trigger OTOP1.
To answer this question, they introduced the Otop1 gene into lab-grown human cells so the cells produce the OTOP1 receptor protein. They then exposed the cells to acid or to ammonium chloride and measured the responses.
“We saw that ammonium chloride is a really strong activator of the OTOP1 channel,” Liman said. “It activates as well or better than acids.”
Ammonium chloride gives off small amounts of ammonia, which moves inside the cell and raises the pH, meaning fewer hydrogen ions.
“This pH difference drives a proton influx through the OTOP1 channel,” explained Ziyu Liang, a PhD student in Liman’s lab and first author on the study.
To confirm that their result was more than a laboratory artifact, they turned to a technique that measures electrical conductivity, simulating how nerves conduct a signal. Using taste bud cells from normal mice and from mice the lab had previously genetically engineered to not produce OTOP1, they measured how well the taste cells generated electrical responses called action potentials when ammonium chloride was introduced.
Taste bud cells from wildtype mice showed a sharp increase in action potentials after ammonium chloride was added while taste bud cells from the mice lacking OTOP1 failed to respond to the salt. This confirmed their hypothesis that OTOP1 responds to the salt, generating an electrical signal in taste bud cells.
The same was true when another member of the research team, Courtney Wilson, recorded signals from the nerves that innervate the taste cells. She saw the nerves respond to addition of ammonium chloride in normal mice but not in mice lacking OTOP1.
Then the team went one step further and examined how mice react when given a choice to drink either plain water or water laced with ammonium chloride. For these experiments, they disabled the bitter cells that also contribute to the taste of ammonium chloride. Mice with a functional OTOP1 protein found the taste of ammonium chloride unappealing and did not drink the solution, while mice lacking the OTOP1 protein did not mind the alkaline salt, even at very high concentrations.
“This was really the clincher,” Liman said. “It shows that the OTOP1 channel is essential for the behavioral response to ammonium.”
But the scientists weren’t done. They wondered whether other animals also use their OTOP1 channels to detect ammonium. They found that the OTOP1 channel in some species seems to be more sensitive to ammonium chloride than in others. Human OTOP1 channels were also sensitive to ammonium chloride.
So, what is the advantage in tasting ammonium chloride and why is it evolutionarily so conserved?
Liman speculates that the ability to taste ammonium chloride might have evolved to help organisms avoid eating harmful biological substances that have high concentrations of ammonium.
“Ammonium is found in waste products – think of fertiliser – and is somewhat toxic,” she explained, “so it makes sense we evolved taste mechanisms to detect it. Chicken OTOP1 is much more sensitive to ammonium than zebrafish OTOP1.” Liman speculates that these variations may reflect differences in the ecological niches of different animals. “Fish may simply not encounter much ammonium in the water, while chicken coops are filled with ammonium that needs to be avoided and not eaten.”
But she cautions that this is very early research and further study is needed to understand species differences in sensitivity to ammonium and what makes OTOP1 channels from some species sensitive and some less sensitive to ammonium.
Towards this end, they have made a start. “We identified a particular part of the OTOP1 channel – a specific amino acid – that’s necessary for it to respond to ammonium,” Liman said. “If we mutate this one residue, the channel is not nearly as sensitive to ammonium, but it still responds to acid.”
Moreover, because this one amino acid is conserved across different species, there must have been selective pressure to maintain it, she says. In other words, the OTOP1 channel’s ability to respond to ammonium must have been important to the animals’ survival.
In the future, the researchers plan to extend these studies to understand whether sensitivity to ammonium is conserved among other members of the OTOP proton channel family, which are expressed in other parts of the body, including in the digestive tract.
And who knows? Perhaps ammonium chloride will join the other five basic tastes to bring the official count to six.
Researchers have used wearable technology to measure electrical impulses in the skin and other physiological biomarkers possibly linked to mood changes in bipolar disorder. Though the work is at an early stage, they hope to build on these patterns to detect mood swings in people with bipolar disorder, helping with diagnosis and potentially offering more rapid and personalised treatments. They presented their research at the 36th ECNP Congress in Barcelona, and more information is available on GitHub.
Bipolar disorder (formerly called manic-depressive illness or manic depression) is a mental illness that causes swings in a person’s mood, energy, activity levels, and concentration. These shifts can make it difficult to carry out day-to-day tasks and can make interactions with other people difficult. The degree of mood swing can vary from person to person, from feeling manic (very “up”) to feeling depressed. At present, these mood swings are mostly diagnosed subjectively, through interview with doctors or by questionnaires. This takes time, and requires an immediate medical presence.
Now a group of Barcelona-based psychiatrists, in collaboration with data scientists in Edinburgh, have used a research-grade wearable device to continuously collect several physiological biomarkers during the various phases and episodes of bipolar disorder. Among the collected biomarkers is electrodermal activity, which uses changes in the skin’s electrical conductivity to indicate the level of stress through the reactivity of the nervous system. This is a potential immediate indicator of whether someone is in a manic, depressive, or normal mood state.
They recruited 38 patients with bipolar disorder, and 19 healthy controls, all from the Barcelona area.
Researcher Diego Hidalgo-Mazzei said, “Each participant was fitted with a commercially available Empatica E4 bracelet, which they were asked to wear for around 48 hours. This can measure a variety of physiological changes, but we were most interested in measuring small electrophysiological changes in the skin of the wearer. We found that bipolar disorder patients in their depressed phase had on average a significantly lower skin electrical activity than the rest of the bipolar group or the healthy control group. We also found that as an individual moved from manic to depressive state (or vice versa), this was detectable by a change in skin surface electrical activity.
“It is important for the patient and doctor to know how and when these mood fluctuations take place. It is important also to highlight that the treatment is different for manic or depressive states. This can help with a prompt diagnosis and early personalized treatment, but it can also help in preventing adverse outcomes, for example in alerting to an increased risk of suicide, or of mood swings which may lead to dangers with activities such as driving. It is also easier to treat patients if we know if they are in a manic phase or a depressed phase. Until now, these mood swings have mostly been diagnosed subjectively, through interview with doctors or by questionnaires, and this has led to real difficulties. Arriving at the correct drug is difficult, with only around 30 to 40% of treated individuals having the expected response. We hope that the additional information these systems can provide will give us greater certainty in treating patients.
“We are still some way from that, though. This is an exploratory observational study, so we need to look at a larger sample and use machine learning to analyse all the biomarkers collected by the wearables, confirm the findings, and determine patterns which might indicate a specific episode. This may not be ideal for every bipolar disorder sufferer in every circumstance, but a potential pattern may in the future help the people hardest hit by the mood changes that affect their lives.”
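The core comparison the researchers describe, checking whether average electrodermal activity differs between a depressed-phase group and a comparison group, can be sketched with a Welch's t statistic. This is only an illustrative sketch: the participant values below are invented, not data from the study.

```python
# Illustrative sketch: comparing mean electrodermal activity (EDA)
# between a depressed-phase group and a comparison group using
# Welch's t statistic. All numbers are made up for illustration.
from math import sqrt
from statistics import mean

def welch_t(a, b):
    """Welch's t statistic for two independent samples with unequal variances."""
    ma, mb = mean(a), mean(b)
    va = sum((x - ma) ** 2 for x in a) / (len(a) - 1)  # sample variance of a
    vb = sum((x - mb) ** 2 for x in b) / (len(b) - 1)  # sample variance of b
    return (ma - mb) / sqrt(va / len(a) + vb / len(b))

# Hypothetical per-participant mean skin conductance (microsiemens).
depressed = [1.2, 0.9, 1.1, 1.0, 0.8]
comparison = [2.1, 1.8, 2.4, 1.9, 2.2]

t = welch_t(depressed, comparison)
print(f"Welch t = {t:.2f}")  # negative t: depressed group lower on average
```

In practice a full analysis would also compute a p-value (e.g. with `scipy.stats.ttest_ind(..., equal_var=False)`) and account for repeated measurements per participant.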
Improved modelling of male and female livers can help lead to safer drugs
Researchers report in PLOS Computational Biology that they have developed a powerful new tool to understand how medications affect men and women differently, one that will help lead to safer, more effective drugs in the future.
Women are known to suffer a disproportionate number of liver problems from medications but are also usually underrepresented in drug testing. To address this, University of Virginia scientists have developed sophisticated computer simulations of male and female livers and used them to reveal sex-specific differences in how the tissues are affected by drugs.
The new model has already provided unprecedented insights into the biological processes that take place in the liver, the organ responsible for detoxifying the body, in both men and women. But the model also represents a powerful new tool for drug development, helping ensure that new medications will not cause harmful side effects.
“There are incredibly complex networks of genes and proteins that control how cells respond to drugs,” said UVA researcher Jason Papin, PhD, one of the model’s creators. “We knew that a computer model would be required to try to answer these important clinical questions, and we’re hopeful these models will continue to provide insights that can improve healthcare.”
Harmful side effects
Papin, of UVA’s Department of Biomedical Engineering, developed the model in collaboration with Connor Moore, a PhD student, and Christopher Holstege, MD, a UVA emergency medicine physician and director of UVA Health’s Blue Ridge Poison Center. “It is exceedingly important that both men and women receive the appropriate dose of recommended medications,” Holstege noted. “Drug therapy is complex and toxicity can occur with subtle changes in dose for specific individuals.”
Before developing their model, the researchers first looked at the federal Food and Drug Administration’s Adverse Event Reporting System to evaluate the frequency of reported liver problems in men and women. The scientists found that women consistently reported liver-related adverse events more often than did men.
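A standard way to quantify the kind of disproportionate reporting described above is the reporting odds ratio (ROR), a common pharmacovigilance measure. The sketch below uses invented counts, not actual FAERS figures, purely to show the calculation.

```python
# Illustrative sketch: a reporting odds ratio (ROR) compares the odds
# of an event type being reported in one group vs another.
# The counts below are invented, not real FAERS data.

def reporting_odds_ratio(event_a, other_a, event_b, other_b):
    """ROR of group A vs group B for a given adverse-event type.

    event_*: reports of the event of interest (e.g. liver-related)
    other_*: all other adverse-event reports in that group
    """
    return (event_a / other_a) / (event_b / other_b)

# Hypothetical counts: liver-related vs other reports, by sex.
ror = reporting_odds_ratio(
    event_a=1200, other_a=50000,  # women
    event_b=800, other_b=52000,   # men
)
print(f"ROR (women vs men) = {ror:.2f}")  # > 1 means proportionally more liver reports among women
```

A real analysis would add a confidence interval for the ROR and adjust for confounders such as age and co-medication.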
The researchers then sought to explain why this might be the case. To do that, they developed computer models of the male and female livers that integrated vast amounts of data on gene activity and metabolic processes within cells. These cutting-edge liver simulations provided important insights into how drugs (xenobiotics) affect the tissue differently in men and women and allowed the researchers to understand why.
They found that xenobiotic metabolism was more active in untreated males, while pentose and glucuronate interconversions were female-biased, suggesting a difference in pretreatment gene expression that may result in different initial responses of phase I and phase II metabolism to hepatotoxic drugs. They also observed sex bias in bile acid biosynthesis, which, in combination with the xenobiotic metabolism result, may suggest differences in bacterial deconjugation driven by sex differences in the gut microbiome. Differences were also found in several essential metabolic pathways, such as glycolysis/gluconeogenesis, nucleotide metabolism, and lipid metabolism, with supporting evidence in human or rat hepatocytes.
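The pathway-level comparison described above amounts to flagging pathways whose activity differs substantially between the male and female models. A minimal sketch, with the pathway names taken from the study but entirely invented activity scores and a hypothetical fold-change threshold:

```python
# Illustrative sketch: flag metabolic pathways whose activity scores
# differ by at least a fold-change threshold between male and female
# liver models. Scores and the 1.5x threshold are invented.

def sex_biased(male, female, fold=1.5):
    """Return {pathway: biased_sex} for pathways differing by >= `fold`."""
    biased = {}
    for pathway, m_score in male.items():
        ratio = female[pathway] / m_score
        if ratio >= fold or ratio <= 1 / fold:
            biased[pathway] = "female" if ratio > 1 else "male"
    return biased

# Hypothetical pathway activity scores from each model.
male_scores = {
    "xenobiotic metabolism": 2.0,
    "pentose and glucuronate interconversions": 0.5,
    "glycolysis/gluconeogenesis": 1.0,
}
female_scores = {
    "xenobiotic metabolism": 1.0,
    "pentose and glucuronate interconversions": 1.2,
    "glycolysis/gluconeogenesis": 1.1,
}

print(sex_biased(male_scores, female_scores))
# xenobiotic metabolism is flagged male-biased; pentose and glucuronate
# interconversions female-biased; glycolysis falls below the threshold.
```

The actual study used genome-scale metabolic models with flux analysis; this sketch only shows the final flagging step in simplified form.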
“We were surprised how many differences we found, especially in very diverse biochemical pathways,” said Moore, a biomedical engineering student in Papin’s lab. “We hope our results emphasise how important it is for future scientists to consider how both men and women are affected by their research.”
The work has already identified a key series of cellular processes that explain sex differences in liver damage, and the scientists are calling for further investigation of these processes to better understand “hepatotoxicity” — liver toxicity. Ultimately, they hope their model will prove widely useful in developing safer drugs.
“We’re hopeful these approaches will help address many other questions where men and women have differences in drug responses or disease processes,” Papin said. “Our ability to build predictive computer models of complex systems in biology, like those in this study, is truly opening all kinds of new avenues for tackling some of the most challenging biomedical problems.”