Structural differences found in brains of people with panic disorder

Panic disorder (PD) is a mental health condition characterized by recurring panic attacks: episodes of intense fear and anxiety accompanied by physical symptoms such as a racing heart, shortness of breath, dizziness and blurred vision. Estimates suggest that approximately 2–3% of people worldwide experience PD at some point during their lives.

Better understanding the neural underpinnings and features of PD could have important implications for its future treatment. So far, however, most neuroscientific studies have examined the brains of relatively small groups of individuals diagnosed with the disorder, which raises questions about whether the patterns they observed are representative of PD at large.

Researchers at the Amsterdam University Medical Center, Leiden University and many other institutes worldwide recently carried out a study shedding new light on the neuroanatomical signatures of PD, via the analysis of a large pool of brain scans collected from people diagnosed with the disorder and others with no known psychiatric diagnoses. Their paper, published in Molecular Psychiatry, identifies marked differences in the brains of individuals with PD, such as a slightly thinner cortex and frontal, temporal and parietal brain regions that are smaller than those of people with no known mental health disorders.

“Neuroanatomical examinations of PD typically involve small study samples, rendering findings that are inconsistent and difficult to replicate,” Laura K. M. Han and Moji Aghajani, first and senior authors of the paper, respectively, told Medical Xpress.

“This motivated the ENIGMA Anxiety Working Group to collate data worldwide using standardized methods, to conduct a pooled mega-analysis. The main goal was to provide the most reliable test to date of whether PD is associated with robust neuroanatomical alterations, and whether these differences may vary by age or clinical features (e.g., age of onset, medication use, severity).”

Uncovering the brain signatures of panic disorder

As part of their study, Han, Aghajani and their colleagues analyzed brain scans collected by different research teams worldwide from a total of almost 5,000 people between 10 and 66 years old, including 1,100 individuals with PD and 3,800 healthy control subjects. The brain scans were collected using magnetic resonance imaging (MRI), an imaging technique that is commonly used by both scientists and doctors to study and diagnose various diseases.

“Using harmonized ENIGMA protocols and the FreeSurfer brain segmentation software, we measured cortical thickness, cortical surface area, and subcortical volumes,” explained Han and Aghajani. “Statistical mixed-effects models compared PD and healthy controls on these brain metrics, while accounting for individual variations in age, sex, and scanning site.”
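The kind of covariate adjustment the researchers describe can be illustrated with a toy calculation. The sketch below uses ordinary least squares with fixed site dummies on simulated data as a simplified stand-in for the study's mixed-effects models (which treat scanning site as a random effect); every number here is invented for illustration, not drawn from the study.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: cortical thickness (mm) for PD vs. control across 3 scanning sites.
# All values are simulated for illustration only.
n = 300
site = rng.integers(0, 3, n)        # scanning site (0, 1, or 2)
group = rng.integers(0, 2, n)       # 0 = healthy control, 1 = PD
age = rng.uniform(10, 66, n)        # age range roughly matching the sample
# Simulate a small true PD-related thinning of 0.02 mm plus site/age effects.
thickness = (2.5 + 0.03 * site - 0.004 * (age - 38)
             - 0.02 * group + rng.normal(0, 0.05, n))

# Design matrix: intercept, group, centered age, site dummies (site 0 = ref).
X = np.column_stack([
    np.ones(n),
    group,
    age - age.mean(),
    (site == 1).astype(float),
    (site == 2).astype(float),
])
beta, *_ = np.linalg.lstsq(X, thickness, rcond=None)
print(f"estimated PD effect on thickness: {beta[1]:.3f} mm")
```

The group coefficient recovers the simulated thinning despite age and site differences, which is the logic behind adjusting for those variables in the real analysis.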

The team’s analyses allowed them to pinpoint various marked differences between the brains of people with PD and others with no known psychiatric or mental health disorders. The researchers found that people with PD had a slightly thinner cortex and that some parts of their brain had a smaller surface area or a reduced volume.

“We identified subtle but consistent reductions in cortical thickness and surface area in frontal, temporal, and parietal regions, along with smaller subcortical volumes in the thalamus and caudate, among individuals with PD,” said Han and Aghajani.

“Among other things, these regions govern how emotionally salient information is perceived, processed, modulated, and responded to. The analyses also showed that some differences are age-dependent and that early-onset PD (before age 21) is linked to larger lateral ventricles.”

Paving the way for the detailed mapping of psychiatric disorders

Overall, the findings of this recent study appear to confirm existing models of PD that suggest that the disorder is linked to disruptions in brain regions associated with the processing and regulation of emotions. In the future, they could inspire other researchers to conduct further research that closely examines some of the newly uncovered neuroanatomical signatures of PD, perhaps also looking at how they change at different stages of development or when patients are responding well to specific treatments and psychotherapeutic interventions.

The team’s mega-analysis also highlights the value of examining large amounts of data, showing that this can contribute to the detection of subtle neuroanatomical changes or differences that might be hard to uncover in smaller samples. A similar approach has also been used to study the neuroanatomy of other neuropsychiatric disorders, such as generalized anxiety disorder (GAD), depression, obsessive compulsive disorder (OCD), bipolar disorder (BP), schizophrenia or substance use disorders (SUDs).

“Future studies could track individuals with PD longitudinally to clarify developmental and aging effects, integrate genetics and environmental risk factors, and combine structural imaging with functional and connectional brain examinations,” added Han and Aghajani. “The results also motivate transdiagnostic comparisons across anxiety disorders and efforts to link brain differences to prognosis, treatment response, or prevention strategies, rather than diagnosis alone.”

Point-of-care hepatitis B DNA testing proves as accurate as lab tests

A clinical trial led by the Kirby Institute at UNSW Sydney has found that point-of-care testing for hepatitis B DNA is as effective as traditional laboratory testing, paving the way for faster diagnosis and treatment in hard-to-reach communities. The results have been published in the Journal of Clinical Microbiology.

“The results of our trial found that the fingerstick point-of-care test is highly accurate, closely matching the accuracy of traditional tests,” explains Professor Gail Matthews, who led the research at the Kirby Institute. “This is a very important finding because it has the potential to expand access to testing and treatment globally, and especially in resource-limited settings or remote areas, where current testing access is poor.”

Hepatitis B is a viral infection that attacks the liver, causing inflammation and, over time, serious complications such as cirrhosis, liver failure, and liver cancer. It is responsible for over 1 million deaths per year, but is preventable by vaccination, and effective treatment is available for chronic infection.

While most high-income countries have strong vaccination programs and reasonable access to care, the majority of people with chronic hepatitis B live in low- and middle-income countries, where access to testing and treatment is limited. Even in Australia, hepatitis B DNA testing is more difficult to access for people living in remote areas.

“Not everyone who has hep B needs treatment,” explains Associate Professor Behzad Hajarizadeh, who is first author on the paper. “People with higher levels of the virus are more likely to benefit from treatment, so DNA tests are required to determine the levels of virus in the system. DNA testing is also used once a patient starts treatment, to help understand if the treatment is working.”

Currently, hepatitis B DNA testing, for both diagnosis and monitoring, requires collecting a venous blood sample to be processed in centralized laboratories, meaning patients may need to travel long distances to take the test, and then often wait days or weeks for results. This delay, and the multiple clinic visits involved, can hinder timely treatment and care.

By contrast, point-of-care testing can be done in small health clinics using a fingerstick blood sample, can be performed by a broader range of health care workers, and provides a result within 60 minutes. It is an effective alternative to laboratory testing for many infectious diseases, including hepatitis C, but until now, its efficacy for hepatitis B DNA using fingerstick blood has been unknown.

“Our research demonstrates that point-of-care testing for hepatitis B DNA using fingerstick blood is, indeed, highly accurate and effective. Given the technology is already in use for a range of other infectious diseases globally, our evidence paves the way for integrating infectious disease care, significantly enhancing access to hepatitis B testing, monitoring and treatment, no matter where someone lives,” says the Kirby Institute’s Associate Professor Tanya Applegate.

The most recent World Health Organization figures (2022) estimate that 254 million people were living with chronic hepatitis B infection worldwide, yet only 14% had been diagnosed and just 8% were receiving treatment, representing a major global health challenge. It is currently estimated that no country is on track to meet the WHO target of eliminating hepatitis B as a public health threat by 2030. As part of a push to increase testing and treatment, the most recent WHO guidelines include a new recommendation supporting the use of hepatitis B point-of-care DNA fingerstick tests globally. Data from this study support that recommendation.

“Access to testing is a major barrier to progress on hepatitis B elimination,” says Associate Professor Thomas Tu from Hepatitis B Voices. “We are hopeful that this research will support the roll-out of point-of-care testing for hepatitis B, enhancing access and ultimately, improving health and saving lives.”

i-DNA ‘peek-a-boo structures’ form in living cells and regulate genes linked to cancer

DNA’s iconic double helix does more than “just” store genetic information. Under certain conditions, it can temporarily fold into unusual shapes. Researchers at Umeå University, Sweden, have now shown that one such structure, known as i-DNA, not only forms in living cells but also acts as a regulatory bottleneck linked to cancer.

“You can think of i-DNA as a kind of ‘peek-a-boo structure’ in the DNA molecule. Its formation is tightly controlled in time and it must be resolved at precisely the right moment. We believe it plays an important role in gene regulation, because these structures can appear and disappear in sync with changes in the cell’s state,” says first author Pallabi Sengupta, postdoctoral researcher at the Department of Medical Biochemistry and Biophysics at Umeå University. The study is now published in Nature Communications.

A highly unusual DNA structure

The familiar double helix can be imagined as a twisted ladder with sugar-phosphate backbones as side rails and base pairs—adenine (A) paired with thymine (T), and cytosine (C) paired with guanine (G)—forming the rungs.

i-DNA, however, bears little resemblance to this shape. Instead, it is more like a distorted, self-folded ladder tied into a knot. It consists of a single DNA strand folding back on itself to form a four-stranded structure. At the molecular level, the structure is held together not by standard A–T and C–G base pairs, but by pairs of cytosines.

These rare, short-lived structures appear and disappear depending on the cellular environment. For decades, they were dismissed as too unstable to exist inside cells and regarded as laboratory artifacts. With new experimental techniques, researchers in Umeå can now demonstrate that i-DNA does form, but only briefly, just before DNA replication begins.

Key protein controls structure resolution

The study further shows that the protein PCBP1 acts as a critical regulator. It unwinds i-DNA at the right moment, allowing the DNA replication machinery to proceed. If the structures fail to open in time, they block replication, increasing the risk of DNA damage—a hallmark of heightened cancer vulnerability.

The researchers also discovered that i-DNA is not uniform: some structures are easy to unwind, while others are highly resistant, depending on the underlying DNA sequence.

“The more cytosine base pairs that hold the knot together, the harder it is to resolve. In some cases, hybrid structures can form, making i-DNA even more stable,” explains Nasim Sabouri, professor at the Department of Medical Biochemistry and Biophysics at Umeå University, who led the study.

Notably, many i-DNA structures are located in regulatory regions of oncogenes—genes that drive cancer development—suggesting a direct link between i-DNA and disease.

To study these short-lived structures, the team combined biochemical assays, computational modeling and cell biology. They successfully visualized how PCBP1 progressively opens i-DNA and captured the structures in living cells at the exact moment in the cell cycle when they appear.

“By connecting molecular mechanisms to actual effects in cells, we can show that this is biologically relevant and not a laboratory phenomenon,” says Ikenna Obi, staff scientist at the Department of Medical Biochemistry and Biophysics at Umeå University.

New opportunities for drug development

The discovery reframes i-DNA from a molecular oddity to a potential weakness in cancer cells. Because cancer cells often experience high replication stress, dividing so rapidly that their DNA replication machinery approaches breakdown, any disruption in i-DNA handling may have severe consequences.

“If we can influence i-DNA or the protein that unwinds it, we may be able to push cancer cells beyond their tolerance limit. This opens completely new avenues for drug development,” says Nasim Sabouri.

The study was conducted in collaboration with Natacha Gillet, researcher at the Centre National de la Recherche Scientifique (CNRS) in France.

Surgical innovation may cut ovarian cancer risk by nearly 80%

A prevention strategy developed by Canadian researchers can reduce the risk of the most common and deadly form of ovarian cancer by nearly 80%, according to a new study published today in JAMA Network Open by researchers at the University of British Columbia (UBC).

The strategy, known as opportunistic salpingectomy (OS), involves proactively removing a person’s fallopian tubes when they are already undergoing a routine gynecological surgery such as hysterectomy or tubal ligation, commonly called “having one’s tubes tied.”

The Canadian province of British Columbia (B.C.) became the first jurisdiction in the world to offer OS in 2010, after a team of researchers from UBC, BC Cancer and Vancouver Coastal Health designed the approach when it was discovered that most ovarian cancers originate in the fallopian tubes rather than the ovaries. OS leaves a person’s ovaries intact, preserving important hormone production so there are minimal side effects from the added procedure.

The new study, led by a B.C.-based international collaboration called the Ovarian Cancer Observatory, provides the clearest evidence yet that the Canadian innovation saves lives.

“This study clearly demonstrates that removing the fallopian tubes as an add-on during routine surgery can help prevent the most lethal type of ovarian cancer,” said co-senior author Dr. Gillian Hanley, an associate professor of obstetrics and gynecology at UBC. “It shows how this relatively simple change in surgical practice can have a profound and life-saving impact.”

New hope against a deadly cancer

Ovarian cancer is the most lethal gynecological cancer. Approximately 3,100 Canadians are diagnosed with the disease each year and about 2,000 will die from it.

There is currently no reliable screening test for ovarian cancer, meaning that most cases are diagnosed at advanced stages when treatment options are limited and survival rates are low.

The OS approach was initially developed and named by Dr. Dianne Miller, an associate professor emerita at UBC and gynecologic oncologist with Vancouver Coastal Health and BC Cancer. She co-founded B.C.’s multidisciplinary ovarian cancer research team, OVCARE.

“If there is one thing better than curing cancer, it’s never getting the cancer in the first place,” said Dr. Miller.

The new study is the first to quantify how much OS reduces the risk of serous ovarian cancer—the most common and deadly subtype of the disease. It builds on previous research demonstrating that OS is safe, does not reduce the age of menopause onset, and is cost-effective for health systems.

The study analyzed population-based health data for more than 85,000 people who underwent gynecological surgeries in B.C. between 2008 and 2020. The researchers compared rates of serous ovarian cancer between those who had OS and those who had similar surgeries but did not undergo the procedure.

Overall, people who had OS were 78% less likely to develop serous ovarian cancer. In the rare cases where ovarian cancer occurred after OS, those cancers were found to be less biologically aggressive. The findings were validated by data collected from pathology laboratories around the world, which suggested a similar effect.
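The 78% figure is a relative risk reduction: the cancer rate among people who had OS divided by the rate among those who did not, subtracted from one. A minimal sketch of the arithmetic, using invented counts rather than the study's actual data:

```python
# Hypothetical illustrative counts, NOT the study's actual numbers.
os_cases, os_total = 11, 25_000        # serous cancers among people who had OS
ctrl_cases, ctrl_total = 120, 60_000   # serous cancers among comparable surgeries without OS

relative_risk = (os_cases / os_total) / (ctrl_cases / ctrl_total)
risk_reduction = 1 - relative_risk
print(f"relative risk reduction: {risk_reduction:.0%}")  # → 78% with these counts
```

With these made-up counts the rate falls from 0.2% to 0.044%, i.e. a relative risk of 0.22 and hence a 78% reduction.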

From B.C. innovation to global impact

Since its introduction in B.C. in 2010, OS has been widely adopted, with approximately 80% of hysterectomies and tubal ligation procedures in the province now including fallopian tube removal.

Globally, professional medical organizations in 24 countries now recommend OS as an ovarian cancer prevention strategy, including the Society of Obstetricians and Gynaecologists of Canada, which issued guidance in 2015.

“This is the culmination of more than a decade of work that started here in B.C.,” said co-senior author Dr. David Huntsman, professor of pathology and laboratory medicine and obstetrics and gynecology at UBC and a distinguished scientist at BC Cancer. “The impact of OS that we report is even greater than we expected.”

The researchers say expanding global adoption of OS could prevent thousands of ovarian cancer cases worldwide each year.

Extending OS to other abdominal and pelvic surgeries where appropriate could further increase the number of people who could benefit from the prevention strategy. B.C. recently became the first province to expand OS to routine surgeries performed by general and urologic surgeons through a project supported by the Government of B.C. and Doctors of BC.

“Our hope is that more clinicians will adopt this proven approach, which has the potential to save countless lives,” said Dr. Huntsman. “Not offering this surgical add-on may leave patients unnecessarily vulnerable to this cancer.”

Stimulating the central thalamus during anesthesia sheds light on neural basis of consciousness

The brains of mammals continuously combine signals originating from different regions to produce various sensations, emotions, thoughts and behaviors. This process, known as information integration, is what allows brain regions with different functions to collectively form unified experiences.

When mammals are unconscious, for instance when they are under general anesthesia, the brain temporarily loses its ability to integrate information. Studying the mammalian brain both when animals are awake and unconscious could help to better understand the neural processes that contribute to consciousness, potentially offering insight into comatose states and other disorders characterized by alterations in wakefulness.

Researchers at the University of Cambridge, the University of Oxford, McGill University and other institutes worldwide set out to examine the brains of four different species of mammals during anesthesia. Their observations, published in Nature Human Behaviour, offer new insight into the brain regions and gene patterns associated with both unconsciousness and the regaining of consciousness.

“The paper is part of my research program on the neural basis of consciousness,” Andrea Luppi, first author of the paper, told Medical Xpress.

“For the last 10 years I have been pursuing this question. My earlier work focused on comparing what happens to the brain during the unconsciousness induced by general anesthesia, and during coma or other disorders of consciousness (such as what used to be called vegetative state).

“Our paper asks if anesthesia works similarly in the brains of humans, and of other species that are often used as models in neuroscience and clinical research.”

Switching the brain back ‘on’ during anesthesia

Luppi and his colleagues have been investigating the neural processes involved in conscious and unconscious states for almost a decade. Their recent paper focuses on four different animal species: humans, macaques, marmosets and mice.

“Our hope is that by studying different mammals and comparing them with humans, we may be able to narrow down on the most essential mechanisms of consciousness—and learn how to restore it in patients,” said Luppi.

As part of their study, the researchers measured the brain activity of humans and the three animal species while they were under general anesthesia, using functional magnetic resonance imaging (fMRI). This widely used, non-invasive imaging technique measures brain activity by detecting changes in blood flow.

“Our approach allowed us to track over time how different brain regions interact,” explained Luppi. “We found that when humans and animals are awake, their brains are like a grand orchestra: though different brain parts play different roles, they are clearly all working together to produce the symphony. We call this ‘synergy.’”

The team observed that this orchestra-like collective activity ceases when all the animals they examined lose consciousness. However, they were able to restore it by stimulating the central thalamus, a region at the center of the brain that is known to relay sensory and motor information, but that may also be acting as conductor for the brain’s orchestra.

“The anesthetized brain is like a random assortment of instruments, each playing to its own tune regardless of what the others are playing,” said Luppi. “However, if you stimulate a small region deep in the brain, called the central thalamus, the animal wakes up from anesthesia—and the brain symphony is back.”

Using computational tools, Luppi and his colleagues modeled the connectivity between different brain areas and how different genes are expressed across areas, both while animals were unconscious and when they regained consciousness. This allowed them to identify neural mechanisms that play a key role in consciousness and that appear to be evolutionarily conserved across all the species they examined.

New insight into the neural roots of consciousness

This recent study improves the present understanding of how the brain restores wakefulness. In the future, the team’s observations could help to devise new treatments for disorders of consciousness that can emerge after brain injuries, infections or tumors, such as coma, the vegetative state, the minimally conscious state and the post-traumatic confusional state.

“Finding consistency across many species and many anesthetic drugs is important: what is conserved across evolution is often very fundamental,” said Luppi.

“Perhaps the most important contribution of our study is that we were able to build a computer model that predicts which region one should stimulate, to have the best chances of making the brain symphony-like again. This could be used for trying to identify which region one should stimulate in the brain of a chronically unconscious patient, to try and wake the patient up.”

Luppi and his colleagues are now planning follow-up studies aimed at further exploring the neural mechanisms associated with a return of consciousness after periods of unconsciousness. Their hope is to ultimately inform the design of more reliable and targeted strategies to bring patients back from a coma or other prolonged unconscious states.

“My long-term goal is to understand the mechanisms that govern consciousness, and how we can use pharmacology or brain stimulation to restore consciousness in patients,” added Luppi.

Scientists now know why ovarian cancer spreads so rapidly in the abdomen

Ovarian cancer kills more women than any other gynecological cancer. Most patients receive their diagnosis only after the disease spreads throughout the abdomen. Until now, scientists have never fully understood why this cancer advances so fast.

A new study led by Nagoya University explains why. Published in Science Advances, the study shows that cancer cells recruit help from protective mesothelial cells that normally line the abdominal cavity. Mesothelial cells lead the invasion and cancer cells follow the pathways they create. These hybrid cell clusters resist chemotherapy better than cancer cells alone.

Researchers examined abdominal fluid from ovarian cancer patients and found something unexpected. Cancer cells do not float alone in the abdominal cavity. Instead, they often grab onto mesothelial cells and form hybrid spheres. About 60% of all cancer spheres contain these recruited mesothelial cells. The cancer cells release a protein called TGF-β1 that transforms the mesothelial cells and causes them to develop spike-like structures that cut through tissue.

Invadopodia, spike structures that do the digging for cancer

When ovarian cancer develops, cancer cells break off from the tumor. These cells enter the abdominal fluid and float freely. The fluid moves around as you breathe and move your body. This movement carries the cancer cells to different spots in the abdomen.

Most other cancers spread differently. Breast cancer or lung cancer cells enter blood vessels. They travel through the bloodstream to reach distant organs. Doctors can sometimes track these cancers through blood tests because blood moves in predictable paths through vessels.

Ovarian cancer cells avoid blood vessels entirely. They float in fluid that has no fixed path. This floating stage happens before the cancer cells attach to new organs. Scientists did not fully understand what happened during the floating period or how cells worked together to spread cancer so quickly.

The research team discovered that cancer cells recruit protective mesothelial cells that have shed from the abdominal cavity lining during this floating stage. The two cell types stick together and form hybrid spheres. The mesothelial cells then grow invadopodia, spike-like structures that drill into surrounding tissue. The hybrid spheres resist chemotherapy drugs more effectively and invade tissues faster when they land on organs.

Outsourcing the hard work of cell invasion

The researchers examined abdominal fluid from ovarian cancer patients using advanced microscopy to watch this process in real time. They confirmed their findings with mouse models and single-cell genetic analysis.

Lead author Dr. Kaname Uno, a former Ph.D. student and current Visiting Researcher at Nagoya University’s Graduate School of Medicine, explained that the cancer cells do not need to become more invasive themselves.

“They manipulate mesothelial cells to do the tissue invasion work. They undergo minimal genetic and molecular changes and just migrate through the openings that mesothelial cells create.”

Dr. Uno worked as a gynecologist for eight years before he pursued research. One of his patients changed his career path. She had clear screening results just three months before doctors found advanced ovarian cancer. Current medical tools failed to detect the cancer early enough to save her life. This motivated Dr. Uno to investigate why ovarian cancer spreads so rapidly.

This discovery opens new treatment possibilities. Current chemotherapy targets cancer cells but ignores the mesothelial accomplices. Future drugs could block the TGF-β1 signal or prevent the formation of these dangerous partnerships. The research also suggests that doctors could monitor these cell clusters in abdominal fluid to predict disease progression and treatment response.

Nasal spray for flu prevention shows promising trial results

Researchers have developed a nasal spray for flu prevention that has shown promising results in preliminary human trials. Seasonal influenza (the flu) is an acute respiratory infection that affects up to one billion people per year and causes hundreds of thousands of deaths. While flu shots can be effective, they are always a best guess because scientists are never fully certain about which strains will circulate. At best, they are only 50% effective, according to historical data.

Another way to try to tackle the flu is with experimental monoclonal antibodies, proteins that mimic the immune system’s ability to fight off viruses, but these too have had limited success.

Spraying inside the nostrils

Scientists from Leyden Labs in the Netherlands and their partners have been working on a different approach: a spray you apply directly to the nostrils to block the virus at the point of entry. Their research is published in the journal Science Translational Medicine.

The spray contains an antibody called CR9114, which was developed by the pharmaceutical company Johnson & Johnson. Unlike conventional vaccines that can only recognize specific flu strains, CR9114 can recognize and block almost all types of influenza A and B.

Following successful tests in mice and macaques, in which the spray protected the animals against infection, the researchers conducted preliminary tests in 143 people aged 18 to 55.

Healthy volunteers were given either a single dose or a twice-daily dose for two weeks to test its safety and how it moves through the body (pharmacokinetics). The research team measured how long the antibody remained in the nose by collecting samples at different times. They also placed nasal samples in lab dishes with live flu viruses to see if CR9114 could neutralize the infection.

Optimal doses and next steps

No serious side effects were reported in any of the participants. Twice-daily doses were optimal because the nose naturally clears itself continually, and the antibody has a half-life of three hours. Even after time in the nose, the antibody remained fully active and neutralized both influenza A and B.
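The dosing rationale follows from simple exponential decay: with a three-hour half-life, little antibody remains by the end of a twice-daily (12-hour) interval. A quick check, assuming first-order clearance (the interval length is our assumption, not a figure from the paper):

```python
# With a 3-hour half-life (reported in the study), how much antibody is left
# after one twice-daily dosing interval of 12 hours? Assumes simple
# first-order (exponential) clearance.
half_life_h = 3.0
dose_interval_h = 12.0

remaining = 0.5 ** (dose_interval_h / half_life_h)  # four half-lives elapse
print(f"fraction remaining after {dose_interval_h:.0f} h: {remaining:.1%}")
```

Four half-lives leave only about 6% of the dose, which is why a once-daily schedule would leave the nose largely unprotected for much of the day.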

One of the most impressive findings was that spraying the antibody directly into the nose resulted in antibody concentrations in the nasal lining up to 4,600 times higher than those achieved with traditional intravenous (IV) delivery, and with smaller doses.

“Intranasal CR9114 is safe in humans and efficacious against influenza virus challenge in nonhuman primates. These studies pave the way toward intranasal antibody administration for broad, prophylactic protection against influenza virus infection and subsequent disease,” the researchers write in their paper.

While the research is promising, the scientists still need to confirm how well their spray works against natural infection and whether it stops the virus from spreading between people.

Statins do not cause the majority of side effects listed in package leaflets, large-scale analysis finds

Statins do not cause the majority of the conditions that have been listed in their package leaflets, including memory loss, depression, sleep disturbance, and erectile and sexual dysfunction, according to the most comprehensive review of possible side effects. The study was led by researchers at Oxford Population Health and appears in The Lancet.

Cardiovascular disease results in around 20 million deaths worldwide each year and causes around a quarter of all deaths in the UK. Statins are highly effective drugs that lower LDL (“bad”) cholesterol levels and have been repeatedly proven to reduce the risk of cardiovascular disease. However, there have been concerns about possible side effects.

The researchers gathered data from 23 large-scale randomized studies from the Cholesterol Treatment Trialists’ Collaboration: 123,940 participants in 19 large-scale clinical trials comparing the effects of statin therapies against a placebo (or dummy tablet), and 30,724 participants in four trials comparing more intensive versus less intensive statin therapy.

They found similar numbers of reports for those taking the statins and those taking the placebo for almost all the conditions listed in package leaflets as possible side effects. For example, each year, cognitive or memory impairment was reported by 0.2% of those taking the statins and by 0.2% of those taking the placebo. This means that while people may notice these problems while taking statins, there is no good evidence that they are caused by the statin.
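The logic of comparing annual rates can be made concrete with a back-of-the-envelope calculation. The counts below are invented to be consistent with the reported 0.2% figures; they are not the trial's actual data:

```python
# Hypothetical counts consistent with the reported 0.2% annual rates,
# NOT the actual trial data.
statin_events, statin_py = 124, 62_000     # reports, person-years on statin
placebo_events, placebo_py = 123, 61_500   # reports, person-years on placebo

rate_statin = statin_events / statin_py
rate_placebo = placebo_events / placebo_py
excess_per_1000 = (rate_statin - rate_placebo) * 1000
print(f"statin: {rate_statin:.1%}/yr, placebo: {rate_placebo:.1%}/yr")
print(f"excess reports per 1,000 person-years: {excess_per_1000:.2f}")
```

Equal rates in both arms mean zero attributable (excess) risk, which is the basis for concluding the statin is not causing the symptom even though it is reported by statin users.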

Key findings:

  • There was no statistically significant excess risk from statin therapy for almost all the conditions listed in package leaflets as potential side effects.
  • Taking a statin did not cause any meaningful excess of memory loss or dementia, depression, sleep disturbance, erectile dysfunction, weight gain, nausea, fatigue, or headache, among many other conditions.
  • There was a small increase in risk (about 0.1%) for liver blood test abnormalities. However, there was no increase in liver disease such as hepatitis or liver failure, indicating that the liver blood test changes do not typically lead to more serious liver problems.

Christina Reith, Associate Professor at Oxford Population Health and lead author of the study, said, “Statins are life-saving drugs that have been used by hundreds of millions of people over the past 30 years. However, concerns about the safety of statins have deterred many people who are at risk of severe disability or death from a heart attack or stroke. Our study provides reassurance that for most people, the risk of side effects is greatly outweighed by the benefits of statins.”

Previous work by the same researchers established that most muscle symptoms are not caused by statins; statin therapy caused muscle symptoms in only 1% of people during the first year of treatment, with no excess thereafter. Their earlier work also showed that statins can cause a small increase in blood sugar levels, so people already at high risk may develop diabetes sooner.

Professor Bryan Williams, Chief Scientific and Medical Officer at the British Heart Foundation, noted, “These findings are hugely important and provide authoritative, evidence-based reassurance for patients. Statins are lifesaving drugs that have been proven to protect against heart attacks and strokes. Among the large number of patients assessed in this well-conducted analysis, only four side effects out of 66 were found to have any association with taking statins, and only in a very small proportion of patients.

“This evidence is a much-needed counter to the misinformation around statins and should help prevent unnecessary deaths from cardiovascular disease. Recognizing which side effects might genuinely be associated with statins is also important as it will help doctors make decisions about when to use alternative treatments.”

Professor Sir Rory Collins, Emeritus Professor of Medicine and Epidemiology at Oxford Population Health and senior author of the paper, added, “Statin product labels list certain adverse health outcomes as potential treatment-related effects based mainly on information from non-randomized studies that may be subject to bias. We brought together all of the information from large randomized trials to assess the evidence reliably.

“Now that we know that statins do not cause the majority of side effects listed in package leaflets, statin information requires rapid revision to help patients and doctors make better-informed health decisions.”

All of the trials included in the analyses were large-scale (involving at least 1,000 participants) and tracked patient outcomes for a median of nearly five years. The trials were double-blind, meaning that neither the trial participants nor those managing the participants or leading the study knew who was receiving which treatment, to avoid potential biases due to knowledge of treatment allocation. The list of possible side effects was compiled from those listed for the five most commonly prescribed statins.

Family dinners may reduce substance-use risk for many adolescents

A new study by researchers at Tufts University School of Medicine finds that regular family dinners may help prevent substance use for a majority of U.S. adolescents, but suggests that the strategy is not effective for youth who have experienced significant childhood adversity.

The findings provide important insights for practitioners looking to help families prevent substance use, as well as for researchers aiming to develop interventions that better account for adolescents’ unique experiences.

For the study, published in the Journal of Aggression, Maltreatment & Trauma, researchers analyzed online survey data from 2,090 U.S. adolescents ages 12 to 17 and their parents. Participants from around the country were asked about the quality of their family meals—including communication, enjoyment, digital distractions, and logistics—as well as adolescents’ alcohol, e-cigarette, and cannabis use in the previous six months.

The researchers then examined how these patterns differed based on adolescents’ experiences of household stressors and exposure to violence, as reported by both the children and parents. Instead of counting each adverse experience equally, the researchers created a weighted score based on how strongly the different experiences are linked to substance use in prior research and this national sample.

Higher family dinner quality was associated with a 22% to 34% lower prevalence of substance use among adolescents who had either no or low to moderate levels of adverse childhood experiences.

“These findings build on what we already knew about the value of family meals as a practical and widely accessible way to reduce the risk of adolescent substance use,” said Margie Skeer, the study’s lead author, professor and chair of the Department of Public Health and Community Medicine at the School of Medicine.

“Routinely connecting over meals—which can be as simple as a caregiver and child standing at a counter having a snack together—can help establish open and routine parent-child communication and parental monitoring to support more positive long-term outcomes for the majority of children,” added Skeer.

“It’s not about the food, timing, or setting; it’s the parent-child relationship and interactions it helps cultivate that matter.”

Adverse childhood experiences reported by participants in the study included parents being divorced; a family member being diagnosed with a substance-use disorder; someone in the family having a mental-health disorder; the adolescent witnessing violence; the adolescent often being teased about their weight; a parent using non-prescribed drugs daily; or the adolescent experiencing sexual or physical dating violence.

The study found that family meals offered little protection for adolescents whose adversity score reached the equivalent of four or more experiences—a population that encompasses nearly one in five U.S. high school students younger than 18, according to a study of the most recent Youth Risk Behavior Survey data.

“While our research suggests that adolescents who have experienced more severe stressors may not see the same benefits from family meals, they may benefit from more targeted and trauma-informed approaches, such as mental health support and alternative forms of family engagement,” said Skeer.

She added that future research should explore whether other supportive routines—beyond shared meals or outside the family environment—can help protect adolescents exposed to highly stressful or traumatic childhood experiences.

Addiction and appetite along the gut-brain axis: Vagus nerve may play a crucial role in the dopamine reward pathway

Dopamine—a neurotransmitter responsible for influencing motivation, pleasure, mood and learning in the brain—has experienced a bit of fame in recent years, acting as a sort of buzzword to describe a fleeting satisfaction from social media, food or shopping. Because of this, most people know that dopamine acts within the brain. In particular, it is associated with the mesolimbic pathway, which is a brain circuit connecting the ventral tegmental area (VTA) to the nucleus accumbens (NAc), amygdala, and hippocampus.

However, a recent study, published in Science Advances, indicates that the vagus nerve, which bridges the brain and gut, also plays a crucial role in regulating behaviors related to reward and motivation.

The gut-brain-vagal axis

The vagus nerve is the main pathway of the gut-brain axis, a complex communication network linking peripheral organs to the brain by transmitting interoceptive signals about mood, digestion, inflammation, and stress.

The authors of the new study explain, “Among metabolically active peripheral organs, the gut emerges as a central player in coordinating the body-brain tango through a multitude of long-range mechanisms, including hormonal signaling, microbiota-derived metabolites, and both local and gut-brain neuronal connections.”

While most prior studies have focused on brain-centric models of reward, some work has shown that gut-vagal signals have an effect on food-driven dopamine activity and eating behaviors. Yet it was still unclear whether this would extend to other forms of addiction fueled by dopamine.

Disrupted vagal signaling affects dopamine activity

To determine the extent to which the gut-brain-vagal axis is involved in dopamine reward activity, the research team conducted an array of experiments involving mice. Some experiments involved cutting the vagus nerve via subdiaphragmatic vagotomy (SDV) and comparing food and drug reward behaviors between SDV mice and sham-operated controls. In vivo dopamine activity was also monitored through fiber photometry, molecular assays, and electrophysiology.

Results showed that the gut-brain vagal axis is essential for both food- and drug-induced reward behaviors in mice. In experiments featuring foods that mice would normally find addictive, the SDV mice consumed food more slowly and in smaller amounts, while the sham mice exhibited a rapid increase in food consumption over a 10-day period.

The team noted increased excitement in sham mice, but not SDV mice. They write, “Using our model, in combination with telemetric locomotor activity monitoring, we observed that sham mice displayed an increased locomotor activity before (food-anticipatory activity) and during (consumption) food intake.

“In contrast, SDV mice exhibited dampened locomotor activity during both phases. This reduction was not due to preexisting locomotor deficits, as both sham and SDV mice had similar locomotor profiles during the dark period (foraging period) or under basal conditions.”

Similar results were found in experiments involving drugs, including cocaine, morphine and amphetamine. The team observed reductions in the elicited locomotor response in SDV mice for both morphine and cocaine, indicating that the vagus nerve might modulate dopamine dynamics and/or its postsynaptic integration. For amphetamine, the outcome depended on the dose used in conditioning experiments.

The study authors explain, “While sham mice were positively conditioned to cocaine, no significant preference was observed in SDV mice. The effect of amphetamine-induced CPP in SDV mice depended on the conditioning doses. At 2 mg/kg, we observed that both experimental groups were positively conditioned.

“However, when mice were conditioned with a lower dose of amphetamine (1 mg/kg), we observed a lower conditioning index in SDV mice compared to controls, suggesting that the physiological consequences of neuronal adaptations observed in SDV mice may be overridden at higher [dopamine] levels.”

Furthermore, in vivo experiments showed that vagal integrity is required for normal dopamine neuron firing, dopamine-dependent molecular changes and structural plasticity in reward circuits. Fiber photometry showed that when the vagus nerve was cut, dopamine responses in the nucleus accumbens were delayed or reduced during food anticipation, during eating, and after drug administration.

However, dopamine function overall was not compromised, as it still functioned in processes related to movement. Still, activity was reduced, as dopamine neurons fired less and received weaker excitatory input.

Implications for addiction treatment in humans

The study helps to confirm that our gut, via the vagus nerve, plays a direct and essential role in how we experience reward and motivation. However, addiction treatments involving the reduction of vagus nerve signaling are still a way off.

Simply severing the vagus nerve surgically, as in the mouse study, is most likely not an option for humans and could carry additional side effects. In addition, the team notes that the gut may induce compensatory changes over time to make up for lost signaling.

Clearly, more research is needed to refine these methods. The team suggests using more targeted genetic or viral tools to dissect specific vagal circuits in the future, or exploring other methods for modulating vagal signaling. With such refinement, however, these approaches could eventually inform treatments for eating disorders and addiction.
