A new look at how the brain works reveals that wiring isn’t everything

How a brain’s anatomical structure relates to its function is one of the most important questions in neuroscience. It explores how physical components, such as neurons and their connections, give rise to complex behaviors and thoughts. A recent study of the brain of the tiny worm C. elegans provides a surprising answer: Structure alone doesn’t explain how the brain works.

C. elegans is often used in neuroscience research because, unlike the incredibly complex human brain with its billions of connections, the worm has a very simple nervous system of only 302 neurons. A complete, detailed map of every one of its connections, known as its brain wiring diagram or connectome, was completed several years ago, making it ideal for study.

In this research, scientists compared the worm’s physical brain wiring to its signaling network, that is, how signals actually travel from one neuron to another. First, they used an electron microscope to build a detailed map of the physical connections between its nerve cells. Then they activated individual neurons with light and used a technique called calcium imaging to observe which other neurons responded to the stimulation, producing a map of signal flow. Finally, they used computational tools to compare the physical wiring map and the signal-flow map, identifying areas of overlap and divergence.
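One way to picture that final step is as a comparison of two directed graphs defined over the same neurons: one whose edges come from the anatomical connectome, and one whose edges come from the light-stimulation experiments. The short sketch below is purely illustrative and is not the study’s actual analysis pipeline; the neuron pairs, response values and threshold are hypothetical.

```python
# Illustrative sketch: comparing an anatomical wiring map with a functional
# signal-flow map as two edge sets over the same neurons (hypothetical data).

anatomical_edges = {("AVA", "AVB"), ("AVB", "RIM"), ("RIM", "AVA"), ("AIY", "AIZ")}

# Functional map: stimulate each neuron with light, record calcium responses
# in the others, and keep only responses above a chosen threshold.
calcium_responses = {
    ("AVA", "AVB"): 0.9,  # functional response where a synapse also exists
    ("AVA", "RIM"): 0.7,  # response with no direct anatomical connection
    ("AIY", "AIZ"): 0.1,  # anatomical synapse but little functional signal
}
THRESHOLD = 0.5
functional_edges = {pair for pair, r in calcium_responses.items() if r > THRESHOLD}

# Quantify overlap and divergence between the two maps.
shared = anatomical_edges & functional_edges
wiring_only = anatomical_edges - functional_edges   # "streets with no traffic"
signal_only = functional_edges - anatomical_edges   # "shortcuts not on the map"
jaccard = len(shared) / len(anatomical_edges | functional_edges)

print("shared edges:", shared)
print("anatomy-only edges:", wiring_only)
print("function-only edges:", signal_only)
print(f"edge overlap (Jaccard index): {jaccard:.2f}")
```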

The team discovered that the brain’s functional organization differs from its anatomical structure. By analogy, the brain’s structure is like a city map showing every street, whereas its function is more akin to traffic flow, with jams, detours and shortcuts that are not visible on the map. In other words, brain activity does not always follow the predictable pathways of its physical wiring.

“Our results provide new insight into the interplay between brain structure, in the form of a complete synaptic-level connectome, and brain function, in the form of a system-wide causal signal propagation atlas,” wrote the researchers in their paper published in PRX Life. “Collectively, our findings suggest that the effective signaling network…has different network properties from the underlying connectome.”

While the physical and signal maps differed significantly, a few features were preserved. For example, the physical and signal wiring maps of the worm’s feeding organ (pharynx) look the same.

What it means for us

The research may well have been conducted in a tiny worm, but the findings have enormous implications for us. They suggest that scientists need to look beyond the brain’s wiring to fully understand how it works. This may help improve our understanding of neurological disorders like Alzheimer’s and schizophrenia, which involve an interruption in the brain’s ability to process information.

Key driver of pancreatic cancer spread identified

A Cornell-led study has revealed how a deadly form of pancreatic cancer enters the bloodstream, solving a long-standing mystery of how the disease spreads and identifying a promising target for therapy.

Pancreatic ductal adenocarcinoma is among the most lethal cancers, with fewer than 10% of patients surviving five years after diagnosis. Its microenvironment is a dense, fibrotic tissue that acts like armor around the tumor. This barrier makes drug delivery difficult and should, in theory, prevent the tumor from spreading. Yet the cancer metastasizes with striking efficiency—a paradox that has puzzled scientists.

New research published in the journal Molecular Cancer reveals that a receptor called ALK7 drives this spread by activating two interconnected pathways that work in tandem. One makes cancer cells more mobile through a process called epithelial-mesenchymal transition, and the other produces enzymes that physically break down blood vessel walls.

“In other words, ALK7 gives pancreatic cancer cells both the engine to move and the tools to invade,” said Esak Lee, lead author of the study and assistant professor in the Meinig School of Biomedical Engineering in Cornell Engineering.

The research helps resolve conflicting findings about ALK7, which some studies had linked to blocking cancer spread while others had tied it to driving it. Using mouse models of pancreatic cancer and advanced organ-on-chip systems that mimic human blood vessels, the researchers showed that blocking ALK7 significantly slowed metastasis.

The organ-on-chip system, developed in Lee’s lab, simulates the tumor microenvironment and is superior to animal models for studying different stages of the cancer. Using it, the researchers studied whether ALK7 drives the initial invasion of blood vessels or the later stage, when circulating tumor cells exit the bloodstream to form new tumors in organs such as the lungs or liver.

What they found was that cancer cells couldn’t enter blood vessels when ALK7 was inhibited. But when the researchers mimicked a later stage of the disease by placing the cells directly inside the vessels, the cells spread quickly even with ALK7 blocked, indicating that the timing of treatment is crucial.

“Once we miss this early opportunity to block ALK7 receptors, the cancer cells can freely circulate in the bloodstream and easily seed into other organs,” Lee said. “But if we can inhibit ALK7 at the cancer’s earliest and most vulnerable stage, we might see better outcomes for patients.”

The study also highlights the potential to apply organ-on-chip platforms to study other types of cancers, or how immune cells infiltrate and exit vessels.

“Some cancers have very different microenvironments so, potentially, ALK7 might show different impacts,” Lee said. “I hope this study really opens a new avenue for cancer research.”

Popular keto diet linked to glucose intolerance and fatty liver in mice

Avocado toast with fried cheese as the bread and zucchini noodles in butter-bacon sauce are among the many recipe ideas fueling social media’s beloved high-fat, low-carbohydrate ketogenic, or keto, diet. However, scientists have found that while keto can limit weight gain and even promote weight loss, it does so at the cost of metabolic problems such as glucose intolerance.

In a joint study by the University of Utah and the Utah Diabetes and Metabolism Research Center in the U.S., scientists divided mice into four dietary groups: a ketogenic diet (KD, 90% fat), a high-fat diet (HFD, 60% fat), a low-fat diet (LFD), and a low-fat, moderate-protein diet (LFMP), each with differing levels of carbohydrates and protein.

Male mice were allowed to eat freely for up to 36 weeks, and females for up to 44 weeks. Test results showed that while the KD supported weight control, it also raised blood cholesterol levels and led to fatty liver in males.

The findings are published in Science Advances.

The diet is named “ketogenic” because it sends the body into ketosis, a metabolic state in which the body burns fat as its primary fuel instead of the usual carbohydrates and, as a result, produces molecules called ketone bodies. The diet isn’t a new food trend; it has been around for nearly 100 years and is well established as a treatment for drug-resistant epilepsy in children, reducing seizures in many cases.

The exact way a KD helps control seizures is still unclear, but several ideas have been proposed. Some studies have suggested that KD can stabilize blood glucose (BG), which is beneficial to brain metabolism and neurotransmitter activity, while others highlight the anticonvulsant effects of ketone bodies themselves.

As ketogenic diets have exploded in popularity in recent years, the researchers behind this study wanted to explore the long-term metabolic effects of following the diet. Previous studies have examined this question, but often reported mixed findings, did not directly compare KDs with other diets, or did not account for both sexes.

For this study, the researchers included both male and female mice and carried out regular check-ins to assess the long-term effects of KD on health parameters, including body composition, organ health, and blood profile.

They found that the KD protected against excessive weight gain compared with the conventional high-fat diet, but KD-fed mice still gained more weight than those on low-fat diets. Long-term KD also caused severe hyperlipidemia, meaning there were very high levels of fat in the blood.

KD-fed mice developed severe glucose intolerance because the diet markedly impaired insulin secretion from the pancreas. And while female mice on the KD appeared largely unaffected, males showed signs of liver dysfunction and fatty liver.

The findings made it quite evident that long-term KD can trigger several metabolic disturbances, raising caution against its widespread use as a health-promoting diet.

The researchers note that further studies are needed to explore how variations in fat composition and macronutrient ratios affect metabolic effects. This is especially important for making the ketogenic diet safer for people who rely on it to treat epilepsy.

Drinking any amount of alcohol likely increases dementia risk

Drinking any amount of alcohol likely increases the risk of dementia, suggests the largest combined observational and genetic study to date, published in BMJ Evidence-Based Medicine.

Even light drinking—generally thought to be protective, based on observational studies—is unlikely to lower the risk, which rises in tandem with the quantity of alcohol consumed, the research indicates.

Current thinking suggests that there might be an “optimal dose” of alcohol for brain health, but most of the studies behind this idea focused on older people and/or didn’t differentiate between former drinkers and lifelong non-drinkers, complicating efforts to infer causality, note the researchers.

To circumvent these issues and strengthen the evidence base, the researchers applied observational and genetic methods (Mendelian randomization) to data from two large biobanks, covering the entire dose range of alcohol consumption. These were the US Million Veteran Program (MVP), which includes people of European, African, and Latin American ancestry, and the UK Biobank (UKB), whose participants are of predominantly European ancestry.

Participants who were aged 56–72 at baseline were monitored from recruitment until their first dementia diagnosis, death, or the date of last follow-up (December 2019 for MVP and January 2022 for UKB), whichever came first. The average monitoring period was four years for the US group, and 12 for the UK group.

Alcohol consumption was derived from questionnaire responses—over 90% of participants said they drank alcohol—and from the Alcohol Use Disorders Identification Test-Consumption (AUDIT-C), a clinical screening tool for hazardous drinking patterns, including the frequency of binge drinking (six or more drinks on one occasion).

In all, 559,559 participants from both groups were included in observational analyses, 14,540 of whom developed dementia of any type during the monitoring period: 10,564 in the US group; and 3,976 in the UK group. In total, 48,034 died: 28,738 in the US group and 19,296 in the UK group.

Observational analyses revealed U-shaped associations between alcohol and dementia risk: Compared with light drinkers (fewer than seven drinks a week), a 41% higher risk was observed both among non-drinkers and among heavy drinkers consuming 40 or more drinks a week, rising to a 51% higher risk among those who were alcohol dependent.

Mendelian randomization genetic analyses drew on data from multiple large genome-wide association studies (GWAS) of dementia, involving a total of 2.4 million participants, to ascertain lifetime (rather than current) genetically predicted risks.

Mendelian randomization leverages genetic data, minimizing the impact of other potentially influential factors, to estimate causal effects: genomic risk for a trait (in this case, alcohol consumption) essentially stands in for the trait itself.
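In its simplest single-variant form, the estimate is just a ratio of effects: the variant’s effect on the outcome (dementia) divided by its effect on the exposure (alcohol consumption), with many variants combined by inverse-variance weighting. The sketch below illustrates that arithmetic with made-up numbers; it is not the study’s analysis, which involves many additional checks (for example, for pleiotropy).

```python
import numpy as np

# Hypothetical per-variant GWAS estimates (not the study's data):
# beta_exposure = effect of each variant on alcohol consumption (drinks/week)
# beta_outcome  = effect of each variant on dementia (log odds)
# se_outcome    = standard error of the outcome effect
beta_exposure = np.array([0.10, 0.05, 0.08])
beta_outcome = np.array([0.015, 0.006, 0.013])
se_outcome = np.array([0.004, 0.003, 0.005])

# Wald ratio per variant: the causal effect implied by that single variant.
wald_ratios = beta_outcome / beta_exposure

# Inverse-variance-weighted (IVW) combination across variants.
weights = (beta_exposure / se_outcome) ** 2
ivw_estimate = np.sum(wald_ratios * weights) / np.sum(weights)

print("per-variant Wald ratios:", wald_ratios.round(3))
print(f"IVW estimate (log odds of dementia per weekly drink): {ivw_estimate:.3f}")
```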

Three genetic measures related to alcohol use were used as different exposures to study the impact of alcohol quantity, as well as problematic and dependent drinking on dementia risk.

These exposures were self-reported weekly drinks (641 independent genetic variants); problematic “risky” drinking (80 genetic variants); and alcohol dependency (66 genetic variants).

Higher genetic risk for all three exposures was associated with an increased risk of dementia, with dementia risk rising linearly as genetically predicted alcohol consumption increased.

For example, an extra 1–3 drinks a week was associated with a 15% higher risk. A doubling in the genetic risk of alcohol dependency was associated with a 16% increase in dementia risk.

But no U-shaped association was found between alcohol intake and dementia, and no protective effects of low levels of alcohol intake were observed. Instead, dementia risk steadily increased with more genetically predicted drinking.

Additionally, those who went on to develop dementia typically drank less over time in the years preceding their diagnosis, suggesting that reverse causation—whereby early cognitive decline leads to reduced alcohol consumption—underlies the supposed protective effects of alcohol found in previous observational studies, say the researchers.

They acknowledge that a principal limitation of their findings is that the strongest statistical associations were found in people of European ancestry, because of the numbers of participants of this ethnic heritage studied. Mendelian randomization also relies on assumptions that can’t be verified, they add.

Nevertheless, they suggest that their findings “challenge the notion that low levels of alcohol are neuroprotective.”

They conclude, “Our study findings support a detrimental effect of all types of alcohol consumption on dementia risk, with no evidence supporting the previously suggested protective effect of moderate drinking.

“The pattern of reduced alcohol use before dementia diagnosis observed in our study underscores the complexity of inferring causality from observational data, especially in aging populations.

“Our findings highlight the importance of considering reverse causation and residual confounding in studies of alcohol and dementia, and they suggest that reducing alcohol consumption may be an important strategy for dementia prevention.”

A single blood test can measure how different parts of our bodies are aging

Our bodies do not age at a uniform rate. The speed at which one organ ages can differ greatly from another due to a range of lifestyle, environmental and genetic factors. Conventional tests provide a single number for overall biological age, but they do not indicate which parts of the body are aging the fastest.

Now, scientists have developed a single blood test that can measure the aging of 11 different physiological systems, offering a more detailed and personalized look at how our bodies are changing over time.

For years, scientists have relied on epigenetic clocks, which use chemical (methylation) changes to DNA in the blood to estimate biological age. These clocks give more insight than chronological age alone but are still not very precise. In a paper published in the journal Nature Aging, scientists describe how they set out to develop a new test for the aging status of different parts of the body.

A more precise look at biological aging

First, the team collected vast amounts of health data, including medical histories, blood tests, grip strength and other physical measurements, from about 7,500 people. They looked for clear links between blood-based biomarkers, such as cholesterol and blood sugar levels, and age-related conditions in several organs.

Then the team used machine learning to connect these findings with DNA methylation patterns from a single blood sample. DNA methylation is a process that adds or removes chemical tags on DNA, helping to switch genes on and off, and these patterns change over time. By training a computer to recognize the patterns, the scientists created the Systems Age test, which calculates a separate biological-age score for each of 11 physiological systems, including the heart, lungs, brain, and the metabolic and immune systems.
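Conceptually, building such a test is a supervised learning problem: methylation levels at many CpG sites are the inputs, and a system-specific aging measure derived from clinical biomarkers is the training target, fitted once per physiological system. The sketch below is a minimal illustration of that idea with a penalized linear model on synthetic data; it is not the authors’ actual Systems Age pipeline, and every name and number in it is invented.

```python
import numpy as np
from sklearn.linear_model import ElasticNetCV

rng = np.random.default_rng(0)

# Synthetic stand-ins: 500 people x 2,000 CpG methylation fractions, and a
# "cardiovascular aging" target that here depends on a few CpGs plus noise
# (in the real study the target is built from clinical biomarkers).
n_people, n_cpgs = 500, 2000
methylation = rng.uniform(0.0, 1.0, size=(n_people, n_cpgs))
true_weights = np.zeros(n_cpgs)
true_weights[:30] = rng.normal(0.0, 1.0, 30)  # only a handful of sites matter
cardio_aging_target = methylation @ true_weights + rng.normal(0.0, 0.5, n_people)

# Fit a sparse penalized regression, the usual form of an epigenetic clock.
model = ElasticNetCV(l1_ratio=0.5, cv=5, random_state=0)
model.fit(methylation, cardio_aging_target)

# Applying the fitted model to a new blood sample yields that person's score;
# repeating the process per system would give the 11 Systems Age scores.
new_sample = rng.uniform(0.0, 1.0, size=(1, n_cpgs))
print("predicted cardiovascular aging score:", round(model.predict(new_sample)[0], 2))
```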

After training their model, the researchers tested it on blood samples of more than 8,000 people. They found that it was better at predicting specific diseases and aging-related conditions than the single-number biological age scores from older epigenetic clocks. For example, the test’s heart score was a stronger predictor of heart disease than a standard biological age test.

The researchers also found that individuals with the same overall Systems Age could have very different patterns of aging across their physiological systems. Knowing this could help doctors provide more targeted treatments to individuals.

“By providing system-specific scores, Systems Age may better pinpoint which age-related conditions individuals are at risk for,” wrote Morgan Levine, one of the authors of the study.

For example, if your immune system is aging faster than your lungs or heart, your doctor might recommend specific strategies to boost your immunity. Or if your heart score is high, a doctor could suggest lifestyle changes or medications to target cardiovascular health.

Our actions are dictated by ‘autopilot’, not choice, finds study

Habit, not conscious choice, drives most of our actions, according to new research from the University of Surrey, University of South Carolina and Central Queensland University.

The research, published in Psychology & Health, found that two-thirds of our daily behaviors are initiated “on autopilot”, out of habit.

Habits are actions that we are automatically prompted to do when we encounter everyday settings, due to associations that we have learned between those settings and our usual responses to them.

The research also found that 46% of behaviors were both triggered by habit and aligned with conscious intentions, suggesting that people form habits that support their personal goals, and often disrupt habits that conflict with them.

While there have been previous attempts to estimate the prevalence of habits in our day-to-day lives, this study used a new method to capture habits in action. The international research team surveyed 105 participants from the UK and Australia, sending six random prompts to their phones each day for a week, asking them to describe what they were currently doing, and whether it was triggered out of habit or done intentionally.

The study found that 65% of daily behaviors were habitually initiated, meaning people were prompted to do them out of routine rather than making a conscious decision.

Professor Benjamin Gardner, Professor of Psychology at the University of Surrey and co-author of the study, said, “Our research shows that while people may consciously want to do something, the actual initiation and performance of that behavior is often done without thinking, driven by non-conscious habits. This suggests that ‘good’ habits may be a powerful way to make our goals a reality.

“For people who want to break their bad habits, simply telling them to ‘try harder’ isn’t enough. To create lasting change, we must incorporate strategies to help people recognize and disrupt their unwanted habits, and ideally form positive new ones in their place.”

The findings may have broader implications for public health and wellness interventions. The researchers recommend that initiatives designed to help people adopt new behaviors, like exercising or eating healthier, should focus on building new, positive habits.

For example, for someone trying to take up exercise, exercising sporadically may not be enough to form a habit. The most effective strategy would involve identifying an everyday situation in which exercise can realistically be done—for example, at a certain time of day, or following a regular event, like leaving work—and consistently doing some exercise in that situation.

Likewise, to break a bad habit like smoking, simply wanting to quit may not cut it. The most effective strategies would involve disrupting triggers (e.g. avoiding places where they used to smoke) and creating new routines (e.g. chewing gum after a meal instead of having a cigarette).

Dr. Amanda Rebar, Associate Professor at the University of South Carolina and lead author of the study said, “People like to think of themselves as rational decision makers, who think carefully about what to do before they do it.

“However, much of our repetitive behavior is undertaken with minimal forethought and is instead generated automatically, by habit.”

Dr. Grace Vincent, Sleep Scientist and Associate Professor at Central Queensland University and co-author of the study, said the findings offer hope to people trying to develop healthier lifestyles. “Our study shows that two-thirds of what people do each day is sparked by habit, and most of the time those habits are also aligned with our intentions.

“This means that if we set out to create a positive habit, whether that’s around better sleep hygiene, or nutrition, or general well-being improvements, we can rely on an internal ‘autopilot’ to take over and help us maintain those habits.

“Unfortunately, not all habits are created equal. Exercise was the exception in our findings, as it was often triggered by habit, but less likely to be achieved purely on autopilot, compared to other behaviors.”

Novo Nordisk study finds greater waist reduction with higher semaglutide dosage

A Novo Nordisk-designed trial, conducted at 68 international hospitals, specialist clinics, and medical centers with investigators from the University of Texas Southwestern Medical Center and other institutions, reports greater reductions in bodyweight, waist circumference, and HbA1c with once-weekly semaglutide 7.2 mg than with placebo over 72 weeks.

Obesity contributes to the development and progression of type 2 diabetes, raising risks for dyslipidemia, hypertension, and cardiovascular disease. GLP-1 receptor agonists were designed to reduce blood sugar (HbA1c) and body weight in adults with type 2 diabetes.

Regulators in the U.S. and EU currently authorize a 2.4 mg once-weekly semaglutide dose (such as Wegovy) for weight management in adults with obesity or overweight with at least one complication.

In the study, titled “Once-weekly semaglutide 7.2 mg in adults with obesity and type 2 diabetes (STEP UP T2D): a randomised, controlled, phase 3b trial,” published in The Lancet Diabetes & Endocrinology, Novo Nordisk researchers designed a randomized, double-blind, controlled, three-arm, parallel-group trial to investigate the efficacy and safety of a 7.2 mg maintenance dose of once-weekly subcutaneous semaglutide in adults with obesity and type 2 diabetes.

Enrollment reached 512 adults at 68 hospitals, specialist clinics, and medical centers across Bulgaria, Canada, Hungary, Poland, Portugal, Slovakia, South Africa, and the U.S. The baseline profile included a mean age of 56 years, mean bodyweight 110.1 kg, mean BMI 38.6 kg/m², mean HbA1c 8.1%, and 51.8% female.

Randomization assigned participants 3:1:1 to a once-weekly dose of semaglutide at 7.2 mg or 2.4 mg, or to a placebo in a double-blind, three-arm, parallel-group design. Treatment ran 72 weeks with counseling targeting an approximately 500 kcal-per-day deficit and at least 150 minutes per week of physical activity. Dosing started at 0.25 mg weekly and escalated every four weeks to 2.4 mg by week 16, with the 7.2 mg arm advancing to the active dose at week 20.

Co-primary endpoints were percentage change in bodyweight and achievement of at least 5% bodyweight reduction versus placebo. Confirmatory secondary endpoints included 10%, 15%, and 20% weight-loss thresholds, waist circumference, and HbA1c.

Semaglutide reduced mean body weight by 13.2% at the 7.2 mg dose, versus 10.4% at 2.4 mg and 3.9% with placebo, over 72 weeks. While the 7.2 mg group met the co-primary endpoint of at least a 5% bodyweight reduction versus placebo, so did the already approved 2.4 mg dose under the exploratory comparisons.

Waist circumference declined by a mean of 12.3 cm with 7.2 mg dosing, versus 10.7 cm with 2.4 mg and 5.8 cm with placebo. The difference in HbA1c reduction between the 7.2 mg and 2.4 mg groups was not statistically significant, and both groups lowered HbA1c more than placebo.

Gastrointestinal events occurred in 53.1% with 7.2 mg doses, 51.5% with 2.4 mg doses, and 25.5% with placebo, with serious adverse events in 9.1%, 8.7%, and 8.8%. Dysesthesia, an unpleasant or painful sensation, was more frequent with 7.2 mg doses at 18.9% versus 4.9% with 2.4 mg doses and 0% with placebo. Adverse events leading to dose reduction were higher in the 7.2 mg group (20.2%) versus the 2.4 mg group (13.6%) and placebo (2.0%). There was one death possibly related to the study drug at the 7.2 mg dose.

Exploratory analyses suggested greater mean bodyweight reduction in the 7.2 mg group than in the 2.4 mg group, with an estimated treatment difference of −2.8% (95% CI −4.7 to −0.9). The study did not assess how much of the weight loss came from fat versus lean mass (muscle, organ and bone tissue).

The authors conclude that semaglutide at a dose of 7.2 mg was superior to placebo for reducing bodyweight, waist circumference, and HbA1c in adults with obesity and type 2 diabetes. Safety and tolerability, within the limited scope of the trial, were similar to those of the 2.4 mg group, apart from a higher frequency of dysesthesia, a higher rate of dose reductions, and the one death possibly related to treatment. The authors state that tripling the dose from 2.4 mg to 7.2 mg might provide additional clinical benefit in bodyweight reduction and suggest further exploration.

Obviously, any drug company-funded and designed study that suggests benefits from a tripling of the approved dosage of their product should be met with an equally increased dosage of skepticism.

An independent inquiry into the results could more rigorously compare the triple dose with the original dose and investigate safety and tolerability not comprehensively characterized in this study, specifically the ratio of weight lost from fat versus lean mass, as well as pancreatic events, liver-enzyme monitoring, arrhythmia surveillance, and retinopathy surveillance.

Largest genetic study to date identifies 13 new DNA regions linked to dyslexia

Dyslexia is a neurodevelopmental condition estimated to affect 5–10% of people in most countries, irrespective of educational and cultural background. Dyslexic individuals experience persistent difficulties with reading and writing, often struggling to identify words and spell them correctly.

Past studies with twins suggest that dyslexia is in large part heritable, meaning that its emergence is partly influenced by genetic factors inherited from parents and grandparents. However, the exact genetic variants (i.e., small differences in DNA sequences) linked to dyslexia have not yet been clearly delineated.

Researchers at the University of Edinburgh, the Max Planck Institute for Psycholinguistics and various other institutes recently carried out the largest genome-wide association study to date exploring the genetic underpinnings of dyslexia. Their paper, published in Translational Psychiatry, identifies several previously unknown genetic loci linked to an increased likelihood of experiencing dyslexia.

“This research was motivated by the longstanding challenge of identifying the genetic basis of dyslexia—a common and often inherited learning difference, characterized by difficulties with reading, spelling or writing,” Hayley Mountford, Research Fellow at University of Edinburgh’s School of Psychology, told Medical Xpress.

“Although prior studies had revealed some genetic associations, research into dyslexia is still far behind that of autism or attention deficit hyperactivity disorder (ADHD), and the biological mechanisms remained unclear. The recent availability of summary statistics from two large genome-wide association studies (GWAS) allowed us to combine them in a more powerful meta-analysis.”

The primary goals of the recent study by Mountford and her colleagues were to uncover new genes linked with dyslexia, while also gaining new insight into the biological basis of differences in reading ability. In addition, the researchers wished to explore whether a person’s genetic score (also known as a polygenic index) could predict reading difficulties.

As part of their study, they also tried to determine whether dyslexia-associated genetic variants show signs of recent evolutionary selection. Finally, the team hoped to reduce the stigma associated with dyslexia by shedding new light on its underlying biological processes.

“We started by bringing together two large genetic datasets from previous studies: one from the GenLang Consortium, which includes detailed reading ability test data, and another from 23andMe, which included more than 50,000 people reporting a dyslexia diagnosis. In total, we analyzed genetic data from more than 1.2 million people,” explained Mountford.

“We used a method called MTAG (Multi-Trait Analysis of GWAS), which allows you to jointly analyze related traits, such as reading ability and dyslexia diagnosis, to detect more genetic associations than could be found by analyzing them separately.”
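MTAG itself models the genetic correlation between the traits and corrects for possible sample overlap, but the underlying intuition is that evidence for a variant can be pooled across related GWAS. The toy sketch below illustrates that intuition with a simple sample-size-weighted combination of z-scores; it is a deliberate simplification, not the MTAG method, and the numbers are invented.

```python
import numpy as np
from scipy.stats import norm

# Hypothetical z-scores for one variant from two related GWAS:
# a reading-ability GWAS and a dyslexia-diagnosis GWAS (invented values).
z_reading, n_reading = 3.1, 34_000
z_dyslexia, n_dyslexia = 3.4, 1_100_000

# Sample-size-weighted (Stouffer-style) combination of the two signals.
w1, w2 = np.sqrt(n_reading), np.sqrt(n_dyslexia)
z_combined = (w1 * z_reading + w2 * z_dyslexia) / np.sqrt(w1**2 + w2**2)

# Pooling evidence yields a smaller p-value than either study alone.
p_reading = 2 * norm.sf(abs(z_reading))
p_dyslexia = 2 * norm.sf(abs(z_dyslexia))
p_combined = 2 * norm.sf(abs(z_combined))
print(f"reading-only p = {p_reading:.1e}, dyslexia-only p = {p_dyslexia:.1e}")
print(f"combined p = {p_combined:.1e}")
```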

After they identified the associated genetic variants, the researchers tried to better understand their contribution to the biology of dyslexia using bioinformatic tools. In addition, they created a polygenic index, which estimates the genetic risk that an individual will develop a specific condition.
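A polygenic index is, at heart, a weighted sum: for each person, the number of risk alleles carried at each associated variant is multiplied by that variant’s estimated effect size from the GWAS, and the totals are standardized across the sample. The sketch below illustrates the calculation with invented genotypes and weights; it is not the study’s actual scoring procedure.

```python
import numpy as np

# Invented example: genotypes (0, 1 or 2 copies of the risk allele) for four
# people at five dyslexia-associated variants, plus per-allele effect sizes.
genotypes = np.array([
    [0, 1, 2, 1, 0],
    [2, 2, 1, 0, 1],
    [1, 0, 0, 1, 1],
    [0, 0, 1, 2, 2],
])
effect_sizes = np.array([0.04, 0.02, 0.05, 0.01, 0.03])  # made-up GWAS weights

# Weighted allele counts, standardized so scores are comparable across people.
raw_scores = genotypes @ effect_sizes
polygenic_index = (raw_scores - raw_scores.mean()) / raw_scores.std()
print("standardized polygenic index per person:", polygenic_index.round(2))
```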

Mountford and her colleagues subsequently tried to determine how well these estimates predicted a group of children’s reading performance. Finally, they examined ancient DNA collected over the past 15,000 years to explore how these genes have evolved over time.

“Our study represents the largest and most powerful genetic analysis of dyslexia to date,” said Mountford. “We identified 80 regions associated with dyslexia, including 36 regions which were not previously reported as significant. Of these 36 regions, 13 were entirely novel with no prior suggestive association with dyslexia. This significantly expands our understanding of the genetic architecture of reading-related traits.”

Interestingly, the researchers found that many of the genes they uncovered are active in brain regions that are known to still be developing at the early stages of life. In addition, they appeared to support signaling and the establishment of communication points (i.e., synapses) between neurons.

“Our polygenic index was able to explain up to 4.7% of the variance in reading ability in an independent sample, and while modest, this is a meaningful step toward potential early identification of reading difficulties,” said Mountford. “We found no evidence of recent evolutionary selection for or against dyslexia associated genes, suggesting it has not been affected by any major social or societal changes that have taken place in the past 15,000 years in northern Europe.”

The recent work by Mountford and her colleagues greatly contributes to the understanding of dyslexia and reading abilities in general, shedding new light on their biological underpinnings. The researchers were able to identify 13 new genetic loci linked to dyslexia, which were implicated in early brain development processes.

As part of their next studies, they plan to conduct cross-trait genetic analyses. These analyses would, for instance, allow them to learn whether some dyslexia-related genes overlap with genes associated with ADHD, language impairments and other neurodevelopmental conditions, while also identifying condition-specific genes.

“We also plan to explore how genetic risk for dyslexia influences outcomes across the lifespan, including education, career, and mental health,” added Mountford.

“Concurrently, we will try to enhance polygenic scores by incorporating more diverse samples and integrating environmental factors like early education and home literacy environments. Finally, we plan to conduct a follow-up study investigating how the newly identified genes influence brain development, using cellular models and imaging genetics.”

Alpha cells moonlight as secret GLP-1 factories: Hormone shift could transform type 2 diabetes treatment

A new study from Duke University School of Medicine is challenging long-standing views on blood sugar regulation—and pointing to a surprising new ally in the fight against type 2 diabetes.

Published in Science Advances, the research reveals that pancreatic alpha cells, once thought to only produce glucagon—a hormone that raises blood sugar to maintain energy when fasting or exercising—also generate GLP-1, a powerful hormone that boosts insulin and helps regulate glucose. GLP-1 is the same hormone mimicked by blockbuster drugs like Ozempic and Mounjaro.

Using mass spectrometry, Duke researchers found that human alpha cells may naturally produce far more bioactive GLP-1 than previously believed.

Led by Duke scientist Jonathan Campbell, Ph.D., the team of obesity and diabetes researchers analyzed pancreatic tissue from both mice and humans across a range of ages, body weights, and diabetes statuses. They found that human pancreatic tissue produces much higher levels of bioactive GLP-1 and that this production is directly linked to insulin secretion.

“This research shows that alpha cells are more flexible than we imagined,” said Campbell, an associate professor in the Division of Endocrinology in the Department of Medicine and a member of the Duke Molecular Physiology Institute. “They can adjust their hormone output to support beta cells and maintain blood sugar balance.”

This flexibility could change how we think about treating type 2 diabetes, in which beta cells in the pancreas can’t make enough insulin to keep blood sugar at a healthy level. Boosting the body’s own GLP-1 production may offer a more natural way to support insulin and manage blood sugar.

Switching gears

In mouse studies, when scientists blocked glucagon production, they expected insulin levels to drop. Instead, alpha cells switched gears—ramping up GLP-1 production, improving glucose control, and triggering stronger insulin release.

“We thought that removing glucagon would impair insulin secretion by disrupting alpha-to-beta cell signaling,” Campbell said. “Instead, it improved it. GLP-1 took over, and it turns out, it’s an even better stimulator of insulin than glucagon.”

To test this further, researchers manipulated two enzymes: PC2, which drives glucagon production, and PC1, which produces GLP-1. Blocking PC2 boosted PC1 activity and improved glucose control. But when both enzymes were removed, insulin secretion dropped and blood sugar spiked—confirming the critical role of GLP-1.

Implications for diabetes treatment

While GLP-1 is typically made in the gut, the study confirms that alpha cells in the pancreas can also release GLP-1 into the bloodstream after eating, helping to lower blood sugar by increasing insulin and reducing glucagon levels.

Common metabolic stressors, like a high-fat diet, can increase GLP-1 production in alpha cells—but only modestly. That opens the door to future research: If scientists can find ways to safely boost GLP-1 output from alpha cells, they may be able to naturally enhance insulin secretion in people with diabetes.

But measuring GLP-1 accurately hasn’t been easy. The team developed a high-specificity mass spectrometry assay that detects only the bioactive form of GLP-1—the version that actually stimulates insulin—not the inactive fragments that often muddy results.

“This discovery shows that the body has a built-in backup plan,” Campbell said. “GLP-1 is simply a much more powerful signal for beta cells than glucagon. The ability to switch from glucagon to GLP-1 in times of metabolic stress may be a critical way the body maintains blood sugar control.”

Inhaling cannabis may greatly increase your risk of getting asthma

If you’re looking to reduce your chances of developing lung disease, say experts at UC San Francisco, then it may be smart to avoid inhaling cannabis.

A new study in the Journal of General Internal Medicine found that inhaling marijuana every day is associated with a 44% increased chance of developing asthma. It was also associated with 27% higher odds of developing chronic obstructive pulmonary disease (COPD).

The COPD risk may be understated, since the disease takes decades to develop, and the researchers did not have detailed information on how long people in the study had been using cannabis.

In defining the concept of inhaling, the researchers included smoking, vaping, and so-called “dabbing,” which involves breathing in the vapors of concentrated marijuana.

The study found that each of these forms of cannabis inhalation was associated with elevated risks to the lungs, even among people who had never smoked cigarettes.

For this group, inhaling marijuana every day was linked to a 51% increased likelihood of developing asthma. The association with COPD was also elevated, but it was not statistically significant.

The study is the largest yet to examine the association between inhaling cannabis and risks to respiratory health among people who have not smoked cigarettes. Of the 380,000 adult participants, nearly 222,000 had never smoked tobacco. The data comes from the Behavioral Risk Factor Surveillance System, a national survey by the Centers for Disease Control and Prevention (CDC).

Experts say the broad legalization of marijuana across much of the country, together with the perception that it is healthier than tobacco, has led people to minimize the risks.

“The message about smoking tobacco being bad for you has gotten out there, but for cannabis, it’s much less clear,” said Alison Rustagi, MD, Ph.D., assistant professor at UCSF and first author of the paper.

“If people are looking to reduce their likelihood of developing a chronic lung disease, they should not start using cannabis,” she said. “And if they already smoke cannabis, they should do it less often.”
