A major pregnancy scare collapses: Tylenol shows no autism risk in more than 1.5 million children

Acetaminophen, which also goes by names like paracetamol or Tylenol, is a common over-the-counter pain reliever and fever reducer. It is often recommended during pregnancy to help with mild to moderate pain. Recently, its safety has become the subject of intense public debate, with claims that taking acetaminophen during pregnancy may increase the likelihood of autism in children.

A large study from Denmark adds clarity to this debate, finding no increased risk of autism in children exposed to acetaminophen before birth. These results were consistent across both the general population analysis and sibling-matched comparisons, and did not vary with the timing or dosage of exposure. The findings are published in JAMA Pediatrics.

Risk or no risk

Public concern over whether taking acetaminophen during pregnancy could increase autism risk quickly moved into the spotlight—dominating news channels, newspapers, and social media. The debate became so intense that in September 2025, the U.S. Food and Drug Administration advised clinicians to consider limiting its use for routine low-grade fevers during pregnancy.

Yet the scientific evidence remained conflicted—some studies have reported a small risk, while others have found no link at all. A large study from Sweden initially reported a slight rise in autism risk when analyzing the general population. However, when the same researchers compared siblings within the same families—a method that accounts for shared genetic and environmental factors—the apparent link disappeared.

A sibling-matched analysis is a research method that scientists use to compare siblings from the same family to better understand how a particular exposure affects an outcome. Since siblings share many of the same genes and a common home environment, this approach helps account for hidden factors that might otherwise influence the results.

In search of more definitive evidence, the researchers carried out a nationwide cohort study in Denmark. They examined all children born from single pregnancies between 1997 and 2022, using official national health records to follow more than 1.5 million children over time. The analysis focused only on children who were alive at age one and excluded cases with missing data or conditions already known to be linked to autism.

The researchers found that taking acetaminophen during pregnancy is not linked to an increased risk of autism in children. The findings were the same in both the general-population and sibling comparisons. The adjusted hazard ratio—a measure of relative risk between two groups—was close to 1, indicating no increased risk.
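To make the hazard ratio concept concrete, here is a toy calculation with entirely hypothetical numbers (not the study's data). A true hazard ratio comes from survival analysis; an incidence-rate ratio is used here as a simple stand-in for illustration:

```python
# Toy illustration of "a ratio close to 1 means no increased risk".
# All numbers below are hypothetical, not taken from the Danish study.

def rate_ratio(exposed_cases, exposed_person_years,
               unexposed_cases, unexposed_person_years):
    """Incidence-rate ratio: a simple stand-in for a hazard ratio.

    Compares new diagnoses per unit of follow-up time between an
    exposed group and an unexposed group.
    """
    exposed_rate = exposed_cases / exposed_person_years
    unexposed_rate = unexposed_cases / unexposed_person_years
    return exposed_rate / unexposed_rate

# Hypothetical cohort with roughly equal diagnosis rates in both groups:
hr = rate_ratio(exposed_cases=300, exposed_person_years=200_000,
                unexposed_cases=1_480, unexposed_person_years=1_000_000)
print(round(hr, 2))  # → 1.01, i.e., essentially no difference in risk
```

A ratio meaningfully above 1 would indicate elevated risk in the exposed group; values hovering around 1, as reported here, indicate none.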

The study closely examined both the amount of medication taken—low, medium, or high—and the stage of pregnancy during which it was used, covering each trimester. Across all these variations, the team found no evidence of an increased risk.

These findings could help ease anxiety among parents while also giving medical practitioners the evidence they need to clearly explain to patients and support their informed decision-making.

Your phone already sees the warning signs: Sleep, movement and mood data can spot depression early

Depression is among the most widespread mental health disorders worldwide, affecting an estimated 1 in 20 people. It is characterized by persistent sadness, hopelessness, disrupted sleep patterns, changes in appetite and a loss of interest in everyday activities.

While there are now various treatments for depression, including different types of antidepressant medications and psychotherapeutic approaches, not all depressed individuals have access to these resources or benefit from them. Reliably detecting the first signs of depression could be highly advantageous, as it could ultimately allow mental health services to intervene early, before symptoms worsen and the disorder becomes debilitating.

The analysis of data collected by smartphones, smartwatches and other wearable devices could help detect some early signs of depression, such as lowered mood, increased stress levels and behavioral changes. While various past studies explored the potential of mobile technologies for the early detection of depressive symptoms, the factors influencing the effectiveness of these tools remain poorly understood.

Researchers at Ghent University recently set out to better understand what contributes to the effectiveness of these technology-based solutions, by reviewing earlier papers that assessed their potential. The team’s review paper, published in Nature Mental Health, pinpoints types of data that are particularly helpful for detecting signs of depression, while also identifying computational models that appear to be the most effective for this specific application.

“Early detection of depressive symptom changes is vital for timely interventions,” Yannick Vander Zwalmen and Matthias Maerevoet wrote in their paper. “Mobile and wearable technologies enable continuous, unobtrusive monitoring of behavioral, psychological and physiological data, offering new possibilities for digital phenotyping and just-in-time prediction of depression. This scoping review synthesized findings from 52 studies to identify commonly used features, evaluate their predictive value and assess methodological approaches.”

Using smartphone data to predict mood changes

Vander Zwalmen, Maerevoet and their colleagues reviewed 52 past studies on predicting early signs of depression, each of which collected data using smartphones or wearable devices and then analyzed it with computational models.

The data collected included movement and location information, sleep patterns, physical activity patterns, communication patterns (i.e., how many calls users made and how many messages they sent or received), heart rate variability (HRV) and self-reported mood ratings. By reviewing the findings of earlier studies, the team tried to identify the data patterns that were most closely linked to early symptoms of depression.

“Features such as time spent at home, sleep variability and reduced mobility were strongly associated with depressive symptoms,” wrote the authors. “Combining physiological, behavioral and self-report data enhanced predictive performance. Personalized models and anomaly detection approaches outperformed generalized ones in predicting individual symptom changes.”

The researchers’ analyses revealed that depression symptoms were typically linked with irregular sleep patterns, a reduction in movement, little physical activity and a self-reported bad mood. In addition, models that were adjusted to consider a user’s unique habits and average biological signals appeared to predict early signs of depression better than general models.
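The personalized approach described above—flagging deviations from an individual's own habits rather than from a population average—can be sketched as a simple per-user anomaly score. This is an illustrative sketch, not code from the reviewed studies; the function name and the sample sleep values are hypothetical:

```python
import statistics

def personal_anomaly_score(history, today):
    """Z-score of today's value against this user's own baseline.

    `history` is a list of the user's past daily values (e.g., hours of
    sleep or time spent at home). A large absolute z-score flags a
    deviation from that individual's habits, which personalized models
    treat as a potential early warning sign.
    """
    baseline = statistics.mean(history)
    spread = statistics.stdev(history)
    return (today - baseline) / spread

# Hypothetical user who usually sleeps about 7.5 hours per night:
sleep_history = [7.2, 7.6, 7.4, 7.8, 7.5, 7.3, 7.7]
z = personal_anomaly_score(sleep_history, today=4.9)
# A strongly negative z-score flags an unusually short night for
# *this* user, even if 4.9 hours would be normal for someone else.
```

A generalized model would compare everyone against one population-wide threshold; the per-user baseline is what makes anomaly detection sensitive to individual symptom changes.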

Towards better mental health monitoring tools

Overall, this review study confirmed the potential of data collected by portable and wearable devices for the prediction of early depressive symptoms. In the future, it could guide the development of new mental health apps or other technological tools that detect signs of depression and share useful resources or the contacts of local mental health services with users.

“Mobile and wearable data show strong potential for just-in-time depression prediction,” wrote Vander Zwalmen, Maerevoet and their colleagues. “Future research should emphasize new features, diverse populations and personalized models to improve accuracy and real-world applicability.”

Agrovoltaic systems can save water, generate energy and make tomato cultivation more sustainable at the same time

Researchers from the University of Seville (US) and the Polytechnic University of Madrid (UPM) have demonstrated that it is possible to grow tomatoes and generate solar energy simultaneously, a key strategy for tackling global water scarcity. The study, carried out in Madrid and Seville during the spring of 2024, evaluated the use of agrovoltaic systems and regulated deficit irrigation to optimize water resources in tomato cultivation. The results show that, although using less water reduces the volume of the harvest, the overall outcome is a more efficient and sustainable process.

This innovative combination aims to reduce the plants’ evaporative demand through the shade provided by photovoltaic panels, enabling a more efficient use of land and water. The research compared three irrigation methods: a control group with full irrigation, a regulated deficit irrigation (RDI) system based on the plant’s water status, and an agrovoltaic (AG) system that applied the same water restriction under solar panels. The study measured variables such as leaf water potential and gas exchange to monitor plant stress at different growth stages. The results indicate that, although the shade from the panels reduces available radiation, the design of the system permits adequate plant development to be maintained at most stages of the crop cycle.

One of the most notable findings is that the deficit irrigation strategy reduced water consumption by approximately 50% compared to traditional irrigation. However, this drastic reduction in water led to a yield decrease of around 20% in the RDI treatment, attributed mainly to severe water stress conditions during the ripening phase. Despite this drop in total tomato production, irrigation water productivity increased significantly in the Seville treatments, demonstrating that more fruit can be obtained for every drop of water invested.

Furthermore, the overall success of the agrovoltaic system was validated by the Land Equivalent Ratio (LER), which combines the efficiency of agricultural and electricity production. The values obtained—1.54 in Madrid and 1.67 in Seville—confirm that combined production is far more efficient than growing tomatoes and generating energy on separate plots. This implies that, although tomato yield decreases under the panels, the system’s profitability and sustainability increase thanks to the generation of clean energy in the same space.
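The Land Equivalent Ratio mentioned above can be sketched as a short calculation. The yield and energy fractions below are illustrative assumptions chosen to show how a value like the study's 1.67 can arise, not the researchers' raw data:

```python
# Sketch of the Land Equivalent Ratio (LER) used to evaluate
# agrovoltaic systems: the relative land area that separate
# monocultures would need to match the combined system's output.

def land_equivalent_ratio(crop_combined, crop_mono,
                          energy_combined, energy_mono):
    """LER = relative crop yield + relative energy yield.

    Values above 1 mean the combined system out-produces the same
    land split between a farming-only and a solar-only plot.
    """
    return crop_combined / crop_mono + energy_combined / energy_mono

# Hypothetical case: panel shade cuts tomato yield by ~20%, while the
# panels still deliver ~87% of a dedicated solar plant's output.
ler = land_equivalent_ratio(crop_combined=0.80, crop_mono=1.0,
                            energy_combined=0.87, energy_mono=1.0)
print(round(ler, 2))  # → 1.67, i.e., ~67% more land-efficient combined
```

Any LER above 1 favors the combined system, which is why the reported values of 1.54 and 1.67 support dual land use despite the yield penalty under the panels.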

In conclusion, the study highlights that agrovoltaics is a promising tool for the agriculture of the future, although it requires more precise irrigation management to avoid excessive stress. The researchers suggest that combining plant measurements with soil moisture sensors could further optimize these systems. This advance points to the sustainable dual use of land, offering a viable solution to the challenges of climate change and the energy transition.

The results are published in Agricultural Water Management.

From lockdown to the lab: Researcher develops ‘decoy molecule’ to slow down coronavirus

While the Netherlands was in lockdown because of the coronavirus, Ph.D. candidate Koen Rijpkema began his research into the same virus. In the lab, he developed molecules that can inhibit an important viral enzyme.

Rijpkema started his Ph.D. in the middle of the pandemic, complete with lockdowns and curfews. “I lived with seven other people, plus visiting partners. At one point, I was in quarantine more often than not,” he says. Working from home was hardly an option, because his research depends heavily on lab experiments.

The coronavirus did, however, offer a highly relevant research topic. “I really wanted to do a lot of synthesis: designing and making new molecules, ideally with the same supervisor I had during my master’s project. He suggested looking for molecules that could inhibit the coronavirus. It was a timely and meaningful project.”

How to trick a virus

Rijpkema focused on a specific part of the coronavirus: an enzyme that suppresses the immune system. Normally, the immune system responds to a virus by releasing signaling molecules that “raise the alarm” in the body. But this viral enzyme—called Mac1—removes part of such a signaling molecule, disrupting the signal and making it harder for the immune system to detect the infection.

The solution was to mislead the enzyme. “We make molecules that resemble the part of the signaling molecule that Mac1 normally binds to. But our molecules bind much more strongly. In this way, we keep the enzyme busy with decoy molecules, so it can no longer bind to the real signaling molecules.” This allows the immune system to respond more effectively to the virus.

From scattered puzzle pieces to one strong molecule

But how do you actually design a molecule that fits? In this case, Rijpkema could not rely on computer models. “We did try,” he says, “but so little was known about the enzyme at the time that the models did not give any clear direction.”

Instead, it came down to trial and error. For each part of the molecule, Rijpkema and his colleagues had to design a synthetic route: a series of chemical reactions starting from simple building blocks that together produce the desired molecule.

They then tested whether it worked, and adjusted it if it didn’t. “For the first two years, I basically only did things that didn’t work.” But this process helped him discover which changes improved the molecule. “In the end, we combined all the successful parts into one so-called ‘super molecule’ that binds very strongly to the enzyme.”

‘Another group just beat us to it’

Alongside challenging research, Rijpkema also faced tough competition. Just as he was ready to publish his first results, another study with similar findings appeared. “After two and a half years, we finally had something that worked, and then another group just beat us to it,” he says.

Rather than giving up, the team shifted their focus. “We emphasized not the biological data, but the way we had made our molecules,” he explains. “That was slightly different and more elegant than what the other group had done. It was a good learning moment. This is sometimes how science goes, but you have to be flexible and keep going.”

Decoy molecule as a stepping stone toward new medicine

The new decoy molecule is not a medicine in itself, but it is an important step forward. The molecules Rijpkema developed mainly help scientists better understand how the enzyme works.

That knowledge is crucial for pharmaceutical companies, which can use it to develop real treatments in the future. “We do fundamental research,” Rijpkema says. “But without that foundation, you cannot develop targeted medicines.”

Rijpkema will defend his Ph.D. thesis, “Synthesis of ADP-ribose Analogues,” on April 16. His supervisors are Dr. Dmitri Filippov and Professor Jeroen Codée.

Merck’s PD-1/VEGF data star in stacked lineup of AACR ‘26 data reveals

The Merck update, which will shed light on a $588 million bet to succeed Keytruda, is part of a roster of presentations that could shape the future of ADCs, protein degraders and KRAS-targeted therapies.

The collective attention of the oncology R&D community is turning toward San Diego. Beginning Friday, the city will host an American Association for Cancer Research annual meeting featuring updates from Merck, Amgen, Revolution Medicines and more with the potential to reshape cancer care and the priorities of R&D teams.

As was the case last year, one of AACR 2026’s most closely watched clinical data drops belongs to Merck. The drugmaker and its collaborators will share preliminary results from the first-in-human study of MK-2010, the PD-1/VEGF bispecific antibody that Merck licensed from Shanghai-based LaNova Medicines for $588 million upfront in 2024.

Merck struck the deal after another PD-1/VEGF bispecific, Akeso and Summit Therapeutics’ ivonescimab, beat its blockbuster checkpoint inhibitor Keytruda in a head-to-head clinical trial. The MK-2010 readout will provide an early look at Merck’s chances of competing with ivonescimab, as well as Pfizer and partners BioNTech and Bristol Myers Squibb, in the race to deliver the next backbone oncology treatment.

Jia Luo, a medical oncologist at Dana-Farber Cancer Institute, is closely watching the RAS space. In an email to BioSpace, she listed updates on the “promising” allele-specific RAS inhibitors zoldonrasib and elisrasib in KRAS-mutated non-small cell lung cancer (NSCLC) among the presentations she’s looking forward to at AACR 2026.

Revolution Medicines—which already made the biggest splash of this week with its pancreatic cancer data—is sharing preliminary safety and clinical activity data on another asset, zoldonrasib, which targets the G12D mutation in KRAS-dependent cancers. The biotech reported a 61% objective response rate in a Phase 1 NSCLC trial last year, though that figure included unconfirmed responses. Merck reportedly engaged in talks early this year to acquire Revolution for about $30 billion.

Meanwhile, elisrasib is a KRAS G12C inhibitor in development at D3 Bio. At AACR 2025, the Chinese biotech reported Phase 2 data showing a 30% response rate in 20 NSCLC patients who progressed on other KRAS G12C drugs such as Amgen’s Lumakras and Bristol Myers Squibb’s Krazati. This year’s update will include Phase 1/2 monotherapy data on elisrasib in advanced NSCLC patients previously treated with or without a KRAS G12C inhibitor.

Luo’s excitement for the zoldonrasib and elisrasib readouts reflects her expectation that “there will likely be multiple approved targeted treatment options for RAS [mutant] cancers in the coming years.”

Zoldonrasib and elisrasib also reflect the ongoing use of small molecules in oncology. Companies are still using the well-established modality in exciting new ways, Allan Jordan, vice president of oncology drug discovery at Sygnature Discovery, told BioSpace via email. Jordan named intrinsically disordered proteins, transcription factors and mRNA as challenging targets that researchers are tackling with small molecules.

Drugs targeting enzyme classes such as helicases “remain a vibrant field” for small molecules, Jordan said. Amgen and Eikon Therapeutics, which raised $381 million in an IPO this year, are among the companies sharing data on helicase inhibitors at AACR 2026.

ADCs enter new era

KRAS-targeted therapies made Lillian Siu's list of top areas to watch at AACR 2026, but in an email to BioSpace, the Princess Margaret Cancer Centre medical oncologist also named modalities such as antibody-drug conjugates (ADCs) among the updates she is most looking forward to. Timothy Yap, a medical oncologist at the University of Texas MD Anderson Cancer Center, also highlighted ADCs in his email to BioSpace.

The clinical trials plenary on April 19 will feature data on ADCs including CSPC Pharmaceutical’s EGFR candidate, Qilu Pharmaceutical’s claudin 6 prospect and a B7-H3-targeted asset that GSK licensed from Hansoh Pharma for $185 million upfront. Those candidates are traditional ADCs, with each featuring a targeting antibody conjugated to a cytotoxic topoisomerase I inhibitor.

AACR 2026 will also feature presentations on candidates that diverge from the standard ADC design. Astellas is presenting preclinical data on an ADC that carries two payloads to cells expressing TROP2, the receptor targeted by Gilead’s Trodelvy and AstraZeneca and Daiichi Sankyo’s Datroway. Duality Biologics, which has deals with BioNTech and GSK, is among the other companies sharing data on a dual-payload ADC.

The types of payloads are changing, too. Sygnature’s Jordan said it is interesting to finally see the ADC field move “away from traditional cytotoxic payloads toward more cancer-selective payloads which have the potential to improve tolerability and patient benefit.”

Elsewhere, multiple companies are sharing data on degrader-antibody conjugates (DACs), which deliver molecules that drive protein degradation rather than the cytotoxic payloads used in ADCs. Roche recently struck an up to $1 billion DAC deal with C4 Therapeutics, joining rivals including BMS in the emerging field.

At AACR 2026, Helioson Pharmaceutical is sharing late-breaking research on a DAC designed to degrade IKZF1/3, transcription factors needed for the growth and survival of multiple myeloma cells. CSPC, which like Helioson is based in China, is also presenting data on a DAC targeting IKZF1/3. Orum Therapeutics, a partner of BMS and Vertex, is among the other companies showcasing DAC data at the event.

The various projects illustrate the potential benefits of DACs, with CSPC using a CD38 antibody to trigger a dual mechanism of action and Orum leveraging the modality to achieve a wider therapeutic window than is possible with standard GSPT1 degraders. However, traditional degraders remain viable and the field continues to evolve, with Jordan naming the rise of glues targeting E3 ligases other than cereblon as a trend to watch.

ADCs, DACs and protein degraders are just some of the modalities that will be showcased at the event. Multispecific antibodies and novel induced proximity strategies are among the other areas Yap will be keeping an eye out for, while Siu will be watching for data on CAR T cell therapy and immunotherapy in precursor malignancies.

AACR runs from April 17 to April 22. Stay tuned to BioSpace for coverage of the most notable presentations.

Trump nominates new CDC director, appoints various leaders to embattled agency

Erica Schwartz, a former deputy surgeon general and member of the U.S. military, will take over after what has amounted to a rapid succession of CDC leaders over the past year.

Jay Bhattacharya has one less hat to wear after President Donald Trump announced he is nominating Erica Schwartz as the next director of the CDC.

The president nominated Schwartz, a former deputy surgeon general in Trump’s first administration, to the Senate-confirmable position in a post on his social media network, Truth Social, on Thursday afternoon.

“She is a STAR,” Trump said after listing Schwartz’s professional and educational accolades.

Schwartz—who served 24 years in the U.S. Public Health Service Commissioned Corps and as a rear admiral in the Coast Guard, and also holds a medical degree from Brown University and a law degree from the University of Maryland—becomes the fourth person to lead the CDC in the past year.

Susan Monarez, the first CDC director to be confirmed by the Senate under a 2023 law, was fired last year less than a month after clashing with Health Secretary Robert F. Kennedy Jr. In an opinion piece published in The Wall Street Journal in September, Monarez said she had been asked to "preapprove" recommendations from Kennedy's newly installed vaccine advisors regarding COVID-19 immunization guidelines—which she refused to do.

The top CDC role was then filled in an acting capacity by Health and Human Services Deputy Secretary Jim O’Neill until February, when National Institutes of Health head Bhattacharya took over as acting director.

In the same social post, Trump appointed a cadre of nominees to the CDC leadership team. FDA Principal Deputy Commissioner Sara Brenner will be a public health adviser to Kennedy, while Jennifer Shuford, current commissioner of the Texas health department, will serve as Schwartz’s deputy director and the agency’s chief medical officer.

Shuford will step into the CMO role once held by Debra Houry, who resigned after Monarez was forced out. Both Monarez and Houry testified in front of the Senate Committee on Health, Education, Labor and Pensions in September 2025, during which Houry said that “Trust and transparency have been broken. The problem is not too much science but too little.”

Sean Slovenski, a former executive at Walmart and Humana, will become a deputy director and chief operating officer at the CDC. Slovenski holds a BA in Health and Physical Education, according to his LinkedIn profile.

“These Highly Respected Doctors of Medicine have the knowledge, experience, and TOP degrees to restore the GOLD STANDARD OF SCIENCE at the CDC, which was an absolute disaster focused on ‘mandates’ under Sleepy Joe,” Trump said in his post, referring to former President Joe Biden.

Meanwhile, Kennedy also alluded to the new team during a congressional budget hearing on Thursday.

“We’re bringing in an extraordinary team. … The team has been leaked, and it’s gotten applause from both Republicans and Democrats,” the secretary said while testifying in front of the House Appropriations subcommittee on health, CNN reported. “I think this new team is really going to be able to revolutionize CDC and get it back on track and get it doing the job that it does better than any other health agency in the world.”

Air pollution associated with increased migraine activity

Air pollution is associated with increased migraine activity, according to a study published in Neurology. Both short-term and cumulative exposure to air pollution as well as climate factors such as heat and humidity were associated with increased migraine activity.

The study does not prove that air pollution causes migraine attacks; it only shows an association.

“These results help us to better understand how and when migraine attacks occur,” said study author Ido Peles, MD, of Ben-Gurion University of the Negev in Be’er Sheva, Israel.

“They suggest that for people who have a susceptibility to migraine to begin with, environmental factors may play two roles: intermediate-term factors such as heat and humidity may modify the risk for attacks, while short-term factors such as spikes in pollution levels may trigger attacks.”

How the long-term study was done

The study involved 7,032 people with migraine who lived in Be’er Sheva in the Negev desert and were followed for an average of 10 years.

Researchers looked at daily exposure to air pollution from traffic, industry and dust storms, as well as weather conditions. Then they looked at how often and when people had to visit the hospital or a primary care office with an acute migraine and compared that to the pollution and weather conditions that day and up to seven days earlier, since pollution effects may take a few days to affect the body.

They also looked at cumulative exposure to air pollution and migraines. As another measure of migraine activity, researchers checked pharmacy records to see how many doses of the migraine medications, called triptans, participants needed.

What the migraine data revealed

During the study, 2,215 people, or 32%, had at least one visit to the hospital or clinic for acute migraine. A total of 47% of the people had purchased triptan medications during the study, with average use at two tablets per month and 2.3% of people using 10 or more tablets per month.

The researchers found an association between air pollution and visits to the hospital or clinic for migraine.

On the day with the highest number of visits to the hospital or clinic, air pollution levels were elevated compared to the average amount over the study period. On that day, the level of particulate matter 10, or PM10, which includes dust, was 119.9 micrograms per cubic meter (µg/m³), compared to an average of 57.9 during the study.

For PM2.5, which includes particles from motor vehicle exhaust and the burning of fuels from power plants and other industries, the level on that day was 27.3 µg/m³, compared to an average of 22.3 during the study. For nitrogen dioxide, or NO2, a gas mostly from traffic emissions, the level on that day was 11.2 parts per billion, compared to an average of 8.7.

The day with the fewest visits to the hospital or clinic also had lower than average pollution levels.

Pollution types and relative risks

After adjusting for other factors that could affect the risk of migraine attacks, such as sex and socioeconomic status, researchers found that people with short-term exposure to high levels of NO2 were 41% more likely to go to the hospital or clinic for migraine than people not exposed to high levels.

People exposed to high levels of solar radiation, or ultraviolet (UV) rays from the sun, were 23% more likely to seek help for migraine than those not exposed to high levels.

People with cumulative exposure to high levels of NO2 were 10% more likely to have high use of migraine drugs than people without cumulative exposure to high levels. People with cumulative exposure to high levels of PM2.5 were 9% more likely to have high use of the drugs.
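The "% more likely" figures above are how hazard ratios are usually reported in plain language. As a quick illustration (the mapping of ratios to exposures here is inferred from the percentages, not quoted from the paper), the conversion is simple arithmetic:

```python
def percent_increase(hazard_ratio):
    """Convert a hazard ratio to a '% more likely' figure."""
    return round((hazard_ratio - 1) * 100)

# Hazard ratios implied by the reported increases (illustrative):
print(percent_increase(1.41))  # → 41 (short-term NO2 exposure)
print(percent_increase(1.23))  # → 23 (high solar radiation)
print(percent_increase(1.10))  # → 10 (cumulative NO2 exposure)
```

So a hazard ratio of 1.41 for short-term NO2 exposure corresponds to the 41% higher likelihood of a hospital or clinic visit described above.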

Researchers found that climate conditions played a role in the effects of pollution. High temperatures and low humidity amplified the effect of NO2, while cold and humid conditions intensified the effect of PM2.5.

Implications for care and prevention

“These findings highlight opportunities for anticipating what care will be needed,” Peles said. “As climate change intensifies the frequency of heat waves, dust storms and pollution episodes, we will need to integrate these environmental risk factors into our guidance for people with migraine.

“When high-risk exposure periods are in the forecast, doctors can advise people to limit their outdoor activity and use air filters, take short-term preventative medications and start using their migraine drugs at the first sign of a problem to ward off attacks.”

Study limitations and who it reflects

A limitation of the study is that exposure to air pollution was measured by monitoring stations and did not take into account individual behaviors such as amount of time spent indoors, use of air conditioning or air filters, type of job and daily activities.

In addition, since the information on migraine activity was gathered through hospital and clinic visits and pharmacy data, the findings mainly reflect people with severe migraine and may not be applicable to people with milder episodes of migraine or those who manage their attacks on their own.

Genetic atlas reveals how human liver cells divide their labor

If scientists could shrink themselves to microscopic size and take a journey through the human body—like the submarine crew in the 1966 science fiction classic “Fantastic Voyage”—one of their first stops would no doubt be the liver. The unique structure of our largest internal organ comprises small, hexagonal functional units called lobules, each carrying out more than 500 functions simultaneously. Studies from the 1970s and 1980s revealed that liver cells divide these many tasks among themselves according to their location within each subunit; however, the technology available at the time provided only a blurred picture of this division of labor.

A new high‑resolution liver atlas

In a new study published in Nature, scientists from the Weizmann Institute of Science, together with colleagues at Sheba Medical Center and the Mayo Clinic, present the first genetic atlas of a healthy human liver at a resolution of 2 microns. The findings show that the division of labor in the human liver differs from that of other mammals and is more extensive than previously recognized, helping explain why certain regions of the liver are particularly vulnerable to fatty liver disease.

In recent years, technological advances have made it possible to identify which genes are active in each individual cell while also mapping the cells’ precise spatial positions within the tissue. Still, a comprehensive map of functional division in the human liver remained elusive, largely due to the difficulty of obtaining tissue samples from healthy donors.

Leveraging living liver donations

Researchers in Prof. Shalev Itzkovitz’s group at the Weizmann Institute realized that the solution could come from altruistic living liver donation. Because the liver has a remarkable capacity for regeneration, healthy individuals can donate a substantial portion of their livers to patients in need.

With the help of Prof. Ido Nachmany and Prof. Niv Pencovich from the Department of General Surgery and Transplantation at Sheba Medical Center, and Dr. Timucin Taner from the Mayo Clinic in Minnesota, the researchers obtained eight liver samples from healthy donors and constructed a gene expression atlas of the human liver.

“Thousands of genes were found to be active at different levels in liver cells across various locations, pointing to a far more precise and complex internal organization than previously thought,” says Itzkovitz. “Instead of the coarse division into three functional zones that has been accepted for decades, the atlas reveals eight regions with distinct roles. This precise mapping now enables any laboratory worldwide to dive deep into the liver and investigate why different regions are susceptible to different diseases.

“Metabolic diseases, for example, tend to originate in the center of the lobule, whereas viral and autoimmune inflammations primarily appear at its periphery. Likewise, liver cancer and metastases from other cancers have their preferred locations. The key to understanding these patterns lies in the detailed genetic data we have collected.”

How human livers differ from animals

To enable comparison with other species, Itzkovitz’s laboratory also mapped healthy livers in mice, as well as in larger mammals—pigs and cows—whose metabolic rates and lobule sizes are similar to those of humans. In all mammals, blood flows through the lobule from the periphery to the center, supplying oxygen and nutrients to cells along the way. As a result, the periphery is characterized by an abundance of resources, while the center experiences relative scarcity.

In all the mammals studied except humans, these depleted conditions at the center of the lobule resulted in relatively lower cellular activity. In humans, however, the core of the lobule was found to carry out numerous functions, including synthesizing fats from excess energy, producing glucose from non-carbohydrate sources during fasting, filtering toxins and secreting bile to aid digestion.

Another striking difference between the human liver and those of other mammals concerns glucose storage. The liver functions as the body’s “fuel tank,” efficiently absorbing sugars during meals and releasing them in a controlled manner between meals. The study found that in humans, glucose uptake occurs mainly in the centers of the lobules, rather than at their periphery, unlike in mice.

The double-edged nature of efficiency

“This division of labor is both a blessing and a curse,” Itzkovitz explains. “It allows the liver to store carbohydrates efficiently: Cells at the center of the lobule absorb and store glucose directly from the blood, while cells at the periphery convert lactate into glucose, further contributing to the energy reserves used during fasting. However, this efficient system was not designed for a modern diet rich in fats and carbohydrates, which may help explain why we tend to accumulate excess fat in the liver and develop liver fibrosis.”

To cope with cellular wear and tear and prevent disease, a unique turnover mechanism appears to have evolved in the center of the human liver lobule. “We found that in humans, unlike in other mammals, a particular type of immune cell prefers to reside in the core of the lobule rather than guarding its periphery—the entry point of blood into the tissue,” says Dr. Oran Yakubovsky of Itzkovitz’s lab, who led the study and is also a surgical resident at Sheba Medical Center.

“Kupffer cells are specialized scavenger cells that can offer protection against infections but also engulf, break down and recycle the remains of worn-out cells. We hypothesize that in humans they ‘relocated’ to the center to cope with the increased cellular attrition occurring there.”

Linking the atlas to liver disease

In the final part of the study, the scientists demonstrated how their new atlas can be used to trace disease development. They focused on fatty liver disease associated with metabolic dysfunction—a common condition, linked to obesity and diabetes, in which fat accumulates in the liver and may lead to inflammation and fibrosis.

Comparing healthy liver cells with those that had begun to accumulate fat revealed a protective response: Cells that started to “gain weight” switched off genes involved in fat production and uptake while activating genes associated with fat breakdown. However, the human liver has a limitation that reduces the efficiency of this process: Fat accumulation also leads to decreased production of certain components of the mitochondria, the organelles responsible for breaking down fats.

“Based on the precise mapping of the liver, it may become possible to develop treatments that will target the genes responsible for making specific regions particularly vulnerable to certain diseases,” says Itzkovitz. “Moreover, the approach of constructing a single-cell–resolution genetic atlas from healthy donor samples can be applied to other organs that have not yet been accurately mapped in humans. It could fundamentally change how we understand the structure and function of the human body.”

CRISPR variant selectively targets tumor DNA

Cancer cells excel at evading detection, but subtle chemical differences set them apart from healthy cells. Now, a team of scientists from Wageningen University & Research and Van Andel Institute has identified a way to exploit this distinction. Using a variant of CRISPR, a modern tool for editing DNA, they distinguished tumor DNA from healthy DNA and selectively cut only the former.

The study, published in Nature, is an early but promising step toward a cancer therapy that targets and destroys tumor cells with high precision.

The new method relies on methyl groups, small chemical tags attached to DNA that regulate whether genes are on or off. This process, called DNA methylation, is altered in cancer cells and can act as a molecular “fingerprint” that differentiates malignant cells from healthy ones.

Precision gene editing with ThermoCas9

The team conducted the study using ThermoCas9, a CRISPR variant discovered in bacteria several years ago by Wageningen’s John van der Oost, Ph.D. Like other CRISPR systems, ThermoCas9 can be programmed to locate and cut specific sections of DNA within a cell.

VAI’s Hong Li, Ph.D., and her lab analyzed ThermoCas9’s structure and found that it can distinguish between unmethylated and methylated genes.

The team then introduced ThermoCas9 into human cells grown in culture dishes: healthy cells in one set and tumor cells in another. The approach worked: ThermoCas9 cut DNA in the tumor cells while leaving healthy DNA intact. The system thus proved capable of detecting the subtle chemical differences between healthy and tumor cells and acting on them.

“ThermoCas9 is the first CRISPR-associated enzyme to respond to differences in the most abundant type of DNA methylation in human and other eukaryotic cells,” Van der Oost said. “This means we now have a system that we can target specifically toward tumor cells.”

The study represents the first time a CRISPR-based method has relied on methylation to target human cancer cells.

“ThermoCas9 uses methylation like an address to precisely target cancer cells while leaving healthy cells untouched,” Li said. “The findings could be a game changer.”

A precise molecular fit

The explanation for ThermoCas9’s selective behavior lies in the way it binds to DNA. Before a CRISPR system cuts DNA, it must first attach to a short recognition sequence next to its target, known as the PAM (Protospacer Adjacent Motif). ThermoCas9 is unique in that its PAM sequence includes a human methylation site, meaning it can contain a methyl group.

“The CRISPR system binds very precisely to this recognition code,” Van der Oost explained.

Compare it to a screwdriver that fits perfectly into a matching screw head. If there is a protrusion inside the groove, the screwdriver no longer fits and cannot do its job. In the same way, a methyl group disrupts the fit between ThermoCas9 and the DNA, preventing binding and leaving the DNA sequence untouched.
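The screwdriver analogy can be captured in a minimal sketch. Everything here is illustrative: the placeholder recognition sequence "TTTA", the function name and the reduction of binding to two boolean checks are assumptions made for demonstration, not ThermoCas9’s actual PAM or biochemistry.

```python
# Toy model of methylation-sensitive cleavage: a site is cut only if its
# recognition sequence matches AND the site is unmethylated. "TTTA" is a
# placeholder, not ThermoCas9's real PAM.

def would_cleave(site_pam: str, pam_methylated: bool,
                 required_pam: str = "TTTA") -> bool:
    """Return True if the toy enzyme would bind and cut at this site."""
    if site_pam != required_pam:
        return False  # wrong recognition sequence: no binding at all
    if pam_methylated:
        return False  # methyl "protrusion" disrupts the fit, blocking binding
    return True

# A site whose methylation is absent (as assumed here for the tumor DNA)
# is cut; the same sequence with a methyl group is spared.
print(would_cleave("TTTA", pam_methylated=False))  # True  (cut)
print(would_cleave("TTTA", pam_methylated=True))   # False (spared)
```

In the cell-culture experiment described above, this corresponds to tumor DNA being cleaved while identically sequenced but differently methylated healthy DNA is left intact.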

“ThermoCas9 is a perfect example of the value of fundamental research; you have to know how these individual pieces work together,” Li said. “We used biochemistry and structural biology to discover a mechanism that we one day hope will lead to more precise, effective cancer treatment.”

Steps toward clinical research

There is still a long way to go before the technology can be translated into a potential cancer treatment. The new study demonstrates selective DNA cleavage but does not yet show that this effect can kill tumor cells. The next step focuses on damaging tumor DNA sufficiently to trigger cell death.

Aberrant methylation patterns also play a role in many other diseases, including childhood cancers such as neuroblastoma and autoimmune disorders. In the future, ThermoCas9 or a similar CRISPR tool may evolve into a versatile molecular strategy that recognizes diseased cells by their chemical “signature” and selectively disables them.

New technique maps cancer drug uptake inside living cells

A new analytical method could improve how cancer treatments are designed—by allowing scientists to track, for the first time, exactly where inside a living cell a drug accumulates. Researchers from the University of Surrey and King’s College London developed the method, which detects trace amounts of metal inside individual living cells and their internal compartments without the need to kill the cells first.

Published in Spectrochimica Acta Part B: Atomic Spectroscopy, the study looked at a class of cancer therapy called targeted radionuclide therapy. This works by attaching a radioactive particle to a molecule that seeks out tumor cells, delivering radiation directly to the cancer. Where inside the cell the drug ends up is critical: a drug that reaches the nucleus damages the cancer by targeting its DNA. Until now, there was no reliable way to measure this in living cells.

Dr. Monica Felipe-Sotelo, senior lecturer in radiochemistry and analytical chemistry, co-author of the study from the University of Surrey, said, “We developed this method using two specialist facilities—the SEISMIC facility at King’s College London and the University of Surrey’s ICP-MS facility. Together, they allowed us to combine the cell-sampling and metal-detection steps in a single workflow for the first time. That combination is what makes it possible to ask not just whether a drug gets into a cell, but precisely where it goes once it’s there.”

The team used tiny glass capillary tips—10 micrometers wide for whole cells, 3 micrometers for subcellular structures—to extract individual living pancreatic cancer cells and material from within them, including mitochondria, under a microscope.

The SEISMIC facility at King’s, a specialist system for extracting material from single living cells, provided the sampling capability. Surrey’s laser ablation inductively coupled plasma mass spectrometry (LA-ICP-MS) facility then detected and measured the thallium present: a laser vaporizes minute quantities of material before a mass spectrometer identifies and quantifies the metals within. The combination of capillary sampling at the subcellular level and LA-ICP-MS had not been performed before.

The researchers used thallium chloride as a chemically stable stand-in for thallium-201, a radioactive isotope under investigation as a cancer treatment candidate. Thallium was successfully detected in individual cancer cells and, for the first time, inside mitochondria-enriched material extracted from those cells, at extremely low amounts.

Dr. Claire Davison from King’s College London said, “Thallium-201 is exciting as a potential cancer therapy precisely because its radiation acts over such a short distance—which means it could destroy tumor cells while sparing the healthy tissue around them. But that precision cuts both ways: the drug has to end up in the right part of the cell to do its job. This method gives us, for the first time, a way to find that out in living cells, and that is a significant step towards making this type of therapy work in practice.”

Dr. Dany Beste, senior lecturer in microbial metabolism from the University of Surrey, said, “The potential here goes well beyond cancer. Metals play important roles in a wide range of diseases—from infectious disease to diabetes and liver conditions—and we have few tools for studying exactly where they are accumulating within cells. This methodology gives us a way to do that with a level of precision and in conditions that are much closer to biological reality. That opens up a lot of questions we could not previously ask.”

Professor Melanie Bailey from King’s College London said, “We are continuing to develop this methodology at the SEISMIC facility and working with various different users to determine precisely where other drugs go when they enter cells, and what they do when they get there.”

The technique could be extended beyond cancer research to study how any metal-based drug or toxic substance distributes inside living cells. The team identified the extraction of additional cellular compartments—including the nucleus, where radiation damage to DNA occurs—as a key next step. They also flagged improved methods for verifying the purity of the extracted subcellular material as a priority for future development.
