The controversial diet truth backed by 155 dietary surveys across 90 years that food scientists don’t want you to know.
Dan Buettner exposes why meta-analyses prove most nutritional debates wrong and reveals what centenarians actually ate as children to live past 100.
The peasant food formula that’s cheaper than a hamburger, 50 times more nutrient dense, and leaves you completely satisfied.
Plus why the 15 countries with the highest life expectancy all eat white rice daily.
Dan Buettner is a New York Times bestselling author, National Geographic Fellow, and co-producer of the Emmy Award-winning Netflix series Live to 100: Secrets of the Blue Zones.
Scientists may have new answers to why pop-ups or notifications grab our attention. Turns out our attention is on a cycle, shifting seven to 10 times per second. This rhythmic occurrence may be crucial for survival, as it prevents us from becoming overly focused on one thing in our environment. It could help us to see a car backing up in a parking lot while we search for where we parked, or to duck to avoid a low-hanging tree branch on a walk while watching a kid ride a bike.
However, these windows that shift our attention could also make us more susceptible to distractions, especially in modern times. As we live in a world surrounded by screens, digital alerts, and other visual stimuli, these frequent and innate windows for shifting attention may make it easier to be pulled away from a task.
“For our ancestors who had to continue to monitor the environment for predators while foraging for food, this was a beneficial trait,” said Ian Fiebelkorn, Ph.D., assistant professor of Neuroscience at the Del Monte Institute for Neuroscience at the University of Rochester and senior author of a study published in the journal PLOS Biology. “But in our modern environment, with laptops open in front of us and a smartphone nearby, rhythmically occurring windows for beneficial attentional shifts might also work against us. That is, rhythmically occurring windows for attentional shifts are also associated with increased susceptibility to distracting information.”
The world has far more bees than anyone realized. Scientists have, for the first time, estimated just how many species of bees are out there on a global scale, offering a clearer look at how these vital pollinators are distributed around the planet. The landmark study, led by University of Wollongong (UOW) evolutionary biologist Dr. James Dorey, provides the most comprehensive count to date—broken down by continent and country—calculating that there are, at a minimum, between 3,700 and 5,200 more bee species buzzing around the world than currently recognized.
The research, outlined in a new paper published Tuesday, February 24, in Nature Communications, lifts global estimates to between 24,705 and 26,164 bee species and reveals a richer and more complex picture of the world’s bees than ever before. The findings highlight how many bee species remain unclassified or overlooked, showing that even our much-loved pollinators are not fully understood, and that closing these knowledge gaps is crucial for conservation and food security.
“Knowing how many species exist in a place, or within a group like bees, really matters. It shapes how we approach conservation, land management, and even big-picture science questions about evolution and ecosystems,” Dr. Dorey said. “Bees are a perfect example. They’re keystone species; their diversity underpins healthy environments and resilient agriculture. If we don’t understand how many bee species there are, we’re missing a key part of the puzzle for protecting both nature and farming.”
To understand the complex interactions involved in an immune response during scarcity, the team put mice on a 50% restricted-calorie diet and then exposed the animals to bacteria that infect the gut. The mice that were fed a standard diet experienced a metabolic crash: their blood glucose levels and body weight plummeted.
The researchers had expected this would happen to all the animals because mounting an immune response can consume up to 30% of the entire body’s fuel reserves. But in the calorie-restricted mice, the immune system appeared to be functioning perfectly well without using much glucose.
To unravel this enigma, the researchers inventoried the immune cells of the infected animals and discovered that T cells, which normally target invading microbes, were depleted in the calorie-restricted mice. Instead, short-lived neutrophils, which serve as the body’s first responders to infection, were ramped up to twice the normal amount and had measurably enhanced pathogen-killing abilities. The cells seemed to be operating in energy-saving mode, consuming much less glucose than neutrophils from well-fed animals.
The researchers are breaking new ground by outlining how a sudden fall in food intake triggers glucocorticoid levels to rise, resulting in two major shifts. First, the body repositions certain immune cells—especially naïve T cells—into the bone marrow, which becomes a kind of “safe house” for when the cells are needed. Second, during an infection, glucocorticoids tilt the immune response away from energy-intensive T cells toward neutrophils, abundant cells that act as immediate, short-lived defenders.
Beyond clearing a current infection, glucocorticoids prepare the immune system for repeat encounters with infectious agents. While the hormones direct killer T cells to stand down and neutrophils to step up, they also ensure memory T cells are preserved for future confrontations.
When food is scarce, stress hormones direct the immune system to operate in “low power” mode to preserve immune function while conserving energy, according to researchers. This reconfiguration is crucial to combating infections amid food insecurity.
The molecular factors responsible for the onset of Barrett’s esophagus remain poorly understood.
The findings, published in Nature Communications, combined family studies, laboratory experiments and genetically engineered mouse models to identify and understand how genetic defects contribute to disease development.
The team sequenced and analyzed genetic material of 684 people from 302 families where multiple members developed Barrett’s esophagus or esophageal cancer. They discovered that a subset of affected family members carry inherited mutations in a gene called VSIG10L.
“We found that this gene acts like a quality control system for the esophageal lining,” said the lead researcher. “When it’s defective, the cells do not mature properly and the protective barrier in the esophageal lining becomes weak, allowing stomach bile acid to cause tissue changes that increase the risk of developing Barrett’s esophagus.”
When researchers genetically engineered mice with human-equivalent VSIG10L mutations, they found that the esophageal lining became structurally and molecularly disrupted. When these mice were then exposed to bile acid, they developed Barrett’s-like disease over time, effectively replicating the disease’s progression in humans.
These genetically engineered mice also represent the first animal model for Barrett’s esophagus based directly on human genetic predisposition to the disease, the author said.
With VSIG10L shown to be a key gene in maintaining esophageal health, family members can now be screened for genetic variants to identify those at high risk of developing Barrett’s esophagus or esophageal cancer.
Plant owners with a so-called green thumb often seem to have a more finely tuned sense of what their plants need than the rest of us. A new “smart lighting” system for indoor vertical farms grants this ability on a facility-wide scale, responsively meeting plants’ needs while reducing energy inefficiencies, clearing a path for indoor farms as an energy-efficient food security strategy.
The system was designed and tested in a study led by Professor of Plant Biology Tracy Lawson, who conducted the research at the University of Essex and is now a member of the Carl R. Woese Institute for Genomic Biology at the University of Illinois Urbana-Champaign. The work, published in Smart Agricultural Technology, emerged from her goal to help establish the viability of vertical farming for large-scale food production.
“One of the key aspects of [vertical farming], of course, is the energy cost associated with using LED lighting,” Lawson said. “So that’s where it all started, trying to save energy.”
The findings add to a growing body of work suggesting that ape minds can imagine scenarios beyond the “here-and-now,” a skill once thought to be unique to humans. Human children begin playing pretend as early as 12 months old and master the ability to build imaginary worlds by age 3. Many high-level thinking tasks are possible only because we can imagine things that aren’t really there.
The study centered on Kanzi, a remarkable bonobo who could communicate using word-linked symbols called lexigrams. Amalia Bastos, a comparative psychologist at the University of St Andrews in Scotland, first met him in 2023. “We were starstruck by Kanzi,” she says.
During their first meeting, the bonobo used his lexigram-studded board to ask Bastos and a colleague to chase each other. Bastos noticed that even though they only pretended to play, Kanzi still enjoyed watching them. This kick-started a series of make-believe tests that Bastos and Christopher Krupenye, a psychologist at Johns Hopkins University, designed for Kanzi.
In the first of these tests, Kanzi sat at a table with two glasses. An experimenter pretended to pour a glass of “juice” — Kanzi’s tipple of choice — into both cups from a see-through empty jug. The experimenter then poured the nonexistent contents of one cup back into the jug, before asking Kanzi which cup still held the “juice.” Kanzi guessed correctly 68 percent of the time, significantly above chance, the researchers report.
The guesses, Bastos says, may not have been definitive evidence of Kanzi’s internal imagination. “Kanzi is an old bonobo. Maybe his vision isn’t very good. Maybe he thinks that there’s real juice in these things,” she says.
The researchers retested Kanzi to see if he could distinguish real juice from fake. They presented him with two cups: one containing orange juice and an empty one that they filled with pretend juice. When asked which cup he wanted, Kanzi picked the real juice nearly 80 percent of the time, suggesting he had little trouble identifying his reward. A third test that mimicked the first, but with pretend grapes rather than juice, again suggested Kanzi understood where the pretend food was located.
More than a century ago, Pavlov trained his dog to associate the sound of a bell with food. Ever since, scientists have assumed the dog learned this through repetition. The more times the dog heard the bell and then got fed, the better it learned that the sound meant food would soon follow.
Now, scientists at UC San Francisco are upending this 100-year-old assumption about associative learning. Their new theory asserts that learning depends less on how many times something happens and more on how much time passes between rewards.
“It turns out that the time between these cue-reward pairings helps the brain determine how much to learn from that experience,” said Vijay Mohan K. Namboodiri, Ph.D., an associate professor of Neurology and senior author of the study, published in Nature Neuroscience.