
The simulated Milky Way: 100 billion stars using 7 million CPU cores

Researchers have successfully performed the world’s first Milky Way simulation that accurately represents more than 100 billion individual stars over the course of 10,000 years. This feat was accomplished by combining artificial intelligence (AI) with numerical simulations. Not only does the simulation represent 100 times more individual stars than previous state-of-the-art models, but it was produced more than 100 times faster.
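How AI and numerical simulation can be combined is easiest to see in schematic form. The sketch below shows the general pattern such hybrid methods follow: a fast, pre-trained surrogate model stands in for an expensive fine-scale sub-step inside an otherwise conventional time-stepping loop. All function names and the placeholder physics here are illustrative assumptions, not the study's actual code.

```python
import numpy as np

def expensive_subgrid_step(state):
    """Placeholder for a costly fine-resolution calculation; in galaxy
    simulations, small-scale physics like this limits how many stars
    can be followed."""
    return state + 0.01 * np.sin(state)

def surrogate_model(state):
    """Placeholder for a fast, pre-trained AI approximation of the
    expensive step above. Hypothetical; not the paper's model."""
    return state + 0.01 * state

def simulate(state, n_steps, use_surrogate=True):
    for _ in range(n_steps):
        state = state * 0.999  # cheap large-scale dynamics, handled numerically
        # Delegate the fine-scale physics to the surrogate when enabled.
        step = surrogate_model if use_surrogate else expensive_subgrid_step
        state = step(state)
    return state

print(simulate(np.ones(4), n_steps=100))
```

The speedup in schemes of this shape comes from the surrogate amortizing, at inference time, work the numerical solver would otherwise redo at every step.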

Published in Proceedings of the International Conference for High Performance Computing, Networking, Storage and Analysis, the study represents a breakthrough at the intersection of astrophysics, high-performance computing, and AI. Beyond astrophysics, the new methodology can be used to model other complex, multi-scale phenomena.

Species in crisis: Critically endangered penguins are directly competing with fishing boats

A new study led by the University of St Andrews has found that critically endangered African penguins (Spheniscus demersus) are significantly more likely to forage in the same areas as commercial fishing vessels during years of low fish abundance, increasing competition for food and adding pressure to a species already in crisis.

Published in the Journal of Applied Ecology, the research introduces a novel metric called “overlap intensity,” which for the first time measures not just the extent of shared space between penguins and fishing vessels, but how many penguins are actually affected by this overlap.
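The paper's exact formulation is not reproduced here, but a minimal sketch of one plausible way to compute such a metric, assuming gridded penguin counts and vessel presence, shows how intensity differs from plain spatial extent:

```python
import numpy as np

# Illustrative sketch only: a synthetic 10x10 foraging grid, not real data.
rng = np.random.default_rng(0)
penguin_counts = rng.poisson(2.0, size=(10, 10))  # foraging penguins per cell
vessel_present = rng.random((10, 10)) < 0.2       # fishing vessel in cell?

shared = vessel_present & (penguin_counts > 0)    # cells used by both

# Classic overlap: what fraction of the penguin foraging area is shared?
extent = shared.sum() / (penguin_counts > 0).sum()

# "Overlap intensity" as sketched here: what fraction of the birds
# themselves forage in the shared cells?
intensity = penguin_counts[shared].sum() / penguin_counts.sum()

print(f"spatial extent of overlap:  {extent:.2f}")
print(f"fraction of birds affected: {intensity:.2f}")
```

The distinction matters because a small shared area can still contain most of the birds in a poor-fish year, which is exactly the case an area-only measure understates.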

The African penguin population has plummeted by nearly 80% in the past three decades, in part due to competition with the local fishery targeting sardines and anchovies, a key prey for the penguins.

MIT Neuroscientist Proposes Brain Waves Are the Hidden Engine Behind Thought and Consciousness

When it comes to understanding the mystery of human consciousness, scientists have long sought the hidden mechanism that transforms mere neural firing into the rich experience of thought.

Now, a leading MIT neuroscientist believes he’s found a clue that suggests the brain’s electrical waves don’t just reflect our thoughts, but actually create them.

At the Society for Neuroscience’s annual meeting on November 15, Dr. Earl K. Miller, a professor at MIT’s Picower Institute for Learning and Memory, will unveil a provocative proposal: that cognition and consciousness emerge from the fast, flexible organization of the brain’s cortex—powered by analog computations performed by traveling brain waves.

In other words, the rhythm of the brain may be more than background noise—it may be the very pulse of thought itself.

“The brain uses these oscillatory waves to organize itself,” Dr. Miller said in a press statement. “Cognition is large-scale neural self-organization. The brain has got to organize itself to perform complex behaviors. Brain waves are the patterns of excitation and inhibition that organize the brain, and this leads to consciousness because consciousness is this organized knitting together of the cortex.”

Dr. Miller’s theory revives the concept of analog computation. Unlike digital computers, which rely on discrete binary bits, analog systems process continuous information—waves interacting to produce a vast range of possible values.

Dr. Miller argues that the brain’s natural oscillations—electrical waves generated by millions of neurons—function as analog computers, sculpting information in a fast, flexible, and energy-efficient way.
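As a toy illustration of the analog point (and nothing more; this is not a model of cortical dynamics): when two waves superpose, the resulting amplitude varies continuously with their relative phase, so a single interaction can encode a whole range of values rather than a discrete 0 or 1.

```python
import numpy as np

# Two waves combined at different relative phases: the peak amplitude
# of the superposition sweeps continuously from 2 (in phase) down to 0
# (fully out of phase), a continuum no single binary bit can carry.
x = np.linspace(0, 2 * np.pi, 1000)
for phase in (0.0, np.pi / 4, np.pi / 2, 3 * np.pi / 4, np.pi):
    combined = np.sin(x) + np.sin(x + phase)
    print(f"relative phase {phase:.2f} rad -> peak amplitude {combined.max():.3f}")
```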

MAKER: Large Language Models (LLMs) have achieved remarkable breakthroughs in reasoning, insight generation, and tool use

They can plan multi-step actions, generate creative solutions, and assist in complex decision-making. Yet these strengths fade when tasks stretch over long, dependent sequences. Even small per-step error rates compound quickly, turning impressive short-term performance into complete long-term failure.
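The arithmetic behind that compounding is stark: if each step succeeds independently with probability p, the chance of n dependent steps all succeeding is p^n. A few illustrative values (the rates are hypothetical, chosen only to show the scaling):

```python
# Per-step success rate p compounds over n dependent steps as p**n.
for p in (0.999, 0.9999, 0.99999):
    for n in (1_000, 100_000, 1_000_000):
        print(f"p={p}, n={n:>9,}: overall success = {p**n:.3e}")
```

Even a 99.999% per-step success rate leaves under a 0.005% chance of finishing a million steps cleanly.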

That fragility poses a fundamental obstacle for real-world systems. Most large-scale human and organizational processes – from manufacturing and logistics to finance, healthcare, and governance – depend on millions of actions executed precisely and in order. A single mistake can cascade through an entire pipeline. For AI to become a reliable participant in such processes, it must do more than reason well. It must maintain flawless execution over time, sustaining accuracy across millions of interdependent steps.

Apple’s recent study, The Illusion of Thinking, captured this challenge vividly. Researchers tested advanced reasoning models such as Claude 3.7 Sonnet Thinking and DeepSeek-R1 on structured puzzles like Towers of Hanoi, where each additional disk doubles the number of required moves. The results revealed a sharp reliability cliff: models performed perfectly on simple problems but failed completely once the task crossed about eight disks, even when token budgets were sufficient. In short, more “thinking” led to less consistent reasoning.
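For context on why the puzzle scales so sharply: the classic Towers of Hanoi recurrence solves the (n-1)-disk problem twice plus one move of the largest disk, giving a minimum of 2^n - 1 moves.

```python
def hanoi_moves(n: int) -> int:
    """Minimum moves for n disks: solve n-1 disks twice, plus one move."""
    return 1 if n == 1 else 2 * hanoi_moves(n - 1) + 1

for n in (3, 8, 12):
    print(f"{n} disks -> {hanoi_moves(n):>5} moves")  # 7, 255, 4095
```

At eight disks a model must already produce 255 moves without a single slip, which is where the compounding-error arithmetic above predicts trouble.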

Uncovering new physics in metals manufacturing

For decades, it’s been known that subtle chemical patterns exist in metal alloys, but researchers thought they were too minor to matter — or that they got erased during manufacturing. However, recent studies have shown that in laboratory settings, these patterns can change a metal’s properties, including its mechanical strength, durability, heat capacity, radiation tolerance, and more.

Now, researchers at MIT have found that these chemical patterns also exist in conventionally manufactured metals. The surprising finding revealed a new physical phenomenon that explains the persistent patterns.

In a paper published in Nature Communications today, the researchers describe how they tracked the patterns and discovered the physics that explains them. The authors also developed a simple model to predict chemical patterns in metals, and they show how engineers could use the model to tune the effect of such patterns on metallic properties, for use in aerospace, semiconductors, nuclear reactors, and more.

Nanoparticle–stem cell hybrids open a new horizon in bone regeneration

A research team in South Korea has successfully developed a novel technology that combines nanoparticles with stem cells to significantly improve 3D bone tissue regeneration. This advancement marks a step forward in the treatment of bone fractures and injuries, as well as in next-generation regenerative medicine.

The research is published in the journal ACS Biomaterials Science & Engineering.

Dr. Ki Young Kim and her team at the Korea Research Institute of Chemical Technology (KRICT), in collaboration with Professor Laura Ha at Sunmoon University, have engineered a nanoparticle–stem cell hybrid, termed a “nanobiohybrid,” by integrating mesoporous silica nanoparticles (mSiO₂ NPs) with human adipose-derived mesenchymal stem cells (hADMSCs). The resulting hybrid cells demonstrated markedly enhanced osteogenic (bone-forming) capability.

Interplay Between Aging and Glial Cell Dysfunction: Implications for CNS Health

Aging is accompanied by complex cellular and molecular changes that compromise CNS function. Among these, glial cells (astrocytes, microglia, and oligodendrocytes) play a central role in maintaining neural homeostasis, modulating synaptic activity, and supporting metabolic demands. Emerging evidence indicates that aging disrupts glial cell physiology through processes including mitochondrial dysfunction, impaired proteostasis, chronic low-grade inflammation, and altered intercellular signaling. These alterations contribute to synaptic decline, myelin degeneration, and persistent neuroinflammation. This review synthesizes current knowledge on the bidirectional relationship between aging and glial cell dysfunction, highlighting how age-related systemic and CNS-specific factors exacerbate glial impairments and, in turn, accelerate neural deterioration.
