
Supercomputer simulations test turbulence theories at record 35 trillion grid points

Using the Frontier supercomputer at the Department of Energy’s Oak Ridge National Laboratory, researchers from the Georgia Institute of Technology have performed the largest direct numerical simulation (DNS) of turbulence in three dimensions, attaining a record resolution of 35 trillion grid points. Tackling such a complex problem required the exascale (1 billion billion or more calculations per second) capabilities of Frontier, the world’s most powerful supercomputer for open science.
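For a rough sense of that scale, here is a back-of-envelope sketch in Python. It assumes a cubic grid (32,768 points per side gives about 35 trillion points in total) and double-precision storage; the actual grid layout and precision of the Frontier run are not specified here, so these are illustrative assumptions.

```python
# Back-of-envelope scale of a 35-trillion-point DNS grid.
# Cubic grid and double precision are illustrative assumptions,
# not details of the actual Frontier simulation.
n = 32_768                        # points per side (2**15)
total_points = n ** 3             # ~3.5e13, i.e. ~35 trillion
bytes_per_point = 3 * 8           # three velocity components, 8 bytes each

snapshot_bytes = total_points * bytes_per_point
print(f"grid points: {total_points:.2e}")
print(f"one velocity-field snapshot: {snapshot_bytes / 1e15:.2f} PB")
```

Under these assumptions, even a single snapshot of the velocity field occupies close to a petabyte, which is why exascale machines like Frontier are required.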

The team’s results offer new insights into the underlying properties of the turbulent fluid flows that govern the behaviors of a variety of natural and engineered phenomena—from ocean and air currents to combustion chambers and airfoils. Improving our understanding of turbulent fluctuations can lead to practical advancements in many areas, including more accurately predicting the weather and designing more efficient vehicles.

The work is published in the Journal of Fluid Mechanics.

A Simple Chemical Tweak Unlocks One of Quantum Computing’s Holy Grails

Even supercomputers can stall out on problems where nature refuses to play by everyday rules. Predicting how complex molecules behave or testing the strength of modern encryption can demand calculations that grow too quickly for classical hardware to keep up. Quantum computers are designed to tackle that kind of complexity, but only if engineers can build systems that run with extremely low error rates.

One of the most promising routes to that reliability involves a rare class of materials called topological superconductors. In plain terms, these are superconductors that also have built-in “protected” quantum behavior, which researchers hope could help shield delicate quantum information from noise. The catch is that making materials with these properties is famously difficult.

Simulations and supercomputing calculate one million orbits in cislunar space

Satellites and spacecraft in the vast region between Earth and the Moon and just beyond — called cislunar space — are crucial for space exploration, scientific advancement and national security. But figuring out exactly where to place them in a stable orbit can be a huge, computationally expensive challenge.

In an open-access database and with publicly available code, researchers at Lawrence Livermore National Laboratory (LLNL) have simulated and published one million orbits in cislunar space. The effort, enabled by supercomputing resources at the Laboratory, provides valuable data that can be used to plan missions, predict how small perturbations might change orbits and monitor space traffic.

To begin, the Space Situational Awareness Python package takes in a range of initial conditions for an orbit, like how elliptical and tilted the orbit is and how far it gets from Earth.
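The LLNL package itself is not reproduced here, but the sketch below illustrates the kind of computation behind each catalog entry: propagating one trajectory in the Earth–Moon circular restricted three-body problem (CR3BP) with SciPy. The initial state, the names, and the planar simplification are all illustrative assumptions, not the package's actual API.

```python
# Illustrative sketch: propagate one trajectory in the Earth-Moon
# circular restricted three-body problem (CR3BP). This is NOT the
# LLNL Space Situational Awareness package; names and the initial
# state below are hypothetical.
import numpy as np
from scipy.integrate import solve_ivp

MU = 0.01215  # Earth-Moon mass ratio (nondimensional)

def cr3bp(t, s, mu=MU):
    """Planar CR3BP equations of motion in the rotating frame."""
    x, y, vx, vy = s
    r1 = np.hypot(x + mu, y)        # distance to Earth
    r2 = np.hypot(x - 1 + mu, y)    # distance to Moon
    ax = 2 * vy + x - (1 - mu) * (x + mu) / r1**3 - mu * (x - 1 + mu) / r2**3
    ay = -2 * vx + y - (1 - mu) * y / r1**3 - mu * y / r2**3
    return [vx, vy, ax, ay]

# Hypothetical initial condition near the L1 region (nondimensional units).
state0 = [0.82, 0.0, 0.0, 0.15]
sol = solve_ivp(cr3bp, (0.0, 20.0), state0, rtol=1e-10, atol=1e-12)
print(f"final position: x={sol.y[0, -1]:.4f}, y={sol.y[1, -1]:.4f}")
```

Repeating a high-precision integration like this for a million different initial conditions is what makes the effort demand supercomputing resources.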

NASA researchers probe tangled magnetospheres of merging neutron stars

New simulations performed on a NASA supercomputer are providing scientists with the most comprehensive look yet into the maelstrom of interacting magnetic structures around city-sized neutron stars in the moments before they crash. The team identified potential signals emitted during the stars’ final moments that may be detectable by future observatories.

“Just before neutron stars crash, the highly magnetized, plasma-filled regions around them, called magnetospheres, start to interact strongly. We studied the last several orbits before the merger, when the entwined magnetic fields undergo rapid and dramatic changes, and modeled potentially observable high-energy signals,” said lead scientist Dimitrios Skiathas, a graduate student at the University of Patras, Greece, who is conducting research for the Southeastern Universities Research Association in Washington at NASA’s Goddard Space Flight Center in Greenbelt, Maryland.

A paper describing the findings is published in The Astrophysical Journal.

New light-based platform sets the stage for future quantum supercomputers

A light has emerged at the end of the tunnel in the long pursuit of quantum computers, which are expected to radically reduce the time needed to perform some complex calculations from thousands of years down to a matter of hours.

A team led by Stanford physicists has developed a new type of “optical cavity” that can efficiently collect single photons—the fundamental particles of light—from single atoms. These atoms act as the building blocks of a quantum computer by storing “qubits”—the quantum version of a normal computer’s bits of zeros and ones. This work enables photon collection from all qubits simultaneously for the first time.

In a study published in Nature, the researchers describe an array of 40 cavities containing 40 individual atom qubits as well as a prototype with more than 500 cavities. The findings indicate a way to ultimately create a million-qubit quantum computer network.

NASA Launches Its Most Powerful, Efficient Supercomputer

NASA is announcing the availability of its newest supercomputer, Athena, an advanced system designed to support a new generation of missions and research projects. The newest member of the agency’s High-End Computing Capability project expands the resources available to help scientists and engineers tackle some of the most complex challenges in space, aeronautics, and science.

Housed in the agency’s Modular Supercomputing Facility at NASA’s Ames Research Center in California’s Silicon Valley, Athena delivers more computing power than any other NASA system, surpassing the capabilities of its predecessors, Aitken and Pleiades, in power and efficiency. The new system, which was rolled out in January to existing users after a beta testing period, delivers over 20 petaflops of peak performance – a measurement of the number of calculations it can make per second – while reducing the agency’s supercomputing utility costs.
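As a quick, hedged illustration of what that peak figure means (peak numbers are theoretical maxima; sustained throughput is typically lower):

```python
# What "20 petaflops" means in concrete terms. The workload size
# below is hypothetical, chosen only to make the arithmetic vivid.
PEAK_FLOPS = 20e15            # 20 petaflops = 2 x 10**16 calculations/second

ops_needed = 1e18             # hypothetical job: 10**18 operations
print(f"time at peak: {ops_needed / PEAK_FLOPS:.0f} seconds")   # -> 50
```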

“Exploration has always driven NASA to the edge of what’s computationally possible,” said Kevin Murphy, chief science data officer and lead for the agency’s High-End Computing Capability portfolio at NASA Headquarters in Washington. “Now with Athena, NASA will expand its efforts to provide tailored computing resources that meet the evolving needs of its missions.”

Software allows scientists to simulate nanodevices on a supercomputer

From computers to smartphones, from smart appliances to the internet itself, the technology we use every day exists only thanks to decades of improvements in the semiconductor industry that have allowed engineers to keep miniaturizing transistors and fitting more and more of them onto integrated circuits, or microchips. This is the famous Moore’s law: the observation—rather than an actual law—that the number of transistors on an integrated circuit tends to double roughly every two years.
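As a quick worked example of that doubling cadence, the toy model below projects transistor counts under an idealized two-year doubling; the 1971 baseline (Intel’s 4004, with roughly 2,300 transistors) is used only to seed the curve, and real chips deviate from it.

```python
# Idealized Moore's law: transistor count doubles about every two years.
# The 1971 baseline is historical; the strict doubling is a toy model.
def transistors(year, base_year=1971, base_count=2_300, doubling_years=2.0):
    """Projected transistor count under a strict two-year doubling."""
    return base_count * 2 ** ((year - base_year) / doubling_years)

for y in (1971, 1991, 2011, 2021):
    print(y, f"{transistors(y):,.0f}")
```

By 2021 the idealized curve reaches tens of billions of transistors per chip, which is roughly the order of magnitude of today’s largest processors.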

The current growth of artificial intelligence, robotics and cloud computing calls for more powerful chips made with even smaller transistors, which at this point means creating components that are only a few nanometers (or millionths of millimeters) in size. At that scale, classical physics is no longer enough to predict how the device will function, because, among other effects, electrons get so close to each other that quantum interactions between them can hugely affect the performance of the device.

Elon Musk Holds Surprise Talk At The World Economic Forum In Davos

The Musk blueprint: navigating the supersonic tsunami to hyperabundance. When exponential curves multiply: understanding the triple acceleration.

On January 22, 2026, Elon Musk sat down with BlackRock CEO Larry Fink at the World Economic Forum in Davos and delivered what may be the most important articulation of humanity’s near-term trajectory since the invention of the internet.

Not because Musk said anything fundamentally new—his companies have been demonstrating this reality for years—but because he connected the dots in a way that makes the path to hyperabundance undeniable.

[Watch Elon Musk’s full WEF interview]

This is not visionary speculation.

This is engineering analysis from someone building the physical infrastructure of abundance in real time.

Chinese military says it is developing over 10 quantum warfare weapons

China’s military says it is using quantum technology to gather high-value military intelligence from public cyberspace.

The People’s Liberation Army said more than 10 experimental quantum cyber warfare tools were “under development”, many of which were being “tested in front-line missions”, according to the official newspaper Science and Technology Daily.

The project is being led by a supercomputing laboratory at the National University of Defence Technology, according to the report, with a focus on cloud computing, artificial intelligence and quantum technology.
