
Across the cosmos, many stars can be found in pairs, gracefully circling one another. Yet one of the most dramatic pairings occurs between two orbiting black holes, formed after their massive progenitor stars exploded in supernova blasts. If these black holes lie close enough together, they will ultimately collide and form an even more massive black hole.

Sometimes a black hole is orbited by a neutron star, the dense corpse of a star that also formed in a supernova explosion but contains less mass than a black hole. When these two bodies finally merge, the black hole will typically swallow the neutron star whole.

To better understand the extreme physics underlying such a grisly demise, researchers at Caltech are using supercomputers to simulate black hole–neutron star collisions. In one study appearing in The Astrophysical Journal Letters, the team, led by Elias Most, a Caltech assistant professor of theoretical astrophysics, developed the most detailed simulation yet of the violent quakes that rupture a neutron star’s surface roughly a second before the black hole consumes it.

Merging neutron stars are excellent targets for multi-messenger astronomy. This modern and still young branch of astrophysics coordinates observations of the different signals emitted by a single astrophysical source. When two neutron stars collide, they emit gravitational waves, neutrinos and radiation across the entire electromagnetic spectrum. To detect them all, researchers must supplement conventional telescopes that capture light with gravitational-wave detectors and neutrino telescopes.

Precise models and predictions of the expected signals are essential in order to coordinate these observatories, which are very different in nature.

“Predicting the multi-messenger signals from binary neutron star mergers from first principles is extremely difficult. We have now succeeded in doing just that,” says Kota Hayashi, a postdoctoral researcher in the Computational Relativistic Astrophysics department at the Max Planck Institute for Gravitational Physics (Albert Einstein Institute) in the Potsdam Science Park. “Using the Fugaku supercomputer in Japan, we have performed the longest and most complex simulation of a binary neutron star to date.”

A research study led by Oxford University has developed a powerful new technique for finding the next generation of materials needed for large-scale, fault-tolerant quantum computing. This could end a decades-long search for inexpensive materials that can host unique quantum particles, ultimately facilitating the mass production of quantum computers.

The results have been published in the journal Science.

Quantum computers could unlock unprecedented computational power far beyond current supercomputers. However, the performance of quantum computers is currently limited, due to interactions with the environment degrading the quantum properties (known as quantum decoherence). Physicists have been searching for materials resistant to quantum decoherence for decades, but the search has proved experimentally challenging.

Back in 2018, a scientist from the University of Texas at Austin proposed a protocol to generate randomness in a way that could be certified as truly unpredictable. That scientist, Scott Aaronson, now sees that idea become a working reality. “When I first proposed my certified randomness protocol in 2018, I had no idea how long I’d need to wait to see an experimental demonstration of it,” said Aaronson, who directs the Quantum Information Center at UT Austin.

The experiment was carried out on Quantinuum’s cutting-edge 56-qubit quantum computer, accessed remotely over the internet shortly after a significant upgrade to the system. The research team included experts from JPMorganChase’s technology research lab, national laboratories, and universities.

To generate certified randomness, the team used a method called random circuit sampling, or RCS. The idea is to feed the quantum computer a series of tough problems, known as challenge circuits, which it must answer by sampling from an enormous set of possible outcomes in a way that is impossible to predict. Classical supercomputers then step in to verify that the answers really did come from the quantum computer, certifying that they contain genuine randomness.
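
As a rough illustration of how RCS-based certification works, here is a minimal Python sketch. A small Haar-random unitary stands in for the real 56-qubit challenge circuits, and a linear cross-entropy score stands in for the verification step; the sizes and names are illustrative, not the team's actual protocol parameters.

```python
# Toy sketch of random-circuit-sampling (RCS) certified randomness.
# A Haar-random unitary on a few qubits stands in for the real 56-qubit
# challenge circuits; the cross-entropy check below mimics the verification
# a classical supercomputer performs at full scale.
import numpy as np

rng = np.random.default_rng(seed=1)
n_qubits, shots = 6, 500
dim = 2 ** n_qubits

def random_challenge_unitary(d):
    """Stand-in for a random challenge circuit: a Haar-random unitary via QR."""
    z = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
    q, r = np.linalg.qr(z)
    return q * (np.diag(r) / np.abs(np.diag(r)))

# 1) The verifier issues a challenge "circuit".
U = random_challenge_unitary(dim)

# 2) The "quantum computer" runs it on |0...0> and quickly returns bitstrings.
probs = np.abs(U[:, 0]) ** 2
probs /= probs.sum()
samples = rng.choice(dim, size=shots, p=probs)

# 3) The verifier recomputes the ideal output distribution classically and
#    scores the samples with linear cross-entropy (XEB): about 2 for honest
#    quantum samples, about 1 for a spoofer returning uniform random bits.
xeb_honest = dim * probs[samples].mean()
xeb_spoofed = dim * probs[rng.integers(dim, size=shots)].mean()
print(f"honest XEB ~ {xeb_honest:.2f}, spoofed XEB ~ {xeb_spoofed:.2f}")
```

The key asymmetry is that the quantum device can produce well-scoring samples quickly, while a classical spoofer returning uniform bits scores near 1 and is rejected.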

The boundaries of computing are shifting as biology fuses with technology. At the center of this new frontier is an emerging concept: a liquid computer powered by DNA. With the ability to support more than 100 billion unique circuits, this system could soon transform how we detect and diagnose disease.

While DNA is best known for encoding life, researchers are now exploring its potential as a computing tool. A team led by Dr. Fei Wang at Shanghai Jiao Tong University believes DNA can do much more than carry genetic instructions.

Their study, recently published in Nature, reveals how DNA molecules could become the core components of new computing systems. Rather than just holding genetic data, DNA could behave like wires, instructions, or even electrons inside biological circuits.
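
To make the "DNA as wires and logic" picture concrete, here is a small Python toy that treats strands as abstract signals and gates as rules that release an output strand once their inputs are present. It is an illustrative abstraction with made-up marker names, not the architecture reported in the Nature paper.

```python
# Toy abstraction of DNA strand-displacement logic: strands act like wires
# carrying present/absent signals, and gates release output strands.
from dataclasses import dataclass

@dataclass
class AndGate:
    """Releases its output strand only when all input strands are present."""
    inputs: tuple
    output: str

@dataclass
class OrGate:
    """Releases its output strand when any input strand is present."""
    inputs: tuple
    output: str

def run_circuit(gates, strands_present):
    """Let gates 'react' repeatedly until no new strands are released."""
    present = set(strands_present)
    changed = True
    while changed:
        changed = False
        for gate in gates:
            if gate.output in present:
                continue
            fires = (all(s in present for s in gate.inputs)
                     if isinstance(gate, AndGate)
                     else any(s in present for s in gate.inputs))
            if fires:
                present.add(gate.output)
                changed = True
    return present

# Hypothetical diagnostic rule: (markerA AND markerB) OR markerC -> reporter
gates = [AndGate(inputs=("markerA", "markerB"), output="intermediate"),
         OrGate(inputs=("intermediate", "markerC"), output="reporter")]
print("reporter" in run_circuit(gates, {"markerA", "markerB"}))  # True
print("reporter" in run_circuit(gates, {"markerA"}))             # False
```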

In 2025, China tech is no longer just catching up—it’s rewriting the rules. From quantum computers that outperform U.S. supercomputers to humanoid robots priced for mass adoption, China tech is accelerating at a pace few imagined. In this video, Top 10 Discoveries Official explores the 8 cutting-edge breakthroughs that prove China tech is reshaping transportation, AI, clean energy, and even brain-computer interfaces. While the West debates and regulates, China tech builds—from driverless taxis and flying cars to homegrown AI chips and thorium reactors. Watch now to understand why the future might not be written in Silicon Valley, but in Shenzhen.


Researchers at IBM and Lockheed Martin combined high-performance computing with quantum computing to accurately model the electronic structure of methylene, an ‘open-shell’ molecule that has long been a hurdle for classical computing. This is the first demonstration of the sample-based quantum diagonalization (SQD) technique on open-shell systems, a press release said.
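
In rough terms, SQD lets the quantum processor propose which electronic configurations matter most and then hands the heavy number crunching back to classical hardware. The Python sketch below mimics that division of labor on a random stand-in Hamiltonian; the sampling step is faked classically, and nothing here reflects the actual IBM/Lockheed Martin workflow.

```python
# Schematic sketch of the sample-based quantum diagonalization (SQD) idea:
# sample likely basis configurations, then classically diagonalize the
# Hamiltonian projected onto that small sampled subspace.
import numpy as np

rng = np.random.default_rng(7)

# Stand-in Hamiltonian on a small configuration space (real SQD targets
# molecular Hamiltonians such as open-shell methylene).
dim = 64
H = rng.normal(size=(dim, dim))
H = (H + H.T) / 2

evals, evecs = np.linalg.eigh(H)
exact_ground = evals[0]

# Step 1: "quantum sampling" of configurations, here faked by drawing them
# with weights from the true ground state to mimic a well-prepared ansatz.
weights = evecs[:, 0] ** 2
weights /= weights.sum()
sampled = np.unique(rng.choice(dim, size=24, p=weights))

# Step 2: classical post-processing, i.e. project H onto the sampled
# subspace and diagonalize that much smaller matrix.
H_sub = H[np.ix_(sampled, sampled)]
sqd_ground = np.linalg.eigvalsh(H_sub)[0]

print(f"exact ground energy:   {exact_ground:.4f}")
print(f"SQD subspace estimate: {sqd_ground:.4f} "
      f"(from {sampled.size} of {dim} configurations)")
```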

Quantum computing, which promises computations at speeds unimaginable by even the fastest supercomputers of today, is the next frontier of computing. By leveraging quantum states to serve as quantum bits, these machines promise to surpass the computational capabilities humanity has had access to in the past and to open up new areas of research.

We’re announcing the world’s first scalable, error-corrected, end-to-end computational chemistry workflow. With this, we are entering the future of computational chemistry.

Quantum computers are uniquely equipped to perform the computations that describe chemical reactions – computations so complex they are impossible even with the world’s most powerful supercomputers.

However, realizing this potential is a herculean task: one must first build a large-scale, universal, fully fault-tolerant quantum computer – something nobody in our industry has done yet. We are the farthest along that path, as our roadmap and our robust body of research prove. At the moment, we have the world’s most powerful quantum processors, and we are moving quickly towards universal fault tolerance. Our commitment to building the best quantum computers is proven again and again in our world-leading results.

Plasma—the electrically charged fourth state of matter—is at the heart of many important industrial processes, including those used to make computer chips and coat materials.

Simulating those plasmas can be challenging, however, because millions of math operations must be performed for thousands of points in the simulation, many times per second. Even with the world’s fastest supercomputers, scientists have struggled to create a kinetic simulation—which considers individual particles—that is detailed and fast enough to help them improve those manufacturing processes.
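
To give a sense of what a kinetic, particle-based calculation involves, here is a deliberately tiny 1D electrostatic particle-in-cell sketch in Python that pushes every macroparticle individually each time step. It is an explicit, periodic toy, not the implicit inductively-coupled-plasma solver described here.

```python
# Minimal 1D electrostatic particle-in-cell (PIC) sketch: "kinetic" means
# every macroparticle is pushed individually at every time step.
import numpy as np

rng = np.random.default_rng(0)
n_cells, n_particles, steps = 64, 20_000, 200
L, dt = 1.0, 0.02
dx = L / n_cells

# Electrons as macroparticles; ions form a fixed neutralizing background.
x = rng.uniform(0, L, n_particles)
v = rng.normal(0, 0.05, n_particles) + 0.2 * np.sign(rng.uniform(-1, 1, n_particles))
q_over_m = -1.0
weight = L / n_particles          # so the mean electron density is 1

for _ in range(steps):
    # 1) Deposit charge onto the grid (nearest-grid-point for brevity).
    cells = (x / dx).astype(int) % n_cells
    density = np.bincount(cells, minlength=n_cells) * weight / dx
    rho = 1.0 - density           # ion background minus electron density

    # 2) Solve Poisson's equation for the field via FFT (periodic domain).
    k = 2 * np.pi * np.fft.fftfreq(n_cells, d=dx)
    k[0] = 1.0                    # avoid division by zero; mean field is 0
    phi_hat = np.fft.fft(rho) / k**2
    phi_hat[0] = 0.0
    E = -np.real(np.fft.ifft(1j * k * phi_hat))

    # 3) Push every particle individually; this is the "kinetic" part.
    v += q_over_m * E[cells] * dt
    x = (x + v * dt) % L

print("final electron kinetic energy:", 0.5 * np.mean(v**2))
```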

Now, a new method offers improved stability and efficiency for kinetic simulations of what’s known as inductively coupled plasmas. The method was implemented in a simulation code developed as part of a public-private partnership between the U.S. Department of Energy’s Princeton Plasma Physics Laboratory (PPPL) and chip equipment maker Applied Materials Inc., which is already using the tool. Researchers from the University of Alberta, PPPL and Los Alamos National Laboratory contributed to the project.

A research team from the Department of Energy’s Oak Ridge National Laboratory, in collaboration with North Carolina State University, has developed a simulation capable of predicting how tens of thousands of electrons move in materials in real time, meaning in natural time rather than compute time.

The project reflects a longstanding partnership between ORNL and NCSU, combining ORNL’s expertise in time-dependent quantum methods with NCSU’s advanced quantum simulation platform developed under the leadership of Professor Jerry Bernholc.

Using the Oak Ridge Leadership Computing Facility’s Frontier supercomputer, the world’s first to break the exascale barrier, the research team developed a real-time, time-dependent density functional theory, or RT-TDDFT, capability within the open-source Real-space Multigrid, or RMG, code to model systems of up to 24,000 electrons.
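
To give a flavor of what "real-time" propagation means here, the short Python sketch below evolves a handful of orbitals of a small model Hamiltonian through physical time with a Crank-Nicolson step and records a time-dependent dipole. It is a toy under stated assumptions, not the RMG implementation or an actual density functional treatment.

```python
# Toy illustration of real-time orbital propagation, the core loop behind
# RT-TDDFT: step the occupied orbitals forward in physical time under a
# time-dependent Hamiltonian and track an observable.
import numpy as np

rng = np.random.default_rng(3)
n_basis, n_orbitals, n_steps, dt = 32, 4, 500, 0.01

# Static model Hamiltonian plus a time-dependent "laser" perturbation.
H0 = rng.normal(size=(n_basis, n_basis))
H0 = (H0 + H0.T) / 2
dipole = np.diag(np.linspace(-1, 1, n_basis))

def hamiltonian(t):
    return H0 + 0.1 * np.sin(2.0 * t) * dipole

# Occupied orbitals: the lowest eigenvectors of H0 (the "ground state").
orbitals = np.linalg.eigh(H0)[1][:, :n_orbitals].astype(complex)

eye = np.eye(n_basis)
signal = []
for step in range(n_steps):
    H = hamiltonian(step * dt)
    # Crank-Nicolson step: (I + i dt/2 H) psi(t+dt) = (I - i dt/2 H) psi(t)
    orbitals = np.linalg.solve(eye + 0.5j * dt * H,
                               (eye - 0.5j * dt * H) @ orbitals)
    # Time-dependent dipole: the kind of observable RT-TDDFT tracks.
    signal.append(np.real(np.einsum('bi,bc,ci->',
                                    orbitals.conj(), dipole, orbitals)))

print("dipole response over the last few steps:", np.round(signal[-3:], 4))
```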