
The promise of a quantum computing revolution

Integrated circuits form the basis of modern ‘classical’ computing. There can be hundreds of these microchips in a laptop or personal computer. Their ever-shrinking size means that mobile phones now have computing power thousands of times greater than that of the most powerful supercomputers built in the 1980s.

Since the 1990s, supercomputers have come into their own. The most powerful supercomputer in the world, Frontier based in the US, has a million times more computing power than top-tier gaming PCs. But these devices are still based on the classical technology of integrated circuits and are therefore limited in their capabilities.

Quantum computers promise to be able to process calculations thousands, even millions of times faster than modern computers.

OpenAI and NVIDIA Announce Strategic Partnership to Deploy 10 Gigawatts of NVIDIA Systems

OpenAI and NVIDIA today announced a letter of intent for a landmark strategic partnership to deploy at least 10 gigawatts of NVIDIA systems for OpenAI’s next-generation AI infrastructure to train and run its next generation of models on the path to deploying superintelligence. To support this deployment including data center and power capacity, NVIDIA intends to invest up to $100 billion in OpenAI as the new NVIDIA systems are deployed. The first phase is targeted to come online in the second half of 2026 using the NVIDIA Vera Rubin platform.

“NVIDIA and OpenAI have pushed each other for a decade, from the first DGX supercomputer to the breakthrough of ChatGPT,” said Jensen Huang, founder and CEO of NVIDIA. “This investment and infrastructure partnership mark the next leap forward — deploying 10 gigawatts to power the next era of intelligence.”

“Everything starts with compute,” said Sam Altman, cofounder and CEO of OpenAI. “Compute infrastructure will be the basis for the economy of the future, and we will utilize what we’re building with NVIDIA to both create new AI breakthroughs and empower people and businesses with them at scale.”

Supercomputer unveils new cell sorting principle in microfluidic channels

Researchers have discovered a novel criterion for sorting particles in microfluidic channels, paving the way for advancements in disease diagnostics and liquid biopsies. Using the supercomputer “Fugaku,” a joint team from the University of Osaka, Kansai University and Okayama University revealed that soft particles, like biological cells, exhibit unique focusing patterns compared to rigid particles.

The outcomes, published in the Journal of Fluid Mechanics, pave the way for next-generation microfluidic devices that leverage cell and particle deformability, promising highly efficient cell sorting for applications such as early cancer detection.

Microfluidics involves manipulating fluids at a microscopic scale. Controlling particle movement within microchannels is crucial for cell sorting and diagnostics, and is expected to enable early cancer detection and treatment. While prior research focused on rigid particles, which typically focus near channel walls, the behavior of deformable particles remained largely unexplored.

New approach improves accuracy of quantum chemistry simulations using machine learning

A new trick for modeling molecules with quantum accuracy takes a step toward revealing the equation at the center of a popular simulation approach, which is used in fundamental chemistry and materials science studies.

The effort to understand materials and chemistry eats up roughly a third of national lab supercomputer time in the U.S. The gold standard for accuracy is the quantum many-body problem, which can tell you what’s happening at the level of individual electrons. This is the key to chemical and material behaviors, as electrons are responsible for chemical reactivity and bonds, electrical properties and more. However, quantum many-body calculations are so difficult that scientists can only use them to calculate atoms and molecules with a handful of electrons at a time.

Density functional theory, or DFT, is easier—the computing resources needed for its calculations scale with the number of electrons cubed, rather than rising exponentially with each new electron. Instead of following each individual electron, this theory calculates electron densities—where the electrons are most likely to be located in space. In this way, it can be used to simulate the behavior of many hundreds of atoms.
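The gap between those two scaling laws is what makes DFT practical, and a back-of-the-envelope comparison makes it concrete. The snippet below is purely illustrative: the base-2 exponential and the unit prefactors are assumptions rather than measured costs, and it simply prints how quickly an exponentially scaling exact treatment outgrows a cubically scaling one.

```python
# Illustrative only: compares the rough cost growth described above --
# exponential scaling for an exact many-body treatment versus cubic
# scaling for DFT. The constants are arbitrary; only the trend matters.

def many_body_cost(n_electrons: int) -> float:
    """Assumed exponential cost model for an exact many-body calculation."""
    return 2.0 ** n_electrons

def dft_cost(n_electrons: int) -> float:
    """Assumed cubic cost model for a DFT calculation."""
    return float(n_electrons) ** 3

for n in (10, 20, 50, 100, 500):
    ratio = many_body_cost(n) / dft_cost(n)
    print(f"{n:>4} electrons: exponential/cubic cost ratio ~ {ratio:.1e}")
```

Even at a few dozen electrons the exponential term dominates by many orders of magnitude, which is why exact many-body methods stall at a handful of electrons while DFT can handle many hundreds of atoms.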

Physicists create new electrically controlled silicon-based quantum device

A team of scientists at Simon Fraser University’s Quantum Technology Lab and leading Canada-based quantum company Photonic Inc. have created a new type of silicon-based quantum device controlled both optically and electrically, marking the latest breakthrough in the global quantum computing race.

The research, published in the journal Nature Photonics, reveals new diode nanocavity devices for electrical control over silicon color center qubits.

The devices have achieved the first-ever demonstration of an electrically-injected single-photon source in silicon. The breakthrough clears another hurdle toward building a quantum computer—which has enormous potential to provide computing power well beyond that of today’s supercomputers and advance fields like chemistry, materials science, medicine and cybersecurity.

Europe’s first exascale supercomputer is now up and running, using 24,000 Nvidia GH200 Superchips to perform more than one quintillion operations per second with nearly 1,000,000 terabytes of storage

Yeah, but can it play… y’know what, I’m not even gonna go there.

Brain cells simulated in the electronic brain

Europe now has an exascale supercomputer which runs entirely on renewable energy. Of particular interest: one of the 30 inaugural projects for the machine focuses on realistic simulations of biological neurons (see https://www.fz-juelich.de/en/news/effzett/2024/brain-research)

https://www.nature.com/articles/d41586-025-02981-1


Large language models (LLMs) work with artificial neural networks inspired by the way the brain works. Dr. Thorsten Hater (JSC) focuses on the natural counterparts of those models: the neurons that communicate with each other in the human brain. He wants to use the exascale computer JUPITER to perform even more realistic simulations of the behaviour of individual neurons.

Many models treat a neuron merely as a point that is connected to other points. The spikes, or electrical signals, travel along these connections. “Of course, this is overly simplified,” says Hater. “In our model, the neurons have a spatial extension, as they do in reality. This allows us to describe many processes in detail on the molecular level. We can calculate the electric field across the entire cell. And we can thus show how signal transmission varies right down to the individual neuron. This gives us a much more realistic picture of these processes.”

For the simulations, Hater uses a program called Arbor. This allows more than two million individual cells to be interconnected computationally. Such models of natural neural networks are useful, for example, in the development of drugs to combat neurodegenerative diseases like Alzheimer’s. The physicist and software developer would like to use the exascale computer to simulate and study the changes that take place in neurons in the brain.
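To make the idea of neurons with spatial extension concrete, here is a minimal sketch. It is plain NumPy, not Arbor’s actual API and not the team’s model: the cell is discretized into compartments and a passive cable equation is stepped forward in time, so the membrane voltage can differ between one end of the cell and the other. Every parameter value is an illustrative assumption.

```python
# Minimal sketch, NOT Arbor's API and not the team's model: a passive
# cable equation stepped with explicit finite differences, illustrating
# what "spatial extension" means -- the membrane voltage is tracked per
# compartment, so it varies along the cell instead of being one number.
# All parameter values are illustrative assumptions.
import numpy as np

n_comp = 100            # compartments along a 1 mm dendrite
dx = 10e-6              # compartment length (m)
dt = 1e-7               # time step (s), small enough for explicit stability
c_m = 1e-2              # membrane capacitance per area (F/m^2)
g_leak = 1.0            # leak conductance per area (S/m^2)
r_axial = 1.0           # axial resistivity (ohm*m)
radius = 2e-6           # cable radius (m)
e_leak = -65e-3         # leak reversal potential (V)

# Axial coupling term for a cylindrical cable, folded into units of S/m^2.
coupling = radius / (2.0 * r_axial * dx**2)

v = np.full(n_comp, e_leak)   # membrane potential per compartment (V)
i_inj = np.zeros(n_comp)
i_inj[0] = 1.0                # steady current density injected at one end (A/m^2)

for _ in range(50_000):       # 5 ms of simulated time
    v_pad = np.pad(v, 1, mode="edge")               # sealed (no-flux) ends
    laplacian = v_pad[2:] - 2.0 * v + v_pad[:-2]
    dvdt = (coupling * laplacian - g_leak * (v - e_leak) + i_inj) / c_m
    v = v + dt * dvdt

print(f"voltage at injected end: {v[0] * 1e3:.1f} mV")
print(f"voltage at far end:      {v[-1] * 1e3:.1f} mV")
```

A production simulator like Arbor handles this compartment-level bookkeeping at far larger scale, coupling millions of such cells together on machines like JUPITER.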
