
Webb maps the mysterious upper atmosphere of Uranus

For the first time, an international team of astronomers has mapped the vertical structure of Uranus’s upper atmosphere, uncovering how temperature and charged particles vary with height across the planet. Using Webb’s NIRSpec instrument, the team observed Uranus for nearly a full rotation, detecting the faint glow from molecules high above the clouds.

These unique data provide the most detailed portrait yet of where the planet’s auroras form, how they are influenced by its unusually tilted magnetic field, and how Uranus’s atmosphere has continued to cool over the past three decades. The results, published in Geophysical Research Letters, offer a new window into how ice-giant planets distribute energy in their upper layers.

Led by Paola Tiranti of Northumbria University in the United Kingdom, the study mapped out the temperature and density of ions in the atmosphere extending up to 5,000 kilometers above Uranus’s cloud tops, a region called the ionosphere where the atmosphere becomes ionized and interacts strongly with the planet’s magnetic field. The measurements show that temperatures peak between 3,000 and 4,000 kilometers, while ion densities reach their maximum around 1,000 kilometers, revealing clear longitudinal variations linked to the complex geometry of the magnetic field.

Could a recently reported high-energy neutrino event be explained by an exploding primordial black hole?

The KM3NeT collaboration is a large research group that operates a network of neutrino telescopes in the deep Mediterranean Sea, with the aim of detecting high-energy neutrino events. These are rare and fleeting interactions involving neutrinos, particles with extremely low mass that are sometimes referred to as “ghost particles.”

Recently, the KM3NeT collaboration reported an extremely high-energy neutrino event, which carried an energy of approximately 220 PeV (peta-electron volts). This is one of the most energetic events recorded to date, and its cosmic origin has not yet been identified.

Researchers at Universidade de São Paulo and Universidad Autónoma de Madrid carried out a theoretical study exploring one proposed explanation for this remarkable neutrino event, namely that it originated from the explosion of a primordial black hole near Earth.

Quantum entanglement pushes optical clocks to new precision

By replacing single atoms with an entangled pair of ions, physicists in Germany have demonstrated unprecedented stability in an optical clock. Publishing their results in Physical Review Letters, a team led by Kai Dietze at the German National Metrology Institute hopes the approach could help usher in a new generation of optical clocks, opening up new possibilities in precision experiments and metrology.

To measure the passing of time, every clock works by counting oscillations of some reference frequency—whether it’s the swinging pendulum of a clocktower, or the vibrations of an electrified quartz crystal in a modern digital clock. Timekeeping accuracy is directly tied to how reliable these oscillations are: while a pendulum can accrue noticeable variations in its swing, vibrating quartz is far more reliable, making quartz clocks far more accurate.
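The counting principle described above reduces to simple arithmetic: elapsed time equals the number of oscillations counted divided by the reference frequency. A minimal sketch, using the standard 32,768 Hz quartz watch crystal as the reference:

```python
# Elapsed time = counted cycles / reference frequency.
# 32,768 Hz is the standard oscillation frequency of a quartz watch crystal.
f_quartz = 32_768             # oscillations per second
cycles_counted = 32_768 * 60  # one minute's worth of oscillations
elapsed = cycles_counted / f_quartz
print(elapsed)  # 60.0 seconds
```

Any drift in the reference frequency translates directly into timekeeping error, which is why the stability of the oscillator sets the accuracy of the clock.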

Today, optical clocks are the most precise timekeepers ever achieved. In these devices, atoms are first “probed” by an ultra-stable laser tuned close to a specific optical transition. When the laser frequency matches the energy difference between two electronic states, an electron is excited to a higher energy level.
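The resonance condition described here can be written as nu = ΔE / h, or equivalently nu = c / λ. A minimal numeric sketch, using the strontium clock transition near 698 nm purely as an illustrative wavelength (it is not necessarily the transition used in this study, which works with trapped ions):

```python
# Resonance condition for an optical clock transition: the probe laser frequency
# must equal the transition energy difference divided by Planck's constant.
h = 6.62607015e-34   # Planck constant, J*s (exact SI value)
c = 299_792_458.0    # speed of light, m/s (exact SI value)

wavelength = 698e-9       # m; strontium clock transition, used here only as an example
nu = c / wavelength       # transition frequency in Hz
delta_E = h * nu          # corresponding energy difference in joules
print(f"{nu/1e12:.1f} THz")  # ~429.5 THz, i.e. hundreds of trillions of cycles per second
```

Because optical frequencies are some ten thousand times higher than the microwave frequencies of cesium clocks, each second is sliced into far finer intervals, which is the root of the precision advantage.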

Lab-in-the-loop framework enables rapid evolution of complex multi-mutant proteins

The search space for protein engineering grows exponentially with complexity. A protein of just 100 amino acids has 20^100 possible variants—more combinations than atoms in the observable universe. Traditional engineering methods might test hundreds of variants but limit exploration to narrow regions of the sequence space. Recent machine learning approaches enable broader searches through computational screening. However, these approaches still require tens of thousands of measurements, or 5–10 iterative rounds.
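The combinatorics behind that claim is direct to check: 20 amino-acid choices at each of 100 positions, compared against the common order-of-magnitude estimate of ~10^80 atoms in the observable universe:

```python
# The "more variants than atoms" arithmetic: 20 amino acids at each of 100 positions.
variants = 20 ** 100   # number of possible 100-residue sequences
atoms_est = 10 ** 80   # common order-of-magnitude estimate for atoms in the observable universe

print(len(str(variants)))     # 131 digits, i.e. roughly 1e130
print(variants > atoms_est)   # True: the sequence space dwarfs the atom count
```

Exhaustive search is therefore impossible; any practical method must choose a vanishingly small subset of variants to build and test.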

With the advent of these protein foundation models, the bottleneck for protein engineering swings back to the lab. For a single protein engineering campaign, researchers can only efficiently build and test hundreds of variants. What is the best way to choose those hundreds so as to most effectively uncover an evolved protein with substantially increased function? To address this problem, researchers have developed MULTI-evolve, a framework for efficient protein evolution that applies machine learning models trained on datasets of ~200 variants focused specifically on pairs of function-enhancing mutations.

Published in Science, this work represents Arc Institute’s first lab-in-the-loop framework for biological design, where computational prediction and experimental design are tightly integrated from the outset, reflecting a broader investment in AI-guided research.

Particles don’t always go with the flow (and why that matters)

It is commonly assumed that tiny particles just go with the flow as they make their way through soil, biological tissue, and other complex materials. But a team of Yale researchers led by Professor Amir Pahlavan shows that even gentle chemical gradients, such as a small change in salt concentration, can dramatically reshape how particles move through porous materials. Their results are published in Science Advances.

How colloids—small particles such as fine clays, microbes, or engineered particles—move through porous materials such as soil, filters, and biological tissue can have significant and wide-ranging effects on everything from environmental cleanups to agriculture.

It’s long been known that chemical gradients—that is, gradual changes in the concentration of salt or other chemicals—can drive colloids to migrate directionally, a phenomenon known as diffusiophoresis. But it was often assumed that this effect would matter only when there was little or no flow, because phoretic speeds are typically orders of magnitude smaller than average flow speeds in porous media. Experiments set up in Pahlavan’s lab demonstrated a very different outcome.
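The standard drift law for diffusiophoresis takes the form u = Γ · d(ln c)/dx, where Γ is the diffusiophoretic mobility and c the solute concentration. A hedged numeric sketch, with mobility and gradient values chosen for illustration only (they are not taken from the paper):

```python
# Diffusiophoretic drift: u = Gamma * d(ln c)/dx. Because the law depends on the
# logarithmic gradient, only the concentration *ratio* matters, not absolute units.
# All numbers below are illustrative, typical-order values, not from the study.
import math

gamma = 300e-12             # diffusiophoretic mobility, m^2/s (typical order for salt gradients)
c_high, c_low = 10.0, 1.0   # solute concentrations at the two ends (arbitrary units)
length = 3e-3               # distance over which the concentration changes, m

grad_ln_c = (math.log(c_high) - math.log(c_low)) / length  # d(ln c)/dx, 1/m
u = gamma * grad_ln_c                                      # drift speed, m/s
print(f"drift ~ {u * 1e6:.2f} um/s")  # ~0.23 um/s
```

Speeds of a fraction of a micrometer per second look negligible next to typical pore-scale flow speeds, which is exactly why the effect was long assumed to matter only in near-stagnant conditions—an assumption the Yale experiments challenge.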

Microscopic mirrors for future quantum networks: A new way to make high-performance optical resonators

Researchers in the Harvard John A. Paulson School of Engineering and Applied Sciences (SEAS) and the Faculty of Arts and Sciences have devised a new way to make some of the smallest, smoothest mirrors ever created for controlling single particles of light, known as photons. These mirrors could play key roles in future quantum computers, quantum networks, integrated lasers, environmental sensing equipment, and more.

A team from the labs of Marko Lončar, the Tiantsai Lin Professor of Electrical Engineering at SEAS; Mikhail Lukin, the Joshua and Beth Friedman University Professor in the Department of Physics; and Kiyoul Yang, assistant professor of electrical engineering at SEAS, has described its new method for making high-performance, curved optical mirrors in a study published in Optica.

Using two such mirrors to trap light between them, the team demonstrated state-of-the-art optical resonators that can control light at near-infrared wavelengths, which is important for manipulating single atoms in quantum computing applications.

Machine learning algorithm fully reconstructs LHC particle collisions

The CMS Collaboration has shown, for the first time, that machine learning can be used to fully reconstruct particle collisions at the LHC. This new approach can reconstruct collisions more quickly and precisely than traditional methods, helping physicists better understand LHC data. The paper has been submitted to the European Physical Journal C and is currently available on the arXiv preprint server.

Each proton–proton collision at the LHC sprays out a complex pattern of particles that must be carefully reconstructed to allow physicists to study what really happened. For more than a decade, CMS has used a particle-flow (PF) algorithm, which combines information from the experiment’s different detectors, to identify each particle produced in a collision. Although this method works remarkably well, it relies on a long chain of hand-crafted rules designed by physicists.

The new CMS machine-learning-based particle-flow (MLPF) algorithm approaches the task fundamentally differently, replacing much of the rigid hand-crafted logic with a single model trained directly on simulated collisions. Instead of being told how to reconstruct particles, the algorithm learns how particles look in the detectors, much as humans learn to recognize faces without memorizing explicit rules.

Measuring chaos: Researchers quantify the quantum butterfly effect

For the first time, researchers in China have accurately quantified how chaos increases in a quantum many-body system as it evolves over time. Combining experiments and theory, a team led by Yu-Chen Li at the University of Science and Technology of China showed that the level of chaos grows exponentially when time reversal is applied to these systems—matching predictions of their extreme sensitivity to errors. The research has been published in Physical Review Letters.

The butterfly effect is a well-known expression of chaos theory. It describes how a complex system can quickly become unpredictable as it evolves: make just a few small errors when specifying the system’s starting conditions, and it may look completely different from your calculations a short time later.
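The sensitivity described here is easy to demonstrate with a classical toy system (this is an illustration of the butterfly effect in general, not the quantum experiment itself). Two trajectories of the logistic map started a hair's breadth apart diverge roughly exponentially until they look completely unrelated:

```python
# Classical illustration of the butterfly effect: two logistic-map trajectories
# whose initial conditions differ by only 1e-10 diverge exponentially fast.
def logistic(x, r=3.9):
    """One step of the logistic map, chaotic for r = 3.9."""
    return r * x * (1.0 - x)

x, y = 0.4, 0.4 + 1e-10  # identical except for a tiny error in the starting condition
for step in range(40):
    x, y = logistic(x), logistic(y)

print(f"{abs(x - y):.3e}")  # the 1e-10 error has grown by many orders of magnitude
```

The exponential rate of this divergence (the Lyapunov exponent) is the classical counterpart of the growth rate the USTC team quantified in the quantum setting.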

This effect is especially relevant in many-body quantum systems, where entanglement creates intricate webs of interconnection between particles—even in relatively small systems. As the system evolves, information about its initial state becomes increasingly dispersed across these connections.

Record-breaking photons at telecom wavelengths

A team of researchers from the University of Stuttgart and the Julius-Maximilians-Universität Würzburg led by Prof. Stefanie Barz (University of Stuttgart) has demonstrated a source of single photons that combines on-demand operation with record-high photon quality in the telecommunications C-band—a key step toward scalable photonic quantum computation and quantum communication. “The lack of a high-quality on-demand C-band photon source has been a major problem in quantum optics laboratories for over a decade—our new technology now removes this obstacle,” says Prof. Stefanie Barz.

The key: Identical photons on demand

In everyday life, distinguishing features may often be desirable. Few want to be exactly like everyone else. When it comes to quantum technologies, however, complete indistinguishability is the name of the game. Quantum particles such as photons that are identical in all their properties can interfere with each other—much as in noise-canceling headphones, where sound waves that are precisely inverted copies of the incoming noise cancel out the background.

When identical photons are made to act in synchrony, then the probability that certain measurement outcomes occur can be either boosted or decreased. Such quantum effects give rise to powerful new phenomena that lie at the heart of emerging technologies such as quantum computing and quantum networking. For these technologies to become feasible, high-quality interference between photons is essential.
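The standard benchmark for this kind of two-photon interference is the Hong-Ou-Mandel effect, sketched below as a general illustration (the article does not name it explicitly). When two photons meet at a 50:50 beam splitter, the probability of detecting one photon at each output port is (1 − V)/2, where V is the photons' mode overlap, i.e. their degree of indistinguishability:

```python
# Hong-Ou-Mandel interference at a 50:50 beam splitter: identical photons "bunch"
# into the same output port, suppressing coincidences between the two ports.
def coincidence_probability(overlap):
    """Probability of one photon at each output, given mode overlap V in [0, 1]."""
    if not 0.0 <= overlap <= 1.0:
        raise ValueError("mode overlap must lie in [0, 1]")
    return (1.0 - overlap) / 2.0

print(coincidence_probability(0.0))  # fully distinguishable photons: 0.5 (no interference)
print(coincidence_probability(1.0))  # perfectly identical photons: 0.0 (complete bunching)
```

Measured coincidence suppression is therefore a direct gauge of photon quality, which is why "record-high photon quality" translates into near-perfect two-photon interference.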
