Once found only in science fiction, lasers are now everyday objects used in research, health care and even just for fun. Long limited to relatively low-energy light, lasers are now available at wavelengths from microwaves through X-rays, opening up a range of downstream applications.
In a study published in Nature, an international collaboration led by scientists at the University of Wisconsin–Madison has generated the shortest hard X-ray pulses to date through the first demonstration of strong lasing phenomena.
The resulting pulses could enable applications ranging from quantum X-ray optics to visualizing electron motion inside molecules.
A research team led by Prof. Zhang Tianshu at the Hefei Institutes of Physical Science of the Chinese Academy of Sciences has developed a compact, all-solid-state continuous-wave (CW) single-longitudinal-mode (SLM) laser with high frequency stability using iodine-based frequency locking, advancing its use in atmospheric remote sensing and environmental monitoring. The study is published in Optics & Laser Technology.
CW SLM lasers are widely used in areas such as laser amplification, gravitational wave detection, and quantum optics. They also play a key role in atmospheric remote sensing and environmental monitoring. These applications require not only SLM laser output but also high frequency stability, which current semiconductor and fiber lasers struggle to provide due to limited environmental adaptability.
In this study, the team introduced a ring resonator structure combined with iodine molecular absorption frequency locking technology. By locking the laser frequency to the flank of specific iodine absorption lines and employing feedback control to adjust the resonator length, they achieved long-term frequency stability.
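The flank-locking loop described above can be sketched in miniature: measure transmission through the iodine cell, compare it with a setpoint partway down an absorption line, and nudge the frequency (in the real system, the resonator length) until the setpoint is held. Everything below is a toy model; the Lorentzian line shape, linewidth, setpoint, and gain are invented for illustration and are not the team's parameters.

```python
def iodine_transmission(freq_offset_mhz):
    # Toy Lorentzian absorption dip centered at 0 MHz (hypothetical line)
    width = 50.0  # MHz, assumed linewidth for illustration
    return 1.0 - 0.5 / (1.0 + (freq_offset_mhz / width) ** 2)

def flank_lock(initial_offset_mhz, setpoint=0.75, gain=40.0, steps=200):
    """Servo the laser frequency onto the flank of the absorption line:
    a simple integral controller standing in for the resonator-length
    feedback, pushing transmission toward the setpoint each iteration."""
    offset = initial_offset_mhz
    for _ in range(steps):
        error = iodine_transmission(offset) - setpoint
        offset -= gain * error  # move frequency toward the setpoint flank
    return offset

locked = flank_lock(initial_offset_mhz=120.0)
print(round(iodine_transmission(locked), 3))  # settles at the 0.75 setpoint
```

In the real system the error signal is derived optically and the actuator is the cavity length; the loop structure, however, is the same: a measured deviation from a reference point on the line is fed back to hold the frequency.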
After decades of theorising, scientists have demonstrated how light interacts with the vacuum, recreating a bizarre phenomenon predicted by quantum physics.
Oxford University physicists ran simulations to test how intense laser beams alter the vacuum, a state once thought to be empty but which quantum physics predicts is full of fleeting particle pairs.
Classical physics predicts that light beams pass through each other undisturbed. But quantum mechanics holds that even what we know as vacuum is always brimming with fleeting particles, which pop in and out of existence, causing light to be scattered.
We’ve questioned that model and tackled the problem from a different angle: by looking inward instead of outward.
Instead of starting with an expanding universe and asking how it began, we considered what happens when an over-density of matter collapses under gravity.
Prof Gaztanaga explained that the theory developed by his team operates within the principles of quantum mechanics and that the model can be tested scientifically.
How can the strange properties of quantum particles be exploited to perform extremely accurate measurements? This question is at the heart of the research field of quantum metrology. One example is the atomic clock, which uses the quantum properties of atoms to measure time much more accurately than would be possible with conventional clocks.
However, the fundamental laws of quantum physics always involve a certain degree of uncertainty. Some randomness or a certain amount of statistical noise has to be accepted. This results in fundamental limits to the accuracy that can be achieved. Until now, it seemed to be an immutable law that a clock twice as accurate requires at least twice as much energy.
Now a team of researchers from TU Wien, Chalmers University of Technology in Sweden, and the University of Malta has demonstrated that special tricks can be used to increase accuracy exponentially. The crucial point is to use two different time scales, much as a clock has both a second hand and a minute hand.
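The two-hands picture can be made concrete with a purely classical toy: each additional "hand" acts like a digit in a positional readout, so k hands of period 60 distinguish 60^k instants, exponentially many in the number of timescales. This only illustrates the analogy used in the article, not the quantum protocol itself.

```python
def read_clock(total_seconds, hands=2, base=60):
    """Read a time using several 'hands', each resolving one base-60
    digit: the slowest hand gives the coarse count, the fastest the
    fine count. With k hands, base**k instants are distinguishable."""
    digits = []
    for _ in range(hands):
        digits.append(total_seconds % base)  # what this hand points at
        total_seconds //= base               # hand one scale slower
    return digits[::-1]  # slowest (most significant) hand first

print(read_clock(125))   # [2, 5]: 2 minutes, 5 seconds
print(read_clock(3599))  # [59, 59]: both hands at their last position
```

Two hands of 60 positions each resolve 3600 instants, where a single 60-position hand covering the same span would resolve only 60; adding hands multiplies, rather than adds, the number of distinguishable readings.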
Technology veteran IBM on Tuesday laid out a plan to have a “practical” quantum computer tackling big problems before the end of this decade.
Current quantum computers are still experimental and face significant challenges, including high error rates. Companies like IBM, Google, and others are working to build more stable and scalable quantum systems.
Real-world innovations that quantum computing has the potential to tackle include developing better fuels, materials, pharmaceuticals, or even new elements. However, delivering on that promise has always seemed some way off.
Xanadu has achieved a significant milestone in the development of scalable quantum hardware by generating error-resistant photonic qubits on an integrated chip platform. A foundational result in Xanadu’s roadmap, this first-ever demonstration of such qubits on a chip is published in Nature.
This advance builds on Xanadu’s recent announcement of the Aurora system, which demonstrated—for the first time—all key components required to build a modular, networked, and scalable photonic quantum computer. With this latest demonstration of robust qubit generation using silicon-based photonic chips, Xanadu further strengthens the scalability pillar of its architecture.
The quantum states produced in this experiment, known as Gottesman–Kitaev–Preskill (GKP) states, consist of superpositions of many photons to encode information in an error-resistant manner—an essential requirement for future fault-tolerant quantum computers. These states allow logic operations to be performed using deterministic, room-temperature-compatible techniques, and they are uniquely well-suited for networking across chips using standard fiber connections.
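In position space, an approximate GKP state looks like a comb of Gaussian peaks spaced by 2√π under a Gaussian envelope; displacements smaller than the grid spacing can then be detected and corrected, which is the source of the error resistance. A minimal stdlib-only sketch of that wavefunction follows; the width parameter `delta` and the peak count are illustrative choices, not values from the paper.

```python
import math

def gkp_wavefunction(x, delta=0.3, n_peaks=10):
    """Approximate position-space wavefunction of a square-lattice
    Gottesman-Kitaev-Preskill (GKP) state: Gaussian peaks spaced by
    2*sqrt(pi), damped by a Gaussian envelope. delta sets the peak
    width (delta -> 0 approaches the ideal, unnormalizable state)."""
    spacing = 2.0 * math.sqrt(math.pi)
    psi = 0.0
    for n in range(-n_peaks, n_peaks + 1):
        mu = n * spacing
        envelope = math.exp(-0.5 * (delta * mu) ** 2)  # damp distant peaks
        psi += envelope * math.exp(-((x - mu) ** 2) / (2 * delta ** 2))
    return psi

# The wavefunction is concentrated on grid points and tiny in between:
on_peak = gkp_wavefunction(2.0 * math.sqrt(math.pi))  # at a lattice point
between = gkp_wavefunction(math.sqrt(math.pi))        # midway between peaks
print(on_peak > 100 * between)  # True
```

The grid structure is what makes the encoding measurable with continuous-variable (homodyne-style) techniques, consistent with the deterministic, room-temperature-compatible operations mentioned above.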
IBM has just unveiled its boldest quantum computing roadmap yet: Starling, the first large-scale, fault-tolerant quantum computer, slated for 2029. Capable of running 20,000X more operations than today’s quantum machines, Starling could unlock breakthroughs in chemistry, materials science, and optimization.
According to IBM, this is not just a pie-in-the-sky roadmap: they actually have the ability to make Starling happen.
In this exclusive conversation, I speak with Jerry Chow, IBM Fellow and Director of Quantum Systems, about the engineering breakthroughs that are making this possible… especially a radically more efficient error correction code and new multi-layered qubit architectures.
We cover:
- The shift from millions of physical qubits to manageable logical qubits.
- Why IBM is using quantum low-density parity check (qLDPC) codes.
- How modular quantum systems (like Kookaburra and Cockatoo) will scale the technology.
- Real-world quantum-classical hybrid applications already happening today.
- Why now is the time for developers to start building quantum-native algorithms.
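The qLDPC idea is easiest to see in its classical ancestor: each parity check touches only a few bits ("low density"), and the pattern of violated checks, the syndrome, localizes errors. Below is a toy classical example; the check matrix is invented for illustration, and the quantum codes measure stabilizers rather than classical parities.

```python
# Sparse parity-check matrix: each row (check) touches only 3 of 6 bits.
H = [
    [1, 1, 0, 1, 0, 0],
    [0, 1, 1, 0, 1, 0],
    [1, 0, 1, 0, 0, 1],
    [0, 0, 0, 1, 1, 1],
]

def syndrome(word):
    """Each syndrome bit is the parity of the few bits its check touches."""
    return [sum(h * b for h, b in zip(row, word)) % 2 for row in H]

codeword = [0, 0, 0, 0, 0, 0]  # the all-zero word satisfies every check
print(syndrome(codeword))       # [0, 0, 0, 0]: no error detected

flipped = codeword[:]
flipped[2] ^= 1                 # single bit-flip error on bit 2
print(syndrome(flipped))        # [0, 1, 1, 0]: the two checks touching bit 2 fire
```

Because each check involves few bits, measuring it disturbs little and the decoding stays tractable; the quantum versions keep this sparsity while protecting against both bit-flip and phase errors, which is why they need far fewer physical qubits per logical qubit.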
00:00 Introduction to the Future of Computing
01:04 IBM’s Jerry Chow
01:49 Quantum Supremacy
02:47 IBM’s Quantum Roadmap
04:03 Technological Innovations in Quantum Computing
05:59 Challenges and Solutions in Quantum Computing
09:40 Quantum Processor Development
14:04 Quantum Computing Applications and Future Prospects
20:41 Personal Journey in Quantum Computing
24:03 Conclusion and Final Thoughts
The National Institute of Information and Communications Technology (NICT) of Japan, in collaboration with Sony Semiconductor Solutions Corporation (Sony), has developed the world’s first practical surface-emitting laser that employs quantum dots (QDs) as the optical gain medium for use in optical fiber communication systems.
This achievement was made possible by NICT’s high-precision crystal growth technology and Sony’s advanced semiconductor processing technology. The surface-emitting laser developed in this study incorporates nanoscale semiconductor structures called quantum dots as the light-emitting material. This innovation not only enables smaller, lower-power light sources for optical fiber communication systems but also offers potential cost reductions through mass production and higher output through integration.
The results of this research are published in Optics Express.