This Magnetic Field Trick Creates Entirely New Forms of Matter

Scientists have shown that changing magnetic fields in precise ways can create exotic quantum matter that does not normally exist. The discovery could eventually lead to more reliable quantum technologies and powerful new computing systems.

Quantum technology is widely seen as one of the most promising future tools for processing massive and complicated amounts of information. Although most quantum systems are still confined to laboratories and research facilities, scientists are steadily working toward applications that could eventually impact industries across the economy.

Magnetic fields and exotic quantum states.

How Unknowable Math Can Help Hide Secrets

Perhaps the most famous example comes from the logician Kurt Gödel. His celebrated result, one of two “incompleteness theorems” he published in 1931, established that for any reasonable set of basic mathematical assumptions, called axioms, it’s impossible to prove that the axioms won’t eventually lead to contradictions. Though mathematicians continued their research much as they had before, they would never again be certain that their rules were self-consistent.

More than 50 years after Gödel’s theorem, cryptographers devised a radical new proof method in which unknowability played a very different role. Proofs based on this technique, called zero-knowledge proofs, can convince even the most skeptical audience that a statement is true without revealing why it’s true.
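The flavor of the idea can be shown with a classic textbook protocol that long predates Ilango's work: a toy Schnorr-style interactive proof, in which a prover convinces a verifier that it knows a secret exponent x without revealing it. A minimal sketch in Python, with illustrative parameters and no claim to production-grade cryptography:

```python
import secrets

# Toy interactive proof of knowledge of a discrete log: the prover knows x
# with y = g^x mod p and convinces a verifier of that without revealing x.
# Illustrative textbook sketch, not the construction described in the article.
p = 2**127 - 1                 # a Mersenne prime; fine for a demo, not for real use
g = 3
x = secrets.randbelow(p - 1)   # prover's secret
y = pow(g, x, p)               # public value

def prove_and_verify() -> bool:
    r = secrets.randbelow(p - 1)    # prover's fresh randomness
    a = pow(g, r, p)                # commitment sent to the verifier
    c = secrets.randbelow(p - 1)    # verifier's random challenge
    s = (r + c * x) % (p - 1)       # response; uniformly distributed on its own
    # Verifier checks g^s == a * y^c (mod p), which holds iff the response
    # is consistent with knowing x.
    return pow(g, s, p) == (a * pow(y, c, p)) % p

print(all(prove_and_verify() for _ in range(10)))  # prints True
```

Each round exposes only a commitment, a random challenge, and a response that is statistically independent of the secret; the verifier ends up convinced that the prover knows x while learning nothing about x itself.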

These two flavors of unknowability, which originated decades apart and in different fields, were long considered completely unrelated. Now the computer scientist Rahul Ilango has established a striking connection between them. While still a graduate student, he devised a new type of zero-knowledge proof in which secrecy stems from the fundamental limits of math. Ilango’s approach gets around limitations of zero-knowledge proofs that researchers have long thought insurmountable, pushing the boundaries of what such a proof can be. The work has also spurred researchers to explore other intriguing links between mathematical logic and cryptography.

Intel Resurrects On-Package Memory With Razor Lake-AX, Loading Up LPDDR6 to Hunt Down AMD’s Medusa Halo by 2028

Intel’s next-generation Razor Lake-AX chips will compete directly against AMD’s Medusa Halo while featuring on-package memory.

On-package memory was last used by Intel in its Lunar Lake SoCs. Those chips were aimed at low-power mobile platforms, and while they offered solid performance within a 30W power budget, Intel’s next on-package memory solution will target a far more powerful class of chips.

As per Haze2K1 on X, Intel Razor Lake-AX SoCs will feature on-package memory. This is a big deal, as moving the DRAM closer to the chip itself has several advantages, enabling more efficient and compact PCs. The type of memory isn’t disclosed, but it is likely either LPDDR5X or the next-gen LPDDR6 standard.

Quantum circuit test finally exposes what has been warping performance

Quantum computers could someday solve pressing problems that are too convoluted for classical computers, such as modeling complex molecular interactions to streamline drug discovery and materials development.

But to build a superconducting quantum computer that is large and resilient enough for real-world applications, scientists must precisely engineer thousands of quantum circuits so they perform operations with the lowest possible error rate.
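The pressure to minimize per-gate error comes from how errors compound across a circuit. A back-of-envelope sketch with illustrative numbers, assuming gate errors are independent:

```python
# Why per-gate error rates matter at scale (illustrative numbers).
def circuit_success(per_gate_error: float, n_gates: int) -> float:
    # Assuming independent errors, the probability that every gate in a
    # circuit executes correctly is (1 - e)^N, which collapses as N grows.
    return (1 - per_gate_error) ** n_gates

for err in (1e-2, 1e-3, 1e-4):
    print(f"error {err:.0e}: 1,000-gate circuit succeeds "
          f"with probability {circuit_success(err, 1000):.3f}")
```

At a 1% error rate, a 1,000-gate circuit almost never finishes error-free; at 0.01%, it succeeds roughly nine times out of ten, which is why small reductions in error rate pay off so heavily at scale.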

To help scientists design more predictable circuits, researchers from MIT and Lincoln Laboratory developed a technique to measure a property that can unexpectedly cause a superconducting quantum circuit to deviate from its expected behavior. Their analysis traced these distortions to second-order harmonic corrections, which can lead to underperforming circuit architectures.

Method for measuring energy amounts less than a trillionth of a billionth of a joule could boost quantum computing

The fundamentals of quantum mechanics are minuscule. Scientists constantly home in on finer resolutions to measure, quantify, and control these fundamentals, such as photons, the massless particles that carry light. The more precise the measurement, the more possibilities for better quantum technology or the ability to detect elusive dark-matter axions in deep space.

Now, researchers in Finland have successfully used a calorimeter, a type of ultra-sensitive heat-based energy sensor, to detect energy levels below one zeptojoule, or a trillionth of a billionth of a joule. For context, a zeptojoule is approximately the amount of work it takes for a red blood cell to move a nanometer, or a billionth of a meter, upwards in Earth’s gravity.
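The red-blood-cell comparison can be sanity-checked with the gravitational potential-energy formula E = mgh, taking a cell mass of roughly 1e-13 kg (about 100 picograms, an illustrative figure):

```python
# Back-of-envelope check of the zeptojoule comparison via E = m*g*h.
# Assumed red-blood-cell mass ~1e-13 kg (~100 pg); illustrative only.
m = 1e-13   # kg, approximate mass of a red blood cell
g = 9.81    # m/s^2, Earth's gravitational acceleration
h = 1e-9    # m, one nanometer
E = m * g * h
print(f"E = {E:.2e} J")  # on the order of 1e-21 J, i.e. about one zeptojoule
```

The result lands right around 10⁻²¹ joules, one zeptojoule, matching the article's comparison.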

The team, led by Academy Professor Mikko Möttönen at Aalto University, together with industry collaborator IQM and the Technical Research Centre of Finland (VTT), used a novel technique to achieve the milestone measurement. The study is published in the journal Nature Electronics.

Quantum dot emitter delivers near-identical telecom photons at 40 million per second

Quantum technologies, devices that perform specific functions leveraging quantum mechanical effects, could soon outperform their classical counterparts on some tasks. Quantum emitters, devices that release individual particles of light (i.e., photons), are central components of many of these technologies, including quantum communication systems and quantum computers.

To enable the reliable operation of quantum technologies, emitters should emit photons with high consistency and coherence. In other words, they should ensure that the quantum properties of emitted photons remain stable and predictable.

Researchers at the University of Copenhagen’s Niels Bohr Institute, Ruhr-University Bochum, the University of Basel and Sparrow Quantum ApS recently developed a new photon emitter based on quantum dots, tiny structures that can trap electrons in confined regions and enable the controlled emission of individual photons.

How a single star can reshape an entire galaxy

Astronomers who simulate galaxies do not always get the same result, even when they start from identical conditions. New research from Leiden University shows that this is not a flaw, but a consequence of how galaxies behave—and how they are modeled.

The findings offer, for the first time, a way to address a long-standing question: how chaotic is a galaxy like the Milky Way really? The computer simulations by Tetsuro Asano and Simon Portegies Zwart (Leiden Observatory) will soon be published in Astronomy & Astrophysics and are available now on the arXiv preprint server.

The researchers created hundreds of models of Milky Way-like galaxies: flat disks of stars, embedded in a large, invisible cloud of dark matter that holds the system together. In each experiment, they ran two almost identical simulations, differing by just one tiny detail—for instance, a small shift in the position of a single star. Over time, that slight difference grows into visible structural changes: the spiral arms develop differently and the central bar rotates differently.
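Galaxy simulations are vastly richer than any toy model, but the underlying mechanism, exponential growth of tiny initial differences, shows up in even the simplest chaotic systems. A sketch using the logistic map as an illustrative stand-in, not the researchers' N-body code:

```python
# Two runs of a chaotic system starting almost identically (offset 1e-12)
# end up in completely different states -- the same mechanism that makes
# two "identical" galaxy simulations disagree. Logistic map with r = 4,
# which is fully chaotic; purely an illustrative stand-in.
def run(x0: float, steps: int) -> float:
    x = x0
    for _ in range(steps):
        x = 4.0 * x * (1.0 - x)
    return x

a, b = 0.3, 0.3 + 1e-12   # initial conditions differing by one part in 10^12
for steps in (10, 30, 60):
    diff = abs(run(a, steps) - run(b, steps))
    print(f"after {steps:2d} steps: |difference| = {diff:.2e}")
```

The gap roughly doubles each step, so a perturbation at the limit of numerical precision saturates to order one within a few dozen iterations; in a galaxy model the same growth turns a single displaced star into visibly different spiral arms.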

Chip-scale photonic approach achieves ultralow-noise microwave and millimeter-wave signal generation

Researchers led by Dr. Changmin Ahn and Prof. Jungwon Kim at KAIST, in collaboration with Prof. Hansuek Lee, have demonstrated a chip-scale photonic approach for generating ultralow-noise and highly stable microwave and millimeter-wave signals based on optical frequency combs (microcombs), offering a potential pathway toward compact, high-performance frequency sources for next-generation technologies.

High-frequency signals in the tens to hundreds of gigahertz range are essential for emerging applications such as 6G communications, radar, and precision sensing. However, achieving both low noise and high stability at these frequencies remains a fundamental challenge for conventional electronic signal sources.

In the first study, published in Laser & Photonics Reviews, the researchers addressed the long-standing challenge of transferring the stability of an optical reference to a microcomb. Direct stabilization is difficult due to the lack of carrier-envelope offset detection in high-repetition-rate microcombs. To overcome this, they used a mode-locked laser as a transfer oscillator and synchronized it to the microcomb using electro-optic sampling.
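One reason transferring optical stability down to microwave frequencies is so attractive is the standard frequency-division argument: dividing a carrier by a factor N reduces phase-noise power by N², or 20·log10(N) in decibels. A quick sketch with illustrative numbers, not figures from the study:

```python
import math

# Phase-noise benefit of optical frequency division (standard relation;
# numbers are illustrative). Dividing a carrier frequency by N lowers
# phase-noise power by N^2, i.e. 20*log10(N) dB.
f_optical = 193e12    # Hz, a ~1550 nm telecom-band optical reference
f_microwave = 10e9    # Hz, target microwave signal
N = f_optical / f_microwave
reduction_db = 20 * math.log10(N)
print(f"division factor N = {N:.0f}, "
      f"phase-noise reduction = {reduction_db:.1f} dB")
```

Dividing a ~193 THz optical carrier down to 10 GHz gives a division factor near 20,000, so an optically derived microwave can in principle carry phase noise tens of decibels below what the same reference would yield directly at microwave frequencies.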

Stanford’s new chip boosts light 100x with surprisingly low energy

Researchers at Stanford have developed a compact optical amplifier that dramatically boosts light signals using very little power. By recycling energy inside a looping resonator, the device achieves strong amplification with minimal noise and wide bandwidth. Its efficiency and small size mean it could run on batteries and be integrated into consumer electronics. This breakthrough could enable faster communications and more powerful optical technologies.
