Physicists Discover a New Way To Connect Qubits Using Crystal Imperfections

A new study suggests that crystal defects in diamond may hold the key to scalable quantum interconnects. Connecting large numbers of quantum bits (qubits) into a working technology remains one of the biggest obstacles facing quantum computing. Qubits are extraordinarily sensitive, and even small disturbances can corrupt the information they carry.

The Linux community now has a succession plan for when Linus Torvalds checks out, after an apparently uplifting discussion about ‘our eventual march toward death’

The room discussed various options but, per LWN.net, “it is sufficient to say that there was not a lot of disagreement” before two things were agreed upon. The first was acknowledging that there are already some provisions in place, with multiple people being able to commit to Torvalds’ repository, and redundancy measures in place for the stable repository.

The hoped-for scenario is that Torvalds will decide to step back, arrange a smooth transition to any replacement himself, and go off to enjoy a long retirement. Torvalds has made it known he has no plans in this direction anytime soon, but why would he?

Then the big question: what if something goes wrong that does prevent this smooth transition, whether it’s a freak skydiving incident or Bill Gates in the library with a candlestick. “As I put it in the discussion,” writes LWN.net co-founder Jonathan Corbet, “in the absence of an agreed-upon process, the community would find itself playing Calvinball at an awkward time.”

Neuralink’s Brain Chip: How It Works and What It Means

Elon Musk recently announced that Neuralink, his company aiming to revolutionize brain-computer interfaces (BCIs), has successfully implanted a brain chip in a human for the first time. The implantation of the device, called “the Link,” represents a leap forward in the realm of BCIs, which record and decode brain activity, and may enable new innovations in health care, communication, and cognitive abilities.

Though limited information on the technology is available and Neuralink’s claims have not been independently verified, here’s a look at the Link, its functionality, and the potential implications of this groundbreaking innovation.

A New Ingredient for Quantum Error Correction

Entanglement and so-called magic states have long been viewed as the key resources for quantum error correction. Now contextuality, a hallmark of quantum theory, joins them as a complementary resource.

Machines make mistakes, and as they scale up, so too do the opportunities for error. Quantum computers are no exception; in fact, their errors are especially frequent and difficult to control. This fragility has long been a central obstacle to building large-scale devices capable of practical, universal quantum computation. Quantum error correction attempts to circumvent this obstacle, not by eliminating sources of error but by encoding quantum information in such a way that errors can be detected and corrected as they occur [1]. In doing so, the approach enables fault-tolerant quantum computation. Over the past few decades, researchers have learned that this robustness relies on intrinsically quantum resources, most notably, entanglement [2] and, more recently, so-called magic states [3].
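The basic idea can be illustrated with a rough classical analogue: the three-bit repetition code, which protects a bit by majority vote. A true quantum code must do more (it measures error syndromes without disturbing the encoded state, which is where resources like entanglement come in), but the encode-noise-correct loop below, a minimal Python sketch with illustrative function names, captures the logic of adding redundancy so that errors can be detected and fixed.

```python
import random

def encode(bit):
    """Encode one logical bit into three physical bits (repetition code)."""
    return [bit, bit, bit]

def apply_noise(bits, p=0.1):
    """Flip each physical bit independently with probability p."""
    return [b ^ 1 if random.random() < p else b for b in bits]

def decode(bits):
    """Majority vote recovers the logical bit if at most one bit flipped."""
    return 1 if sum(bits) >= 2 else 0

# Compare raw vs. encoded logical error rates over many trials.
trials, p = 100_000, 0.1
raw_errors = sum(apply_noise([0], p)[0] != 0 for _ in range(trials))
enc_errors = sum(decode(apply_noise(encode(0), p)) != 0 for _ in range(trials))
print(f"unencoded error rate: ~{raw_errors / trials:.3f}")  # about p = 0.10
print(f"encoded error rate:   ~{enc_errors / trials:.3f}")  # about 3p^2 - 2p^3 = 0.028
```

The payoff is the same in the quantum setting: as long as physical error rates stay below a threshold, adding redundancy drives the logical error rate down, which is what makes fault-tolerant computation possible.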

Quantum batteries could quadruple qubit capacity while reducing energy infrastructure requirements

Scientists have unveiled a new approach to powering quantum computers using quantum batteries—a breakthrough that could make future computers faster, more reliable, and more energy efficient.

Quantum computers rely on the rules of quantum physics to solve problems that could transform computing, medicine, energy, finance, communications, and many other fields in the years ahead.

But sustaining their delicate quantum states typically requires room-sized, energy-intensive cryogenic cooling systems, as well as a system of room-temperature electronics.

Chip-sized optical amplifier can intensify light 100-fold with minimal energy

Light does a lot of work in the modern world, enabling all types of information technology, from TVs to satellites to fiber-optic cables that carry the internet across oceans. Stanford physicists recently found a way to make that light work even harder with an optical amplifier that requires low amounts of energy without any loss of bandwidth, all on a device the size of a fingertip.

Similar to sound amplifiers, optical amplifiers take a light signal and intensify it. Current chip-scale optical amplifiers need a lot of power to function. The new optical amplifier, detailed in the journal Nature, solves this problem by using a method that essentially recycles the energy used to power it.

“We’ve demonstrated, for the first time, a truly versatile, low-power optical amplifier, one that can operate across the optical spectrum and is efficient enough that it can be integrated on a chip,” said Amir Safavi-Naeini, the study’s senior author and associate professor of physics in Stanford’s School of Humanities and Sciences. “That means we can now build much more complex optical systems than were possible before.”

Exclusive: Nvidia reportedly to shift some 2028 chip production to Intel, reshaping TSMC strategy

TSMC’s dominance in advanced process nodes and packaging has made it a prime target amid US manufacturing mandates. Chip customers now face mounting pressure to diversify supply chains due to cost and capacity constraints, accelerating the shift toward multi-sourcing strategies.

Recent supply chain reports reveal that Nvidia, alongside Apple, plans to collaborate with Intel on its 2028 Feynman architecture platform. Both companies are targeting “low volume, low-tier, non-core” production runs to align with Trump administration directives while preserving their core TSMC (2330.TW) relationships. This dual-foundry approach is designed to minimize mass production risks while satisfying political pressures.

Milky Way is embedded in a ‘large-scale sheet’ of dark matter, which explains motions of nearby galaxies

Computer simulations carried out by astronomers from the University of Groningen in collaboration with researchers from Germany, France and Sweden show that most of the (dark) matter beyond the Local Group of galaxies (which includes the Milky Way and the Andromeda galaxy) must be organized in an extended plane. Above and below this plane are large voids. The observed motions of nearby galaxies and the joint masses of the Milky Way and the Andromeda galaxy can only be properly explained with this “flat” mass distribution. The research, led by Ph.D. graduate Ewoud Wempe and Professor Amina Helmi, is published in Nature Astronomy.

Almost a century ago, astronomer Edwin Hubble discovered that virtually all galaxies are moving away from the Milky Way. This is important evidence for the expansion of the universe and for the Big Bang. But even in Hubble’s time, it was clear that there were exceptions. For example, our neighboring galaxy, Andromeda, is moving toward us at a speed of about 100 kilometers per second.
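To see why Andromeda counts as an exception, compare its observed approach speed with the recession speed that cosmic expansion alone would predict via Hubble’s law, v = H0 × d. The sketch below uses round assumed values not taken from the article (H0 of about 70 km/s per megaparsec, Andromeda at roughly 0.78 Mpc):

```python
# Back-of-the-envelope comparison of Hubble flow vs. Andromeda's observed motion.
# Assumed round values (not from the article): H0 ~ 70 km/s/Mpc, d ~ 0.78 Mpc.
H0 = 70.0            # Hubble constant, km/s per megaparsec
d_andromeda = 0.78   # approximate distance to Andromeda, megaparsecs

v_expansion = H0 * d_andromeda   # recession speed from expansion alone
v_observed = -100.0              # negative = approaching (figure quoted above)

print(f"Hubble-flow prediction: +{v_expansion:.0f} km/s (receding)")
print(f"Observed motion:        {v_observed:.0f} km/s (approaching)")
print(f"Difference to explain:  {v_expansion - v_observed:.0f} km/s")
```

The gap of roughly 150 km/s has to be supplied by gravity, which is why the motions of nearby galaxies are such a sensitive probe of how mass, including dark matter, is distributed in and around the Local Group.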

In fact, for half a century, astronomers have been wondering why most large nearby galaxies—with the exception of Andromeda—are moving away from us and do not seem to be affected by the mass and gravity of the so-called Local Group (the Milky Way, the Andromeda galaxy and dozens of smaller galaxies).

3D material mimics graphene’s electron flow for green computing

University of Liverpool researchers have discovered a way to host some of the most significant properties of graphene in a three-dimensional (3D) material, potentially removing the hurdles for these properties to be used at scale in green computing. The work is published in the journal Matter.

Graphene is famous for being incredibly strong, lightweight, and an excellent conductor of electricity, and its applications range from electronics to aerospace and medical technologies. However, its two-dimensional (2D) structure makes it mechanically fragile and limits its use in demanding environments and large-scale applications.

Thinking on different wavelengths: New approach to circuit design introduces next-level quantum computing

Quantum computing represents a potential breakthrough technology that could far surpass modern-day computing systems for some tasks. However, building practical, large-scale quantum computers remains challenging, particularly because of the complex and delicate techniques involved.

In some quantum computing systems, single ions (charged atoms such as strontium) are trapped and manipulated with electromagnetic fields, including laser light, to perform calculations. Such circuits require many different wavelengths of light to be introduced at different positions on the device, meaning that numerous laser beams have to be properly arranged and routed to their designated areas. The practical difficulty of delivering so many separate beams within a limited space then becomes a major limitation.

To address this, researchers from The University of Osaka investigated new ways to deliver light within a limited space. Their work demonstrated a power-efficient nanophotonic circuit, with optical fibers attached to waveguides, that delivers six different laser beams to their destinations. The findings have been published in APL Quantum.
