
Small quantum system outperforms large classical networks in real-world forecasting

Can a handful of atoms outperform a much larger digital neural network on a real-world task? The answer may be yes. In a study published in Physical Review Letters, a team led by Prof. Peng Xinhua and Assoc. Prof. Li Zhaokai at the University of Science and Technology of China (USTC), part of the Chinese Academy of Sciences, demonstrated that a quantum processor comprising just nine interacting spins outperforms classical networks with thousands of nodes on realistic weather forecasting tasks.

By exploiting unique quantum features such as superposition and entanglement, quantum devices offer new ways to represent and process information.

Recent experiments have shown their advantages in specialized benchmark tasks, but extending these gains to real-world applications remains a challenge. In particular, many quantum approaches rely on complex circuits that are difficult to implement accurately on today’s noisy hardware.
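The paper itself used a nine-spin quantum reservoir; the "classical networks with thousands of nodes" it was benchmarked against are reservoir-style recurrent networks. As a rough illustration of that classical baseline, here is a minimal echo-state network forecasting a toy periodic signal standing in for a weather variable. All sizes and parameter values are illustrative, not taken from the study.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy time series: a clean sine wave standing in for a weather variable.
t = np.arange(2000)
series = np.sin(2 * np.pi * t / 50)

# Echo-state reservoir: random, fixed recurrent weights; only the readout
# layer is trained. This is the classical-reservoir analog of the spin system.
n_res = 200
W_in = rng.uniform(-0.5, 0.5, size=(n_res, 1))
W = rng.uniform(-0.5, 0.5, size=(n_res, n_res))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))  # spectral radius < 1 for stability

states = np.zeros((len(series), n_res))
x = np.zeros(n_res)
for i, u in enumerate(series[:-1]):
    x = np.tanh(W_in[:, 0] * u + W @ x)
    states[i + 1] = x  # state after seeing input i predicts value i + 1

# Ridge-regression readout for one-step-ahead forecasting.
split, washout, ridge = 1500, 100, 1e-4
X_tr, y_tr = states[washout:split], series[washout:split]
w_out = np.linalg.solve(X_tr.T @ X_tr + ridge * np.eye(n_res), X_tr.T @ y_tr)
pred = states[split:] @ w_out
mse = np.mean((pred - series[split:]) ** 2)
```

The design point worth noting is that the recurrent weights are never trained: the reservoir only has to transform the input into a rich state, and a cheap linear readout does the rest. That is also what makes physical systems, spins included, usable as reservoirs.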

Research moves closer to ‘smart’ sensors in knee replacements

If you have a knee replacement, imagine pointing your phone at your knee and pulling up an app that tells you how much stress the artificial joint is experiencing. Knowing the activities that cause the biggest problems—which can lead to a second replacement surgery—would be invaluable. Research led by Binghamton University is closer to making this technology a reality.

Professor Shahrzad “Sherry” Towfighian—a faculty member from the Thomas J. Watson College of Engineering and Applied Science’s Department of Mechanical Engineering—has worked toward “smart-knee” tech over the past decade.

According to the American College of Rheumatology, nearly 800,000 total knee replacements are done every year in the U.S., and that number is expected to rise sharply by 2030 as the population ages and sports injuries become more common.

How the human brain builds our sense of time

How does Jannik Sinner manage to hit the ball at exactly the right moment, with remarkable precision? And how do we, in everyday life, perceive the duration of events around us? The answer lies in how the brain constructs the perception of time, as shown by research published in PLOS Biology by Valeria Centanino, Gianfranco Fortunato, and Domenica Bueti. Starting from what we see—such as an approaching ball—temporal information is processed by the brain through progressively more complex stages: from the occipital visual cortex, to parietal and premotor areas, and finally to frontal regions.

Using high-field functional magnetic resonance imaging (fMRI) and measuring time perception in healthy volunteers, the researchers shed light on what happens in the brain when we estimate the duration of a visual stimulus. “Our results show that time perception is not a unitary process, but the outcome of multiple processing stages distributed across the cerebral cortex,” the authors explain. “Each stage contributes differently, from encoding physical duration to constructing the subjective experience of time.”

In an initial stage, occipital visual areas encode duration through gradual (monotonic) neural responses: the longer the stimulus, the stronger the neural response. This information is then transformed in parietal and premotor regions into selective (unimodal) representations, where distinct neural populations respond preferentially to specific durations, enabling the “readout” of time. Finally, higher-order regions, including the frontal cortex and anterior insula, are involved in the subjective categorization of duration, shaping how time is perceived.
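The two coding schemes the study describes can be sketched numerically. Below, a monotonic code (response grows with duration, as in occipital areas) feeds a bank of duration-tuned units (unimodal responses, as in parietal/premotor areas), and the preferred duration of the most active unit serves as the "readout." The Gaussian tuning curves and all numbers are illustrative choices, not the study's fitted model.

```python
import numpy as np

durations = np.linspace(0.1, 1.0, 10)  # hypothetical stimulus durations, in seconds

# Stage 1: monotonic code -- the longer the stimulus, the stronger the response.
monotonic = durations / durations.max()

# Stage 2: unimodal code -- each unit responds preferentially to one duration,
# modeled here as Gaussian tuning curves over the monotonic input.
preferred = np.linspace(0.1, 1.0, 10)
tuning = np.exp(-(durations[:, None] - preferred[None, :]) ** 2 / (2 * 0.1**2))

# Readout: the preferred duration of the most active unit recovers the stimulus.
decoded = preferred[np.argmax(tuning, axis=1)]
```

In this toy setup the decoded durations match the presented ones exactly; in the brain, the third stage (frontal cortex, anterior insula) would then warp this readout into the subjective, categorized experience of time.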

New memristor design uses built-in oxygen gradient to bring stability to reinforcement learning

In a recent study published in Nature Communications, researchers created a memristor that uses a built-in oxygen gradient to produce slow, stable conductance changes, enabling a reinforcement learning (RL) algorithm to learn faster and more stably than conventional approaches.

Reinforcement learning stands as one of the most promising ways to achieve continual learning in AI. The idea is to replicate how biological systems acquire and adapt knowledge slowly over time. The brain achieves this via ion gradients that regulate slow, directional signaling across cell membranes. Replicating this in hardware is a key goal of neuromorphic computing.

With their ability to mimic synaptic behavior, memristors have long been considered strong candidates for this. However, most existing devices suffer from unpredictable, abrupt conductance changes, making sustained and stable learning difficult.
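The contrast between abrupt and gradual conductance changes can be shown with a toy weight-update loop: a device that can only jump in large conductance steps overshoots and oscillates around the target weight, while one with slow, fine-grained changes settles close to it. The step sizes and target are illustrative stand-ins, not measured device parameters.

```python
import numpy as np

def train(step, n_updates=200, target=0.7):
    """Error-driven weight updates on a device whose conductance can only
    change in increments of `step` (a toy stand-in for a memristor)."""
    w = 0.0
    for _ in range(n_updates):
        err = target - w
        # The device moves one quantized conductance step toward the target.
        w += step * np.sign(err)
    return w

abrupt_w = train(step=0.3)    # large, abrupt conductance jumps
gradual_w = train(step=0.01)  # slow, stable conductance changes

abrupt_err = abs(0.7 - abrupt_w)
gradual_err = abs(0.7 - gradual_w)
```

The abrupt device ends up bouncing between 0.6 and 0.9 and can never represent the target weight, while the gradual device converges to within one small step of it. This is the intuition behind why a built-in oxygen gradient, which enforces slow directional conductance changes, helps a reinforcement learning loop converge.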

Breaking fuel cell barriers: New platinum catalyst brings high-efficiency hydrogen vehicles closer to commercialization

A research team has developed a next-generation platinum-based catalyst that improves both activity and durability in hydrogen fuel cells. The study is published in Advanced Materials. The team was led by Professor Sang Uck Lee of the School of Chemical Engineering at Sungkyunkwan University, with Ph.D. candidate Jun Ho Seok and Dr. Sung Chan Cho as co-first authors, in collaboration with Professor Kwangyeol Lee's team at Korea University and Dr. Sung Jong Yoo's team at the Korea Institute of Science and Technology (KIST).

Hydrogen fuel cells generate electricity through the electrochemical reaction of hydrogen and oxygen and are considered a promising clean energy technology. However, their broader commercialization has been hindered by the sluggish oxygen reduction reaction (ORR) at the cathode and by catalyst degradation during long-term operation.

Conventional platinum-based intermetallic catalysts are known for their structural stability, but their atomic composition and arrangement are difficult to tune precisely. This has limited efforts to optimize their electronic structure and has made it challenging to achieve both high catalytic activity and long-term durability under demanding operating conditions, such as those required for hydrogen-powered vehicles.

Seed banks may complicate gene drives aimed at controlling weeds

Gene drives—a genetic engineering approach that quickly spreads specific genetic changes throughout a population, whether to kill it off or add a new trait—may have potential for controlling weeds. But so far, gene drives have primarily been studied in mosquitoes, and have yet to be deployed in the real world.

In a first-of-its-kind study, researchers modeled how a gene drive would proceed in plants. Their simulations suggest that a gene drive’s success may hinge on seed banks—underground reservoirs of seeds that can germinate years or even decades later. Without proper consideration, they found, these stored seeds can slow down or even doom the gene drive, because they continually reintroduce plants without the gene drive into the population.

Modeling studies like this one can help scientists design successful gene drives in plants and discover and mitigate potential problems before deployment in the wild, the researchers said.
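The seed-bank effect described above can be captured in a few lines: each generation, some fraction of germinating seeds comes from a bank that stores allele frequencies from earlier generations, continually reintroducing wild-type plants and slowing the drive. This is a deliberately simplified deterministic sketch with illustrative parameters, not the researchers' model.

```python
import numpy as np

def drive_frequency(generations=40, bank_fraction=0.3, conversion=0.9, p0=0.1):
    """Toy model of a gene drive spreading through a plant population with a
    seed bank. `bank_fraction` of each generation germinates from old seeds."""
    p = p0           # drive-allele frequency in the standing population
    bank = [p0]      # the bank stores allele frequencies of past seed crops
    history = [p]
    for _ in range(generations):
        # Super-Mendelian transmission: heterozygote conversion pushes the
        # drive allele above the 50% Mendelian expectation (simplified).
        p_next = p + conversion * p * (1 - p)
        # Germination mixes fresh seeds with old, mostly wild-type seeds.
        p = (1 - bank_fraction) * p_next + bank_fraction * np.mean(bank)
        bank.append(p_next)
        history.append(p)
    return history

with_bank = drive_frequency(bank_fraction=0.3)
no_bank = drive_frequency(bank_fraction=0.0)
```

Without a seed bank the drive allele sweeps to fixation within a few dozen generations; with one, old wild-type seeds keep diluting each new generation, which is exactly the slowdown the simulations flagged.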

Forget Wi-Fi: This Laser Tech Hits 360 Gbps at Half the Power

A new laser-powered wireless system uses light to deliver data at speeds exceeding 360 Gbps. It could enable faster, more efficient indoor networks while reducing interference and energy use.

Modern life runs on fast, reliable wireless connections. Video calls, streaming, virtual reality, and connected devices all depend on networks that already support billions of users. Most of this data travels over radio-based systems like Wi-Fi and cellular networks. These technologies have powered decades of growth, but they are running into limits. Radio spectrum is becoming crowded, signals can interfere with each other in busy indoor spaces, and energy use keeps rising as more devices come online.

Using light instead of radio waves
