
Modern brain–computer interfaces (BCIs), which use electroencephalography for bidirectional human–machine communication, face significant limitations from movement-vulnerable rigid sensors, inconsistent skin–electrode impedance, and bulky electronics, all of which limit continuous use and portability. Here, we introduce motion artifact–controlled micro–brain sensors placed between hair strands, enabling ultralow impedance density on skin contact for long-term, persistent BCI with augmented reality (AR). An array of low-profile microstructured electrodes coated with a highly conductive polymer is seamlessly inserted into the spaces between hair follicles, capturing high-fidelity neural signals for up to 12 h while maintaining the lowest contact impedance density (0.03 kΩ·cm⁻²) reported to date. The implemented wireless BCI, which detects steady-state visually evoked potentials (SSVEPs), achieves 96.4% accuracy in signal classification with a train-free algorithm even during excessive subject motion, including standing, walking, and running. A demonstration captures the system's capability: AR-based video calling with hands-free controls driven by brain signals, transforming digital communication. Collectively, this research highlights the pivotal role of integrated sensors and flexible-electronics technology in advancing BCI applications for interactive digital environments.
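The abstract mentions a train-free SSVEP classifier but does not say which one. Canonical correlation analysis (CCA) is the classic training-free approach to SSVEP detection, so below is a minimal sketch using it; the sampling rate, channel count, and stimulus frequencies are illustrative assumptions, not the paper's actual setup.

```python
# Train-free SSVEP classification via canonical correlation analysis (CCA).
# All parameters below (250 Hz sampling, 3 channels, 4 flicker frequencies)
# are hypothetical; the paper's abstract does not specify its pipeline.
import numpy as np
from sklearn.cross_decomposition import CCA

FS = 250                               # sampling rate in Hz (assumed)
STIM_FREQS = [8.0, 10.0, 12.0, 15.0]   # candidate flicker frequencies (assumed)
N_HARMONICS = 2                        # sin/cos harmonics in each reference set

def reference_signals(freq, n_samples, fs=FS, n_harmonics=N_HARMONICS):
    """Build the sin/cos reference matrix for one stimulus frequency."""
    t = np.arange(n_samples) / fs
    refs = []
    for h in range(1, n_harmonics + 1):
        refs.append(np.sin(2 * np.pi * h * freq * t))
        refs.append(np.cos(2 * np.pi * h * freq * t))
    return np.column_stack(refs)       # shape: (n_samples, 2 * n_harmonics)

def classify_ssvep(eeg):
    """eeg: (n_samples, n_channels). Return the stimulus frequency whose
    reference set has the largest first canonical correlation with the EEG."""
    scores = []
    for f in STIM_FREQS:
        Y = reference_signals(f, eeg.shape[0])
        u, v = CCA(n_components=1).fit_transform(eeg, Y)
        scores.append(np.corrcoef(u[:, 0], v[:, 0])[0, 1])
    return STIM_FREQS[int(np.argmax(scores))]

# Synthetic demo: 2 s of 3-channel "EEG" dominated by a 10 Hz response.
t = np.arange(2 * FS) / FS
eeg = 0.5 * np.sin(2 * np.pi * 10.0 * t)[:, None] + 0.3 * np.random.randn(len(t), 3)
print(classify_ssvep(eeg))             # expected: 10.0 in most runs
```

Because CCA needs only the known flicker frequencies, no per-user calibration data is required, which is what makes this family of classifiers "train-free."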

Since most cells naturally repel DNA, delivering these nanodevices into cells requires specialized techniques, such as transfection methods and transformation protocols. Once inside, cellular factors such as salt concentration, molecular crowding, and heterogeneous environments influence strand displacement reactions. To overcome the limitations of direct delivery, researchers are also developing transcribable RNA nanodevices encoded into plasmids or chromosomes, allowing cells to express these circuits.

Toward Smart DNA Machines and Biocomputers

DNA strand displacement has been applied to the design of new computational models. By integrating computational principles with strand displacement, the structured algorithms of traditional computing can be combined with the stochastic biochemical reactions of living systems, enabling biocompatible forms of computation. In the future, DNA strand displacement may enable autonomously acting DNA nanomachines that precisely manipulate biological processes, leading to major advances in healthcare and life-science research.
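To make the idea of strand displacement as computation concrete, here is a minimal sketch of a two-input AND gate modeled as a chemical reaction network and integrated as ordinary differential equations. The gate design and rate constants are illustrative assumptions, not a specific published circuit.

```python
# A two-input AND gate built from sequential toehold-mediated strand
# displacements, modeled as mass-action ODEs. Rates and concentrations
# are illustrative assumptions.
from scipy.integrate import solve_ivp

k1, k2 = 1e5, 1e5          # displacement rate constants (/M/s, assumed)

def and_gate(t, y):
    """Species: in1, in2, gate, inter, out.
    in1 + gate  -> inter   (first displacement exposes a new toehold)
    in2 + inter -> out     (second displacement releases the output strand)"""
    in1, in2, gate, inter, out = y
    r1 = k1 * in1 * gate
    r2 = k2 * in2 * inter
    return [-r1, -r2, -r1, r1 - r2, r2]

y0 = [50e-9, 50e-9, 100e-9, 0.0, 0.0]      # 50 nM inputs, 100 nM gate
sol = solve_ivp(and_gate, (0, 3600), y0, method="LSODA")
print(f"output after 1 h: {sol.y[4, -1] * 1e9:.1f} nM")
# ~50 nM output only when BOTH inputs are present; set either input
# concentration to 0.0 in y0 and the output stays near zero.
```

The logic lives in the reaction topology: the output strand can only be released after the first displacement has occurred, so the gate computes AND without any enzymes or external control.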

Just as pilots use flight simulators to safely practice complex maneuvers, scientists may soon conduct experiments on a highly realistic simulation of the mouse brain. In a new study, researchers at Stanford Medicine and their collaborators developed an artificial intelligence model that acts as a "digital twin" of part of the mouse brain.

Artificial Intelligence (AI) is a branch of computer science focused on creating systems that can perform tasks typically requiring human intelligence. These tasks include understanding natural language, recognizing patterns, solving problems, and learning from experience. AI technologies use algorithms and massive amounts of data to train models that can make decisions, automate processes, and improve over time through machine learning. The applications of AI are diverse, impacting fields such as healthcare, finance, automotive, and entertainment, fundamentally changing the way we interact with technology.

Can gravity exist without mass? That's the question raised by physicist Dr. Richard Lieu at The University of Alabama in Huntsville. In a paper published in the Monthly Notices of the Royal Astronomical Society, Lieu offers a theory that could challenge one of the biggest assumptions in astrophysics: that gravity requires mass at all.

The study explores a different solution to the same equations that normally describe gravity—both in Newtonian theory and in general relativity. These equations link mass with the gravitational force it creates. Lieu focused on what’s known as the Poisson equation, a simplified form of Einstein’s field equations used for describing gravity in weaker fields, like those around galaxies.

This equation typically has one well-known solution: gravity that weakens with distance, created by mass. But there’s another, lesser-known solution that’s often ignored. It can also create an attractive force but doesn’t come from any actual matter.
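For readers who want the mathematics being paraphrased, the weak-field Poisson equation and its familiar mass-sourced solution are sketched below; the "lesser-known" alternatives referred to are solutions that satisfy the equation where the density vanishes.

```latex
% Weak-field gravity: the Poisson equation links the potential to mass density.
\nabla^{2}\Phi = 4\pi G\,\rho
% The familiar solution, sourced by a mass M, weakens with distance:
\Phi(r) = -\frac{GM}{r}, \qquad
\mathbf{g} = -\nabla\Phi = -\frac{GM}{r^{2}}\,\hat{\mathbf{r}}
% Where rho = 0 the equation reduces to Laplace's equation,
% \nabla^{2}\Phi = 0, which admits additional nontrivial solutions;
% Lieu's proposal draws on such mass-free solutions to produce an
% attractive field without source matter.
```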

How does the brain work? Where, when, and why do neurons connect and send their signals? To learn more, scientists have created the largest wiring diagram and functional map of an animal brain to date. Research teams at the Allen Institute, Baylor College of Medicine (@BCMweb), and Princeton University (@princeton) worked together to map half a billion synapses, over 200,000 cells, and 4 km of axons from a cubic millimeter of mouse brain, providing unparalleled detail about its structure and functional properties. The project is part of the Machine Intelligence from Cortical Networks (MICrONS) program, which seeks to revolutionize machine learning by reverse-engineering the algorithms of the brain. The findings reveal key insights into brain activity, connectivity, and structure, shedding light on both form and function, within a region of the mouse visual cortex that plays a critical role in brain function and is often disrupted in neurological conditions such as Alzheimer's disease, autism, and addiction. These insights could revolutionize our ability to treat neuropsychiatric diseases and to study how drugs and other changes affect the brain.

This extraordinary achievement begins to reveal the elusive language the brain uses to communicate amongst its millions of cells and the cortical mechanisms of intelligence—one of the holy grails of science.

Learn more about this research: https://alleninstitute.org/news/scien…
Access open science data: https://www.microns-explorer.org/
Explore the publications in Nature: https://www.nature.com/immersive/d428


A machine learning method has the potential to revolutionize multi-messenger astronomy. Detecting binary neutron star mergers is a top priority for astronomers. These rare collisions between dense stellar remnants produce gravitational waves followed by bursts of light, offering a unique opportunity to observe the same event through multiple channels.

Enthusiasts have been pushing the limits of silicon for as long as microprocessors have existed. Early overclocking endeavors involved soldering and replacing crystal clock oscillators, but that practice quickly evolved into adjusting system bus speeds using motherboard DIP switches and jumpers.

Internal clock multipliers were eventually introduced, but it didn’t take long for those to be locked down, as unscrupulous sellers began removing official frequency ratings and rebranding chips with their own faster markings. System buses and dividers became the primary tuning tools for most users, while ultra-enthusiasts went further – physically altering electrical specifications through hard modding.

Eventually, unlocked multipliers made a comeback, ushering in an era defined by BIOS-level overclocking and increasingly sophisticated software tuning tools. Over the past decade, however, traditional overclocking has become more constrained. Improved factory binning, aggressive turbo boost algorithms, and thermal ceilings mean that modern CPUs often operate near their peak potential right out of the box.

Quantum computers promise to outperform today’s traditional computers in many areas of science, including chemistry, physics, and cryptography, but proving they will be superior has been challenging. The most well-known problem in which quantum computers are expected to have the edge, a trait physicists call “quantum advantage,” involves factoring large numbers, a hard math problem that lies at the root of securing digital information.

In 1994, Caltech alumnus Peter Shor (BS '81), then at Bell Labs, developed a quantum algorithm, now known as Shor's algorithm, that could factor a large number in just seconds, whereas this type of problem could take a classical computer millions of years. Ultimately, when quantum computers are ready and working, a goal that researchers say may still be a decade or more away, these machines will be able to quickly factor the large numbers behind cryptography schemes.
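As a rough illustration of how Shor's algorithm factors numbers, the sketch below implements its classical scaffolding, replacing the quantum order-finding subroutine with a brute-force loop. That loop is exactly the step that is exponentially slow classically, so this toy only works for very small numbers.

```python
# Classical skeleton of Shor's algorithm. The quantum speedup lies entirely
# in find_order(); here it is a brute-force stand-in for the quantum
# period-finding subroutine, which is why this only handles tiny N.
from math import gcd
from random import randrange

def find_order(a, n):
    """Smallest r > 0 with a^r = 1 (mod n) -- the quantum subroutine's job."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_factor(n):
    while True:
        a = randrange(2, n)
        d = gcd(a, n)
        if d > 1:
            return d                      # lucky guess already shares a factor
        r = find_order(a, n)
        if r % 2 == 0:                    # need an even order to split n
            y = pow(a, r // 2, n)
            for cand in (gcd(y - 1, n), gcd(y + 1, n)):
                if 1 < cand < n:
                    return cand           # nontrivial factor found
        # odd order or trivial gcds: retry with a new random base

print(shor_factor(15))                    # prints 3 or 5
```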

But besides Shor's algorithm, researchers have had a hard time coming up with problems on which quantum computers will have a proven advantage. Now, reporting in a recent Nature Physics study titled "Local minima in quantum systems," a Caltech-led team of researchers has identified a common physics problem that these futuristic machines would excel at solving. The problem has to do with simulating how materials cool down to their lowest-energy states.
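A classical analogue makes the difficulty concrete: the sketch below anneals a tiny random-coupling Ising chain with the Metropolis rule. The Nature Physics result concerns local minima of quantum systems, which this classical toy does not capture; it only illustrates the underlying phenomenon, namely that rapid cooling can freeze a system into a state that is locally, but not globally, minimal.

```python
# Metropolis annealing on a small random-bond Ising chain: a classical
# illustration of "cooling into a local minimum." Model and schedule are
# illustrative choices, not anything from the paper.
import numpy as np

rng = np.random.default_rng(0)
N = 40
J = rng.choice([-1.0, 1.0], size=N - 1)   # random couplings along the chain

def energy(s):
    """Ising chain energy: a bond contributes -1 when it is satisfied."""
    return -np.sum(J * s[:-1] * s[1:])

s = rng.choice([-1.0, 1.0], size=N)       # random initial configuration
for T in np.geomspace(1.0, 0.01, 1000):   # fast exponential cooling schedule
    i = rng.integers(N)
    s_new = s.copy()
    s_new[i] *= -1                        # propose a single spin flip
    dE = energy(s_new) - energy(s)
    if dE <= 0 or rng.random() < np.exp(-dE / T):
        s = s_new                         # accept per the Metropolis rule

print(f"final energy:  {energy(s):.0f}")
print(f"ground energy: {-np.abs(J).sum():.0f}")   # every chain bond can be satisfied
```

With this rapid schedule the final energy typically sits above the ground energy: leftover domain walls freeze in, which is the classical face of the local-minima problem the Caltech team studies in the quantum setting.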

Pressure waves propagating through bubble-containing liquids in tubes experience considerable attenuation. Researchers at the University of Tsukuba have derived an equation describing this phenomenon, demonstrating that beyond liquid viscosity and compressibility, variations in tube cross-sectional area contribute to wave attenuation.

Their analysis reveals that the rate of change in tube cross-sectional area represents a critical parameter governing pressure wave attenuation in such systems.
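The excerpt does not reproduce the derived equation, but the classical Webster horn equation shows, in hedged form, how a varying cross-section enters one-dimensional pressure-wave propagation; the Tsukuba analysis additionally accounts for bubbles, liquid viscosity, and compressibility, which are not captured here.

```latex
% Webster horn equation: 1-D pressure waves in a duct of varying area A(x).
\frac{\partial^{2} p}{\partial t^{2}}
  = \frac{c^{2}}{A(x)}\,\frac{\partial}{\partial x}\!\left( A(x)\,\frac{\partial p}{\partial x} \right)
  = c^{2}\left( \frac{\partial^{2} p}{\partial x^{2}}
  + \frac{1}{A}\frac{\mathrm{d}A}{\mathrm{d}x}\,\frac{\partial p}{\partial x} \right)
% The extra term scales with (1/A) dA/dx, the fractional rate of change of
% the cross-sectional area, consistent with the finding that this rate is a
% key parameter governing attenuation in such tubes.
```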

Pressure waves propagating through bubble-containing liquids in tubes, known as "bubbly flows," behave distinctly from those in single-phase liquids, necessitating precise understanding and control of their propagation processes.