
Nanoprinted high-neuron-density optical linear perceptrons perform near-infrared inference on a CMOS chip

Today, machine learning permeates everyday life: every day, millions of users unlock their phones through facial recognition or pass through AI-enabled automated security checks at airports and train stations. These tasks are possible thanks to sensors that collect optical information and feed it to a neural network in a computer.

Scientists in China have presented a new nanoscale AI device trained to perform unpowered, all-optical inference at the speed of light for enhanced authentication solutions. Combining smart optical devices with imaging sensors, the system performs complex functions easily, achieving a neural density equal to 1/400th that of the human brain and more than 10 orders of magnitude higher than that of electronic processors.

Imagine empowering the sensors in everyday devices to perform artificial intelligence functions without a computer—as simply as putting glasses on them. The integrated holographic perceptrons developed by the research team at the University of Shanghai for Science and Technology, led by Professor Min Gu, a foreign member of the Chinese Academy of Engineering, can make that a reality. In the future, its neural density is expected to reach 10 times that of the human brain.
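The article does not give the device's exact mathematics, but a linear perceptron's inference step is, by definition, a matrix-vector product followed by a read-out. A minimal software sketch of that operation (the weights and input sizes below are hypothetical, purely for illustration) looks like this:

```python
import numpy as np

rng = np.random.default_rng(1)

# A linear perceptron's inference is one matrix-vector product plus a
# read-out -- the computation the holographic optics would perform in a
# single pass of light, with no electrical power for the multiply itself.
W = rng.normal(size=(3, 16))    # hypothetical trained weights: 16 inputs -> 3 classes
x = rng.normal(size=16)         # hypothetical input intensities from a sensor

scores = W @ x                  # the linear layer: done optically on the chip
label = int(np.argmax(scores))  # read-out: the brightest output spot wins
print(label)
```

The appeal of doing this optically is that the matrix multiply, which dominates inference cost in electronics, happens passively as light propagates through the nanoprinted structure.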


“We’re looking at Flippy as a tool that helps us increase speed of service and frees team members up to focus more on other areas we want to concentrate on, whether that’s order accuracy or how we’re handling delivery partner drivers and getting them what they need when they come through the door,” said White Castle Vice President Jamie Richardson.


Flippy is the world’s first autonomous robotic kitchen assistant that can learn from its surroundings and acquire new skills over time.

Next Stop: The Moon for 27,000+ CAP Names

NEW: A microchip carrying more than 27,000 Civil Air Patrol names with related messages and images is set to be carried to the moon later this year aboard space robotics company Astrobotic’s Peregrine lunar lander. https://www.cap.news/next-stop-the-moon-for-27000-cap-names/



A submersible soft robot survived the pressure in the Mariana trench

A silicone robot has survived a journey to 10,900 metres below the ocean’s surface in the Mariana trench, where the crushing pressure can implode all but the strongest enclosures. This device could lead to lighter and more nimble submersible designs.

A team led by Guorui Li at Zhejiang University in China based the robot’s design on snailfish, which have relatively delicate, soft bodies and are among the deepest-living fish; they have been observed swimming at depths of more than 8,000 metres.

The submersible robot looks a bit like a manta ray, measuring 22 centimetres long with a 28-centimetre wingspan. It is made of silicone rubber with electronic components spread throughout the body and connected by wires, rather than mounted on a circuit board as in most submersibles. That’s because the team found in tests that the connections between components on rigid circuit boards were a weak point under high pressure.

Facebook’s New AI Teaches Itself to See With Less Human Help

The Facebook research builds upon steady progress in tweaking deep learning algorithms to make them more efficient and effective. Self-supervised learning has previously been used to translate text from one language to another, but it has proven harder to apply to images than to words. LeCun says the research team developed a new way for algorithms to learn to recognize images even when one part of the image has been altered.


Most image recognition algorithms require lots of labeled pictures. This new approach eliminates the need for most of the labeling.
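The article does not detail Facebook's method, but the core idea of self-supervision is that the training signal is derived from the data itself rather than from human labels. A toy illustration, with a linear least-squares predictor standing in for a deep network and synthetic "images" standing in for real photos (all names and data here are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

# Unlabeled "images": smooth 8x8 gradients plus noise. No human labels anywhere.
def make_image():
    gx, gy = rng.uniform(-1, 1, 2)
    xs, ys = np.meshgrid(np.arange(8), np.arange(8), indexing="ij")
    return gx * xs + gy * ys + rng.normal(0, 0.05, (8, 8))

images = [make_image() for _ in range(200)]

# Pretext task: hide the centre pixel and predict it from its 4 neighbours.
# The "label" (the hidden pixel) comes from the image itself -- that is the
# self-supervised trick the article describes for altered images.
def features_and_target(img, i=4, j=4):
    x = np.array([img[i-1, j], img[i+1, j], img[i, j-1], img[i, j+1]])
    return x, img[i, j]

X = np.array([features_and_target(im)[0] for im in images])
y = np.array([features_and_target(im)[1] for im in images])

# Fit a linear predictor by least squares (a stand-in for a deep encoder).
w, *_ = np.linalg.lstsq(X, y, rcond=None)

# Evaluate on fresh unlabeled images: does the pretext model beat a dumb
# baseline that always predicts the mean?
test = [make_image() for _ in range(50)]
Xt = np.array([features_and_target(im)[0] for im in test])
yt = np.array([features_and_target(im)[1] for im in test])
err = np.mean((Xt @ w - yt) ** 2)
baseline = np.mean((yt - yt.mean()) ** 2)
print(err < baseline)
```

The point of the sketch: the model learned something useful about image structure without a single hand-provided label, which is why self-supervision can eliminate most of the labeling that conventional image recognition requires.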