Archive for the ‘robotics/AI’ category: Page 1743

Aug 28, 2019

Using Wi-Fi-like sonar to measure speed and distance of indoor movement

Posted in categories: drones, internet, mobile phones, robotics/AI

Researchers from North Carolina State University have developed a technique for measuring speed and distance in indoor environments, which could be used to improve navigation technologies for robots, drones—or pedestrians trying to find their way around an airport. The technique uses a novel combination of Wi-Fi signals and accelerometer technology to track devices in near-real time.

“We call our approach Wi-Fi-assisted Inertial Odometry (WIO),” says Raghav Venkatnarayan, co-corresponding author of a paper on the work and a Ph.D. student at NC State. “WIO uses Wi-Fi as a velocity sensor to accurately track how far something has moved. Think of it as sonar, but using radio waves, rather than sound waves.”

Many devices, such as smartphones, incorporate technology called inertial measurement units (IMUs) to calculate how far a device has moved. However, IMUs suffer from large drift errors, meaning that even minor inaccuracies can quickly become exaggerated.
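The correction idea is easy to sketch. Below is a minimal, hypothetical fusion step, not the authors' actual WIO filter: drifting IMU dead reckoning is periodically pulled toward a Wi-Fi-derived velocity. The blending weight ALPHA and all numbers are illustrative assumptions.

```python
# Toy sketch of Wi-Fi-assisted inertial odometry: integrate IMU
# acceleration for position, but correct the drifting velocity estimate
# whenever a Wi-Fi-derived velocity measurement is available.

ALPHA = 0.8  # how strongly a Wi-Fi velocity fix overrides the IMU estimate

def fuse_step(pos, vel, accel, dt, wifi_vel=None):
    """Advance one time step; optionally correct velocity with Wi-Fi."""
    vel = vel + accel * dt      # IMU dead reckoning (drifts over time)
    if wifi_vel is not None:    # Wi-Fi acting as a velocity sensor
        vel = ALPHA * wifi_vel + (1 - ALPHA) * vel
    pos = pos + vel * dt
    return pos, vel

# A biased IMU drifts on its own; sparse Wi-Fi fixes rein the drift in.
pos, vel = 0.0, 0.0
for step in range(100):
    biased_accel = 0.1 + 0.02                    # true accel plus IMU bias
    true_vel = 0.1 * (step * 0.01)               # what Wi-Fi would report
    wifi = true_vel if step % 10 == 0 else None  # a fix every 10 steps
    pos, vel = fuse_step(pos, vel, biased_accel, 0.01, wifi)
```

Without the Wi-Fi corrections, the constant accelerometer bias would compound into an ever-growing position error, which is exactly the drift problem the quote describes.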

Aug 27, 2019

Researchers use machine learning to teach robots how to trek through unknown terrains

Posted in categories: biotech/medical, education, engineering, information science, robotics/AI

A team of Australian researchers has designed a reliable strategy for testing physical abilities of humanoid robots—robots that resemble the human body shape in their build and design. Using a blend of machine learning methods and algorithms, the research team succeeded in enabling test robots to effectively react to unknown changes in the simulated environment, improving their odds of functioning in the real world.

The findings, published in July in the IEEE/CAA Journal of Automatica Sinica (a joint publication of the IEEE and the Chinese Association of Automation), have promising implications for the broad use of humanoid robots in fields such as healthcare, education, disaster response and entertainment.

“Humanoid robots have the ability to move around in many ways and thereby imitate human motions to complete complex tasks. In order to be able to do that, their stability is essential, especially under dynamic and unpredictable conditions,” said corresponding author Dacheng Tao, Professor and ARC Laureate Fellow in the School of Computer Science and the Faculty of Engineering at the University of Sydney.

Aug 27, 2019

Bioinspired robots can now learn to swarm on the go

Posted in categories: food, robotics/AI

A new generation of swarming robots which can independently learn and evolve new behaviors in the wild is one step closer, thanks to research from the University of Bristol and the University of the West of England (UWE).

The team used artificial evolution to enable the robots to automatically learn swarm behaviors which are understandable to humans. This new advance, published today in Advanced Intelligent Systems, could create new robotic possibilities for environmental monitoring, disaster recovery, infrastructure maintenance, logistics and agriculture.

Until now, artificial evolution has typically been run on a computer which is external to the swarm, with the best strategy then copied to the robots. However, this approach is limiting as it requires external infrastructure and a laboratory setting.
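Running evolution on board need not be heavy. As a rough illustration, not the Bristol/UWE system itself, a simple (1+1)-style loop mutates a behavior parameter vector and keeps a mutant only when it scores better; the fitness function here is invented for the sketch, whereas a real swarm robot would score a behavior by trying it in the environment.

```python
import random

# Minimal (1+1)-style evolutionary loop of the sort that could run on
# board a robot: mutate a behaviour parameter vector, keep the mutant
# only if it scores better.

TARGET = [0.5, -0.2, 0.8]  # invented "ideal" behaviour parameters

def fitness(params):
    """Toy objective: closer to TARGET is better (0.0 is perfect)."""
    return -sum((p - t) ** 2 for p, t in zip(params, TARGET))

def evolve(steps=500, sigma=0.1, seed=1):
    rng = random.Random(seed)
    best = [rng.uniform(-1.0, 1.0) for _ in range(len(TARGET))]
    for _ in range(steps):
        mutant = [p + rng.gauss(0.0, sigma) for p in best]
        if fitness(mutant) > fitness(best):  # keep only improvements
            best = mutant
    return best
```

Because the loop needs only the robot's own fitness evaluations, nothing in it requires the external computer or laboratory infrastructure mentioned above.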

Aug 27, 2019

Artificial muscles bloom, dance, and wave

Posted in categories: biotech/medical, cyborgs, robotics/AI, wearables

Wearing a flower brooch that blooms before your eyes sounds like magic. KAIST researchers have made it real with robotic muscles.

Researchers have developed an ultrathin artificial muscle for soft robotics. The advancement, recently reported in the journal Science Robotics, was demonstrated with a robotic blooming flower brooch, dancing robotic butterflies and fluttering tree leaves on a kinetic art piece.

The robotic equivalent of a muscle that can move is called an actuator. The actuator expands, contracts or rotates like a muscle, using a stimulus such as electricity. Engineers around the world are striving to develop more dynamic actuators that respond quickly, can bend without breaking, and are very durable. Soft, robotic muscles could have a wide variety of applications, from wearable electronics to advanced prosthetics.

Aug 27, 2019

Silicon Valley Company Lands NASA Contract For Breakthroughs In 3D Printing In Space

Posted in categories: 3D printing, robotics/AI, space

MOUNTAIN VIEW (KPIX 5) — A Silicon Valley 3D printing company has been awarded a contract with NASA to launch a project creating a satellite that will manufacture and assemble itself in orbit.

A top NASA administrator visited Mountain View’s Ames Research Center Monday and toured the state-of-the-art facilities of Made In Space. NASA awarded Made In Space a $73 million contract to launch Archinaut, an “autonomous robotic manufacturing and assembly platform,” by 2022.

Jim Bridenstine, the space agency‘s top official, called Ames a “jewel” and praised the work of Made In Space as “impressive.” The manufacturing company 3D prints structures, parts, tools and more while in orbit.

Aug 27, 2019

New Revolutionary Artificial Skin

Posted in categories: cyborgs, robotics/AI

This artificial skin could give robots a sense of touch.

Aug 27, 2019

Neuromorphic Chips and the Future of Your Cell Phone

Posted in categories: mobile phones, robotics/AI, transportation

Summary: The ability to train large scale CNNs directly on your cell phone without sending the data round trip to the cloud is the key to next gen AI applications like real time computer vision and safe self-driving cars. Problem is our current GPU AI chips won’t get us there. But neuromorphic chips look like they will.

Aug 27, 2019

Big Developments Bring Us Closer to Fully Untethered Soft Robots

Posted in categories: 3D printing, engineering, robotics/AI

Researchers from the Harvard John A. Paulson School of Engineering and Applied Sciences (SEAS) and Caltech have developed new soft robotic systems inspired by origami. These systems can move and change shape in response to external stimuli, bringing us closer to fully untethered soft robots. Today’s soft robots rely on external power and control, so they must be tethered to off-board systems built with hard components.

The research was published in Science Robotics. Jennifer A. Lewis, a Hansjorg Wyss Professor of Biologically Inspired Engineering at SEAS and co-lead author of the study, spoke about the new developments.

“The ability to integrate active materials within 3D-printed objects enables the design and fabrication of entirely new classes of soft robotic matter,” she said.

Aug 27, 2019

Newly Developed Cameras Use Light to See Around Corners

Posted in categories: engineering, information science, particle physics, robotics/AI

David Lindell, a graduate student in electrical engineering at Stanford University, and his team developed a camera that can watch moving objects around corners. When they tested the new technology, Lindell wore a high-visibility tracksuit as he moved around an empty room. The camera was aimed at a blank wall away from Lindell, yet the team was able to watch all of his movements with the help of a high-powered laser. The system reconstructed images from single particles of light reflected off the walls around Lindell, using advanced sensors and a processing algorithm.

Gordon Wetzstein, assistant professor of electrical engineering at Stanford, spoke about the newly developed technology.

“People talk about building a camera that can see as well as humans for applications such as autonomous cars and robots, but we want to build systems that go well beyond that,” he said. “We want to see things in 3D, around corners and beyond the visible light spectrum.”
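At the core of such systems is timing single photons: since light travels at a fixed speed, a photon’s round-trip time pins down the length of its path. The toy functions below illustrate that geometry under a simple single-bounce model; the function names and the model are our illustration, not Stanford’s reconstruction algorithm.

```python
C = 299_792_458.0  # speed of light in m/s

def roundtrip_to_distance(t_seconds):
    """One-way path length for a photon that returns after t_seconds."""
    return C * t_seconds / 2.0

def hidden_object_range(t_total, wall_dist):
    """Single-bounce model: light travels camera -> wall -> hidden object
    and back. Given the total round-trip time and the known camera-to-wall
    distance, recover the wall-to-object leg of the path."""
    total_path = C * t_total
    return (total_path - 2.0 * wall_dist) / 2.0
```

In the real system, per-photon timestamps from sensitive detectors feed a reconstruction algorithm that combines many such path-length constraints into an image.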

Aug 26, 2019

Researchers Created AI That Hides Your Emotions From Other AI

Posted in categories: internet, robotics/AI

Humans can communicate a range of nonverbal emotions, from terrified shrieks to exasperated groans. Voice inflections and cues can communicate subtle feelings, from ecstasy to agony, arousal and disgust. Even when simply speaking, the human voice is stuffed with meaning, and a lot of potential value if you’re a company collecting personal data.

Now, researchers at Imperial College London have used AI to mask the emotional cues in users’ voices when they’re speaking to internet-connected voice assistants. The idea is to put a “layer” between the user and the cloud their data is uploaded to by automatically converting emotional speech into “normal” speech. They recently published their paper “Emotionless: Privacy-Preserving Speech Analysis for Voice Assistants” on the arXiv preprint server.
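The paper’s pipeline is more sophisticated, but the flattening intuition can be shown on a toy prosody contour. This sketch simply damps variation around the mean of, say, per-frame pitch estimates; it is our illustration rather than the authors’ method, which operates on the speech signal itself.

```python
def flatten_contour(values, strength=0.8):
    """Pull a per-frame prosody contour (e.g. pitch estimates) toward
    its mean, damping the variation that carries emotional cues.
    strength=1.0 removes all variation; 0.0 leaves it untouched."""
    mean = sum(values) / len(values)
    return [mean + (1.0 - strength) * (v - mean) for v in values]
```

An excited, wide-ranging pitch track comes out much flatter, which is the sense in which emotional speech is converted toward “normal” speech before it reaches the cloud.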

Our voices can reveal our confidence and stress levels, physical condition, age, gender, and personal traits. This isn’t lost on smart speaker makers, and companies such as Amazon are always working to improve the emotion-detecting abilities of AI.