Handle is a research robot that stands 6.5 ft tall, travels at 9 mph, and jumps 4 feet vertically. It uses electric power to operate both electric and hydraulic actuators, with a range of about 15 miles on one battery charge. Handle uses many of the same dynamics, balance and mobile manipulation principles found in the quadruped and biped robots we build, but with only about 10 actuated joints, it is significantly less complex. Wheels are efficient on flat surfaces, while legs can go almost anywhere: by combining wheels and legs, Handle gets the best of both worlds.
Archive for the ‘robotics/AI’ category: Page 2212
Feb 27, 2017
This ‘Cyborg Rose’ Grows Functioning Electronic Circuitry Inside Its Stem and Leaves
Posted by Shane Hinshaw in categories: cyborgs, robotics/AI
Scientists have figured out how to inject a conducting solution into a rose cutting, and have it spontaneously form wires throughout its stem, leaves, and petals to create fully functioning supercapacitors for energy storage.
The so-called e-Plant could be charged hundreds of times without any loss of performance, and the team behind the invention says it could one day allow us to create fuel cells or autonomous energy systems inside living plants.
“A few years ago, we demonstrated that it is possible to create electronic plants, ‘power plants’, but we have now shown that the research has practical applications,” says one of the team, Magnus Berggren from Linköping University in Sweden.
Feb 27, 2017
This Neural Probe Is So Thin, The Brain Doesn’t Know It’s There
Posted by Bruno Henrique de Souza in categories: biotech/medical, cyborgs, robotics/AI
Wiring our brains up to computers could have a host of exciting applications – from controlling robotic prosthetics with our minds to restoring sight by feeding camera feeds directly into the vision center of our brains.
Most brain-computer interface research to date has been conducted using electroencephalography (EEG), where electrodes are placed on the scalp to monitor the brain’s electrical activity. Achieving very high-quality signals, however, requires a more invasive approach.
Integrating electronics with living tissue is complicated, though. Probes that are directly inserted into the gray matter have been around for decades, but while they are capable of highly accurate recording, the signals tend to degrade rapidly due to the buildup of scar tissue. Electrocorticography (ECoG), which uses electrodes placed beneath the skull but on top of the gray matter, has emerged as a popular compromise, as it achieves higher-accuracy recordings with a lower risk of scar formation.
Feb 27, 2017
Roborace finally reveals its self-driving racecar
Posted by Klaus Baldauf in categories: robotics/AI, transportation
Robot racing series Roborace finally pulled the wraps off its first real self-driving racecar. The British company behind the series showed off the “Robocar” in public for the first time during a press conference at Mobile World Congress today.
The cars of Roborace — the early design of which was revealed one year ago — were designed by Daniel Simon, the man behind the light cycles in Tron: Legacy. “I’ve worked on a lot of cool stuff — Tron, Bugatti, Star Wars — but this takes the cake,” Simon said on stage.
Feb 27, 2017
Superintelligent AI explains Softbank’s push to raise a $100BN Vision Fund
Posted by Derick Lee in categories: robotics/AI, singularity
Anyone who’s seen Softbank CEO Masayoshi Son give a keynote speech will know he rarely sticks to the standard industry conference playbook.
And his turn on the stage at Mobile World Congress this morning was no different. Making like Eldon Tyrell, Son told delegates about his personal belief in a looming computing Singularity that he’s convinced will see superintelligent robots arriving en masse within the next 30 years, surpassing the human population in number and brainpower.
“I totally believe this concept,” he said, of the Singularity. “In next 30 years this will become a reality.”
Feb 26, 2017
Brain-machine interfaces: Bidirectional communication at last
Posted by Klaus Baldauf in categories: biotech/medical, cyborgs, robotics/AI
Since the early seventies, scientists have been developing brain-machine interfaces, the main application being neural prostheses for paralyzed patients or amputees. A prosthetic limb directly controlled by brain activity can partially recover lost motor function. This is achieved by decoding neuronal activity recorded with electrodes and translating it into robotic movements. Such systems, however, have limited precision due to the absence of sensory feedback from the artificial limb. Neuroscientists at the University of Geneva (UNIGE), Switzerland, asked whether it was possible to transmit this missing sensation back to the brain by stimulating neural activity in the cortex. They discovered not only that it was possible to create an artificial sensation of neuroprosthetic movements, but that the underlying learning process occurs very rapidly. These findings, published in the scientific journal Neuron, were obtained using modern imaging and optical stimulation tools, offering an innovative alternative to the classical electrode approach.
Motor function is at the heart of all behavior and allows us to interact with the world. Replacing a lost limb with a robotic prosthesis is therefore the subject of much research, yet successful outcomes are rare. Why is that? Until now, brain-machine interfaces have relied largely on visual perception: the robotic arm is controlled by looking at it. The flow of information between the brain and the machine thus remains unidirectional. However, movement perception is based not only on vision but mostly on proprioception, the sensation of where the limb is located in space. “We have therefore asked whether it was possible to establish a bidirectional communication in a brain-machine interface: to simultaneously read out neural activity, translate it into prosthetic movement and reinject sensory feedback of this movement back in the brain”, explains Daniel Huber, professor in the Department of Basic Neurosciences of the Faculty of Medicine at UNIGE.
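The loop Huber describes — record neural activity, decode it into a movement, then map that movement back onto stimulation — can be sketched in a few lines. This is a minimal illustration, not the UNIGE team's actual method: the linear decoder, the cosine-tuned feedback channels, and all weights here are hypothetical placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical linear decoder: maps firing rates of N neurons to a 2-D
# limb velocity. Real decoder weights would be fit by regression against
# observed movements; random values stand in for illustration.
N_NEURONS = 16
decoder_weights = rng.normal(scale=0.1, size=(2, N_NEURONS))

def decode(firing_rates):
    """Translate recorded neural activity into a prosthetic velocity."""
    return decoder_weights @ firing_rates

def encode_feedback(velocity, n_channels=8):
    """Map the executed movement back onto stimulation intensities,
    closing the loop (the 'reinjected' proprioceptive signal)."""
    angle = np.arctan2(velocity[1], velocity[0])
    channels = np.linspace(-np.pi, np.pi, n_channels, endpoint=False)
    # Cosine tuning: channels aligned with the movement direction
    # receive the strongest stimulation, scaled by movement speed.
    intensity = np.clip(np.cos(channels - angle), 0.0, None)
    return intensity * np.linalg.norm(velocity)

# One cycle of the bidirectional loop: record -> decode -> move -> stimulate.
rates = rng.poisson(lam=5.0, size=N_NEURONS).astype(float)
velocity = decode(rates)
stimulation = encode_feedback(velocity)
```

The key point the sketch captures is that information flows both ways in a single cycle: the same movement produced by the decoder immediately drives a stimulation pattern back toward the cortex, rather than leaving the user to rely on vision alone.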
Providing artificial sensations of prosthetic movements.
Feb 26, 2017
What Does Artificial Intelligence See In A Quarter Billion Global News Photographs?
Posted by Alireza Mokri in categories: information science, robotics/AI
What deep learning algorithms can tell us about the visual narratives of the world’s news imagery – from depictions of violence to the prominence of people and the role of visual context – a look inside what we see of the world around us.
Feb 26, 2017
I want this car only if I can get in the driverless hover version
Posted by Karen Hurst in categories: robotics/AI, transportation
Imagine this car requiring no wheels as it hovers over roads and streets – and no more flat tires.
Feb 25, 2017
In the age of robots, our schools are teaching children to be redundant
Posted by Alireza Mokri in categories: education, robotics/AI
A regime of cramming and testing is crushing young people’s instinct to learn and destroying their future.