
Atlas | Partners in Parkour

It’s impressive, but I don’t see it doing anything it hasn’t done before. The next step has to be equipping it with human-level hands that can be teleoperated and possibly self-operated.


Parkour is the perfect sandbox for the Atlas team at Boston Dynamics to experiment with new behaviors. In this video, our humanoid robots demonstrate their whole-body athleticism, maintaining balance through a variety of rapidly changing, high-energy activities. Through jumps, balance beams, and vaults, we show how we push Atlas to its limits to discover the next generation of mobility, perception, and athletic intelligence.

How does Atlas do parkour? Go behind the scenes in the lab: https://youtu.be/EezdinoG4mk.

Parkour Atlas: https://youtu.be/LikxFZZO2sk.
More Parkour Atlas: https://youtu.be/_sBBaNYex3E

Qualcomm launches world’s first drone platform with both 5G, AI tech

Qualcomm has unveiled the world’s first drone platform and reference design that taps into both 5G and AI technologies. The chipmaker’s Flight RB5 5G Platform condenses multiple complex technologies into one tightly integrated drone system to support a variety of use cases, including film and entertainment, security and emergency response, delivery, defense, inspection, and mapping.

The Flight RB5 5G Platform is powered by the chipmaker’s QRB5165 processor and builds upon the company’s latest IoT offerings to deliver high-performance, heterogeneous computing at ultra-low power consumption.

Boston Dynamics’ robots master the art of parkour

Bottom line: Boston Dynamics’ Atlas robots may be under new ownership, but they haven’t lost any of their old tricks. The robotics design company has shared a new video featuring its agile uprights tackling an obstacle course. If you haven’t seen what these humanoid bots are capable of lately, it’s certainly worth a look.

They’ve ditched their tethers, aren’t annoyingly loud like they once were, and exhibit very fluid movement. Aside from a couple of minor hiccups, the run was mostly flawless.

It’s even more impressive when you realize that the bots are adapting to their environment on the fly; none of their movements are “pre-programmed.”

Implantable “neurograins” may be the key to mind-controlled tech

A new kind of brain-computer interface (BCI) that uses neural implants the size of a grain of sand to record brain activity has been proven effective in rats — and one day, thousands of the “neurograins” could help you control machines with your mind.

Mind readers: BCIs are devices (usually electrodes implanted in the skull) that translate electrical signals from brain cells into commands for machines. They can allow paralyzed people to “speak” again, control robots, type with their minds, and even regain control of their own limbs.

Most of today’s interfaces can listen to just a few hundred neurons — but there are approximately 86 billion neurons in the brain. If we could monitor more neurons, in more places in the brain, it could radically upgrade what’s possible with mind-controlled tech.
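To make “translating electrical signals from brain cells into commands for machines” concrete, here is a minimal, hypothetical sketch of one common approach: bin spike counts from a few hundred recording channels and apply a linear decoder to produce a 2-D cursor velocity. The channel count, weights, baseline rate, and data below are invented for illustration; this is not the neurograin team’s actual pipeline.

    import numpy as np

    N_CHANNELS = 300      # "a few hundred neurons", as with today's interfaces
    BIN_SECONDS = 0.05    # decode from 50 ms bins of spike counts

    rng = np.random.default_rng(0)

    # Hypothetical decoder weights: in a real BCI these would be fit by
    # regressing recorded firing rates against intended movement during a
    # calibration session.
    W = rng.normal(scale=0.01, size=(2, N_CHANNELS))  # firing rates -> (vx, vy)
    BASELINE_HZ = 100.0                               # assumed resting rate

    def decode_velocity(spike_counts):
        """Map one bin of spike counts to a 2-D cursor velocity command."""
        rates_hz = spike_counts / BIN_SECONDS
        return W @ (rates_hz - BASELINE_HZ)

    # Simulate one 50 ms bin of activity and decode it.
    spikes = rng.poisson(lam=6, size=N_CHANNELS)
    vx, vy = decode_velocity(spikes)
    print(f"cursor velocity command: ({vx:+.3f}, {vy:+.3f})")

Recording from thousands of neurograins spread across the cortex would, in principle, just widen the input to a decoder like this while adding far more spatial coverage.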

New Inflatable Low-Cost Prosthetic Allows Users to Feel

The field of neuroprosthetics emerged in its earliest form in the 1950s, but it’s only now starting to show its true potential, with devices that allow amputees to feel and manipulate their surroundings.

A group of researchers from MIT and Shanghai Jiao Tong University recently collaborated with the goal of making a neuroprosthetic hand that allows users to feel, and of doing so in a more accessible way. The result is an inflatable robotic hand that costs only $500 to build, making it much cheaper than comparable devices, a post from MIT reveals.

The researchers behind the new prosthetic say their device bears an uncanny resemblance to the inflatable robot in the animated film Big Hero 6. The prosthetic uses a pneumatic system to inflate and bend the fingers of the device, allowing its user to grasp objects, pour a drink, shake hands, and even pet a cat if they so wish. It does all of this via a software program — detailed in the team’s paper in the journal Nature Biomedical Engineering — that “decodes” the EMG signals the brain is sending to an injured or missing limb.
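As a rough, hypothetical illustration of what “decoding” EMG signals can involve (not the team’s actual algorithm, which is detailed in their paper), the sketch below rectifies and averages a short EMG window from a few residual-limb electrodes, picks the nearest calibrated grasp pattern, and maps it to per-finger pneumatic pressure setpoints. The electrode count, templates, grasp set, and pressures are all invented.

    import numpy as np

    N_ELECTRODES = 4            # assume a few surface electrodes on the residual limb
    FS_HZ = 1000                # assumed sampling rate
    WINDOW = int(0.2 * FS_HZ)   # decode from 200 ms windows

    # Hypothetical calibration: mean rectified-EMG pattern per grasp, recorded
    # while the user attempts each grasp during setup.
    GRASP_TEMPLATES = {
        "rest":  np.array([0.05, 0.05, 0.05, 0.05]),
        "pinch": np.array([0.60, 0.20, 0.10, 0.10]),
        "power": np.array([0.50, 0.55, 0.45, 0.40]),
    }

    def emg_envelope(window):
        """Mean absolute value per channel: a standard, simple EMG feature."""
        return np.mean(np.abs(window), axis=0)

    def decode_grasp(window):
        """Pick the calibrated grasp whose template is nearest the current envelope."""
        feat = emg_envelope(window)
        return min(GRASP_TEMPLATES, key=lambda g: np.linalg.norm(feat - GRASP_TEMPLATES[g]))

    def finger_pressures(grasp):
        """Map a grasp label to per-finger pneumatic pressure setpoints (made up)."""
        table = {
            "rest":  [0.0, 0.0, 0.0, 0.0, 0.0],
            "pinch": [0.8, 0.8, 0.0, 0.0, 0.0],   # thumb + index only
            "power": [0.9, 0.9, 0.9, 0.9, 0.9],   # all fingers flex
        }
        return table[grasp]

    # Simulated 200 ms of EMG with strong activity on the first channel -> "pinch".
    rng = np.random.default_rng(1)
    window = rng.normal(scale=[0.6, 0.2, 0.1, 0.1], size=(WINDOW, N_ELECTRODES))
    grasp = decode_grasp(window)
    print(grasp, finger_pressures(grasp))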

This Guy Didn’t Want to Get a Haircut in Public, So He Built a Robot Barber

Circa 2020 o.o


While some states have partially reopened and loosened restrictions on barber shops and hair salons, not everyone is ready to head out in public for a haircut just yet. That means many people around the world are still sporting shaggy quarantine cuts.

To tame his own wild mane, Shane Wighton, an engineer and YouTuber known for his channel Stuff Made Here, built the ultimate hairstylist: a robotic barber.

What AI researchers can learn from the self-assembling brain

One idea that hasn’t gotten enough attention from the AI community is how the brain creates itself, argues Peter Robin Hiesinger, Professor of Neurobiology at the Free University of Berlin (Freie Universität Berlin).

In his book The Self-Assembling Brain, Hiesinger suggests that instead of looking at the brain from an endpoint perspective, we should study how information encoded in the genome is transformed to become the brain as we grow. This line of study might help discover new ideas and directions of research for the AI community.

The Self-Assembling Brain is organized as a series of seminar presentations interspersed with discussions between a robotics engineer, a neuroscientist, a geneticist, and an AI researcher. The thought-provoking conversations help illuminate each field’s views, and blind spots, on topics related to the mind, the brain, intelligence, and AI.
