AI Guides Robot on the ISS for the First Time

Dr. Somrita Banerjee: “This is the first time AI has been used to help control a robot on the ISS. It shows that robots can move faster and more efficiently without sacrificing safety, which is essential for future missions where humans won’t always be able to guide them.”


How can an AI robot help improve human space exploration? This is what a recent study presented at the 2025 International Conference on Space Robotics hopes to address, as a team of researchers investigated new methods for enhancing AI robots in space. The study could help scientists improve human-robot collaboration, especially as humanity begins settling on the Moon and, eventually, Mars.

For the study, the researchers examined how a technique called machine-learning-based warm starts could be used to improve robot autonomy. To accomplish this, they tested their algorithm on the Astrobee free-flying robot aboard the International Space Station (ISS), where it guided the robot as it floated through the station in microgravity. The goal was to ascertain whether Astrobee could navigate the ISS without human intervention, relying only on its algorithm to work out how to traverse the station safely. In the end, the researchers found that Astrobee successfully navigated the tight confines of the ISS with little need for human intervention.
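The core idea of a warm start is easy to sketch: instead of handing a trajectory optimizer a naive initial guess, a learned model proposes a near-optimal one, so the solver converges in far fewer iterations. Below is a minimal, illustrative Python sketch of that pattern; the toy cost function, the straight-line stand-in for the learned model, and all names here are our assumptions, not Astrobee's actual software.

```python
import numpy as np
from scipy.optimize import minimize

def trajectory_cost(x, goal):
    # Toy cost: path smoothness plus distance of the endpoint from the goal.
    waypoints = x.reshape(-1, 3)                       # N waypoints in 3-D
    smoothness = np.sum(np.diff(waypoints, axis=0) ** 2)
    terminal = np.sum((waypoints[-1] - goal) ** 2)
    return smoothness + 10.0 * terminal

def cold_start(n_waypoints):
    # Naive initial guess: every waypoint at the origin.
    return np.zeros(n_waypoints * 3)

def warm_start(start, goal, n_waypoints):
    # Stand-in for the learned model: a straight-line interpolation.
    # In the ML-based approach, this guess would come from a network
    # trained on previously solved trajectories.
    return np.linspace(start, goal, n_waypoints).ravel()

start, goal = np.zeros(3), np.array([2.0, 1.0, 0.5])
for name, x0 in [("cold", cold_start(20)), ("warm", warm_start(start, goal, 20))]:
    res = minimize(trajectory_cost, x0, args=(goal,))
    print(f"{name} start: {res.nit} iterations, final cost {res.fun:.4f}")
```

The appeal on orbit is that fewer solver iterations mean faster replanning on limited onboard hardware, without changing the safety constraints the optimizer enforces.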

RAM is so expensive, Samsung won’t even sell it to Samsung

More hints that the Singularity really has begun. And more importantly: https://arstechnica.com/space/2025/12/after-years-of-resisti…ublic-why/

The second article describes how Elon Musk plans to take SpaceX public at a $1.5 trillion valuation so he has more money to put into AI. Of course, Musk is not the only one pouring money into AI: an estimated $1 trillion will be spent on AI data centers next year.


Due to rising prices driven by the “AI” bubble, Samsung Semiconductor reportedly refused an order from Samsung Electronics for RAM destined for new Galaxy phones.

Pneumatic-suction robot clears 75,000 lb of cargo an hour

It’ll likely be a while before we have humanoid robots taking over our household chores, but what you can count on sooner is seeing more robots in industrial settings, like factories and warehouses.

Robots already move pallets and bins of goods across warehouse floors, replacing forklifts. There are also articulated arms involved in packaging tasks, and even assembly operations.

A startup founded by Massachusetts Institute of Technology (MIT) alumni wants these bots to do some heavy lifting, literally. Pickle Robot Company pairs AI smarts, cameras, and sensors with enormous single-armed machines to unload shipping containers filled with cases weighing up to 50 lb (22.7 kg) each.
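For a rough sense of scale (our arithmetic, not a figure from the company): at the headline rate of 75,000 lb an hour, even if every case weighed the full 50 lb, the robot would be moving about 1,500 cases an hour, or roughly one case every 2.4 seconds.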

AI identifies key mpox protein for new vaccine and antibody therapies

With the help of artificial intelligence, an international team of researchers has made the first major inroad toward a new and more effective way to fight the monkeypox virus (MPXV), which causes a painful and sometimes deadly disease that can be especially dangerous for children, pregnant women and immunocompromised people.

Reporting in the journal Science Translational Medicine, the team found that when mice were injected with a viral surface protein recommended by AI, the animals produced antibodies that neutralized MPXV, suggesting the breakthrough could be used in a new mpox vaccine or antibody therapy.

In 2022, mpox began to spread around the world, causing flulike symptoms and painful rashes and lesions in more than 150,000 people and leading to almost 500 deaths. Vaccines developed to fight smallpox were repurposed amid the outbreak to help the most vulnerable patients, but those vaccines are complicated and costly to produce because they are made from a whole, weakened virus.

Tumbleweed aerodynamics inspire hybrid robots for harsh terrains

A new study published in Nature Communications details a hybrid robot that combines the wind-driven mobility of tumbleweeds with active quadcopter control, offering a new paradigm for energy-efficient terrestrial exploration.

Current terrestrial exploration largely lacks systems that exploit wind for mobility, and the drag-driven robots that do exist, such as land sails and inflatable spheres, require large sizes and complex deployment.
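The size problem with purely drag-driven designs falls straight out of the drag equation, F = ½ ρ C_d A v²: the push the wind can deliver scales with cross-sectional area. A quick back-of-the-envelope sketch in Python (illustrative numbers of our choosing, not figures from the paper):

```python
RHO = 1.2   # air density at sea level, kg/m^3
CD = 0.5    # drag coefficient of a rough sphere (assumed)
PI = 3.14159

def wind_force(radius_m, wind_speed_ms):
    # Drag force F = 0.5 * rho * Cd * A * v^2, with A the
    # cross-sectional area of a spherical body.
    area = PI * radius_m ** 2
    return 0.5 * RHO * CD * area * wind_speed_ms ** 2

for r in (0.25, 0.5, 1.0):
    f = wind_force(r, 8.0)      # a stiff 8 m/s breeze
    print(f"radius {r} m -> {f:.1f} N of wind force")
```

Halving the radius cuts the available force by a factor of four, which is why drag-only rovers balloon in size, and why a hybrid that can fall back on rotors when the wind can't do the work is attractive.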

The researchers found the inspiration for their Hybrid Energy-efficient Rover Mechanism for Exploration Systems, or HERMES, in an unusual place.

New model frames human reinforcement learning in the context of memory and habits

Humans and most other animals are known to be strongly driven by expected rewards or adverse consequences. The process of acquiring new skills or adjusting behaviors in response to positive outcomes is known as reinforcement learning (RL).

RL has been widely studied over the past decades and has even been adapted to train computational models, such as deep learning algorithms. Existing models of RL suggest that this type of learning is linked to dopaminergic pathways (i.e., neural pathways that respond to differences between expected and experienced outcomes).
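That dopaminergic account maps onto a standard computational quantity: the reward prediction error, the gap between the outcome an agent expected and the one it got. Here is a minimal textbook sketch of that standard update in Python (plain Q-learning on a two-armed bandit; all numbers are arbitrary):

```python
import random

ALPHA = 0.1                    # learning rate
q = [0.0, 0.0]                 # expected reward for each action
true_p = [0.3, 0.8]            # true reward probabilities (hidden from the agent)

for trial in range(1000):
    # Epsilon-greedy choice between the two actions.
    a = random.randrange(2) if random.random() < 0.1 else q.index(max(q))
    r = 1.0 if random.random() < true_p[a] else 0.0
    delta = r - q[a]           # reward prediction error: experienced minus expected
    q[a] += ALPHA * delta      # nudge the expectation toward the outcome

print([round(v, 2) for v in q])   # q should approach [0.3, 0.8]
```

The delta term is the model's stand-in for the dopaminergic teaching signal; it is precisely this kind of simple update rule whose psychological fidelity the new work questions.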

Anne G. E. Collins, a researcher at the University of California, Berkeley, recently developed a new model of RL specific to situations in which people’s choices have uncertain, context-dependent outcomes and they try to learn the actions that will lead to rewards. Her paper, published in Nature Human Behaviour, challenges the assumption that existing RL algorithms faithfully mirror psychological and neural mechanisms.

AI headphones automatically learn who you’re talking to—and let you hear them better

Holding a conversation in a crowded room often leads to the frustrating “cocktail party problem,” or the challenge of separating the voices of conversation partners from a hubbub. It’s a mentally taxing situation that can be exacerbated by hearing impairment.

As a solution to this common conundrum, researchers at the University of Washington have developed smart headphones that proactively isolate all the wearer’s conversation partners in a noisy soundscape. The headphones are powered by an AI model that detects the cadence of a conversation and another model that mutes any voices that don’t follow that pattern, along with other unwanted background noises. The prototype uses off-the-shelf hardware and can identify conversation partners using just two to four seconds of audio.
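The description suggests a two-stage pipeline: separate the voices, then keep only the ones whose activity alternates with the wearer's, since conversation partners take turns while background talkers don't. Here is a toy Python sketch of that second stage; the scoring rule, threshold, and data are our illustration of the idea, not the UW team's actual models:

```python
import numpy as np

def turn_taking_score(wearer, speaker):
    """Score how well a speaker's voice activity alternates with the wearer's.

    Both inputs are binary voice-activity arrays over the same time window.
    Partners mostly speak in the wearer's pauses, so activity in those gaps
    minus simultaneous talking is a crude turn-taking cue.
    """
    alternation = np.mean(speaker & ~wearer)   # they talk in the wearer's gaps
    overlap = np.mean(speaker & wearer)        # both talking at once
    return alternation - overlap

# Toy 20-frame activity patterns (True = speaking).
wearer   = np.array([1,1,1,0,0,0,1,1,0,0,1,1,1,0,0,0,1,1,0,0], dtype=bool)
partner  = ~wearer                             # answers in the wearer's pauses
stranger = np.array([1,0,1,1,0,1,1,0,1,1,0,1,1,0,1,1,0,1,1,0], dtype=bool)

for name, s in [("partner", partner), ("stranger", stranger)]:
    score = turn_taking_score(wearer, s)
    keep = score > 0.3                         # threshold is arbitrary here
    print(f"{name}: score {score:.2f}, keep={keep}")
```

A real system would operate on separated audio streams and far richer cadence features, but this is the flavor of cue the researchers describe the headphones exploiting.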

The system’s developers think the technology could one day help users of hearing aids, earbuds and smart glasses to filter their soundscapes without the need to manually direct the AI’s “attention.”

Infant-inspired framework helps robots learn to interact with objects

Over the past decades, roboticists have introduced a wide range of advanced systems that can move around in their surroundings and complete various tasks. Most of these robots can effectively collect images and other data from their surroundings, using computer vision algorithms to interpret that data and plan their future actions.

In addition, many robots leverage large language models (LLMs) or other natural language processing (NLP) models to interpret instructions, make sense of what users are saying and answer them in specific languages. Despite their ability to both make sense of their surroundings and communicate with users, most robotic systems still struggle when tackling tasks that require them to touch, grasp and manipulate objects, or come in physical contact with people.

Researchers at Tongji University and the State Key Laboratory of Intelligent Autonomous Systems recently developed a new framework designed to improve the process by which robots learn to physically interact with their surroundings.
