Watch humanoid robots dance seamlessly with humans, thanks to an AI-powered motion-tracking upgrade

Video: ExBody2: Advanced Expressive Humanoid Whole-Body Control (YouTube)


In the near future, humanoid robots will be able to move much more naturally – and even dance like us – thanks to a new software platform for tracking human movements.

Developed by scientists from the University of California, San Diego; the University of California, Berkeley; the Massachusetts Institute of Technology; and Nvidia, ExBody2 is a technology that enables humanoid robots to perform realistic movements learned from detailed motion-capture recordings of human movement.

The researchers hope that in the future, humanoid robots will be able to perform a much wider range of tasks by more accurately imitating human movements. For example, this learning method could help robots perform roles that require precise movements, such as retrieving items from shelves or carefully moving around people or other machines.

ExBody2 works by taking simulated movements derived from human motion capture and converting them into motion data suitable for a robot to replicate. The platform can reproduce complex movements, allowing robots to move less rigidly and adapt to different tasks without extensive retraining.
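To make that conversion concrete, here is a minimal Python sketch of the general idea: mapping one frame of human joint angles onto a robot's joints while respecting the robot's mechanical limits. The joint names, the mapping, and the `retarget_frame` helper are all hypothetical illustrations, not ExBody2's actual pipeline.

```python
import numpy as np

# Hypothetical correspondence between human mocap joints and robot joints;
# a real retargeting pipeline is considerably more involved than this.
HUMAN_TO_ROBOT_JOINTS = {
    "left_shoulder": "l_shoulder_pitch",
    "left_elbow": "l_elbow",
    "right_shoulder": "r_shoulder_pitch",
    "right_elbow": "r_elbow",
}

def retarget_frame(human_angles: dict, joint_limits: dict) -> dict:
    """Map one frame of human joint angles (radians) onto robot joints,
    clamping each target to the robot's mechanical limits."""
    robot_targets = {}
    for human_joint, robot_joint in HUMAN_TO_ROBOT_JOINTS.items():
        angle = human_angles.get(human_joint, 0.0)
        lo, hi = joint_limits[robot_joint]
        robot_targets[robot_joint] = float(np.clip(angle, lo, hi))
    return robot_targets
```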

All of this is done through reinforcement learning, a subset of machine learning in which a model learns by trial and error across many simulated attempts. Each attempt is given a positive or negative score to “reward” the desired outcome, which in this context meant reproducing the reference movements accurately without compromising the robot’s stability.
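As a rough illustration of that scoring scheme, the toy reward below combines a motion-tracking term with a stability penalty. The weights, inputs, and the `tracking_reward` function itself are invented for illustration; they are not ExBody2's published reward terms.

```python
import numpy as np

def tracking_reward(robot_joint_pos, target_joint_pos,
                    base_tilt_rad, fall_threshold=0.8):
    """Toy reward in the spirit described above: score how closely the
    robot tracks the reference pose, penalize instability, and flag a
    fall so the training episode can be terminated."""
    # Reward close tracking of the reference motion (exp of negative error).
    tracking_error = np.linalg.norm(robot_joint_pos - target_joint_pos)
    r_tracking = np.exp(-2.0 * tracking_error)

    # Penalize torso tilt so accurate imitation never trades away balance.
    r_stability = -0.5 * base_tilt_rad ** 2

    fallen = base_tilt_rad > fall_threshold  # episode ends if the robot falls
    return r_tracking + r_stability, fallen
```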

The platform can also process short videos, such as a few seconds of dancing, and synthesize new frames of movement to use as references so that robots can perform longer sequences.
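A minimal sketch of that idea, assuming some learned next-pose model sits behind a `predict_next` callback (a placeholder, not anything ExBody2 exposes): start from the short clip and append synthesized frames one at a time.

```python
import numpy as np

def extend_reference_motion(clip: np.ndarray, extra_frames: int,
                            predict_next) -> np.ndarray:
    """Autoregressively grow a short motion clip (frames x joints) into a
    longer reference. `predict_next` stands in for whatever model
    synthesizes the next pose from recent context; this loop only shows
    the overall shape of turning seconds of motion into a longer reference."""
    frames = list(clip)
    for _ in range(extra_frames):
        context = np.stack(frames[-8:])   # last few poses as model input
        frames.append(predict_next(context))
    return np.stack(frames)
```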

Dancing with robots

In a video posted on YouTube, a robot trained with ExBody2 dances, spars, and performs physical exercises alongside a human. The robot also mimics a researcher’s movements in real time using additional code called “HybrIK: Hybrid Analytical-Neural Inverse Kinematics for Body Mesh Recovery,” developed by the Machine Vision and Intelligence Group at Shanghai Jiao Tong University.

At the moment, the ExBody2 dataset mainly focuses on upper-body movements. In a study uploaded to the preprint server arXiv on December 17, 2024, the researchers behind the platform explained that this is due to concerns that introducing too much lower-body movement could lead to instability.

“Tasks that are too simple may limit the learning model’s ability to generalize to new situations, while tasks that are too complex may exceed the robot’s capabilities, leading to ineffective learning results,” they noted. “Therefore, part of our dataset preparation involves excluding or modifying recordings that contain complex lower-body movements that are beyond the robot’s capabilities.”
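A crude version of that curation step might look like the filter below: reject any recording whose lower-body joint velocities exceed what the robot can plausibly track. The velocity threshold and the `keep_clip` helper are illustrative assumptions, not the paper's actual criteria.

```python
import numpy as np

def keep_clip(lower_body_vel: np.ndarray, max_speed: float = 3.0) -> bool:
    """Keep a recording only if its lower-body joint velocities
    (frames x joints, rad/s) never exceed the robot's assumed limit."""
    return bool(np.abs(lower_body_vel).max() <= max_speed)
```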

The researchers' dataset includes more than 2,800 movements, 1,919 of which were taken from the Archive of Motion Capture as Surface Shapes (AMASS) dataset. AMASS is a large collection of human movements, containing more than 11,000 individual motions and 40 hours of detailed motion data, designed for deep learning, in which a neural network is trained on large amounts of data to identify or reproduce patterns.

Having confirmed the effectiveness of ExBody2 in reproducing human-like movements in humanoid robots, the team now aims to tackle the challenge of achieving these results without having to manually curate datasets to ensure that only relevant information is available to the platform. The researchers suggest that automated dataset collection could help streamline this process in the future.

Source: www.livescience.com
