Project Duration: 2023.09 - present
In this work, we use imitation learning methods such as GAIL to enable legged robots to acquire natural movements by imitating demonstration motion clips. With these techniques, we implemented three tasks on the legged robots BRUCE and Unitree AlienGo: single-clip imitation, velocity-controllable omnidirectional locomotion, and integration of multiple motion skills.
This task is built on IsaacGymEnvs and uses GAIL to enable the bipedal robot BRUCE to mimic a single clip of reference motion. The reference motion clips are retargeted to BRUCE from the CMU Motion Capture Dataset.
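The core of a GAIL-style setup is a discriminator that distinguishes reference-motion transitions from policy transitions, whose output is turned into a style reward for the RL objective. The following is a minimal, illustrative sketch of that idea using a simple logistic discriminator over transition features; the actual implementation uses neural networks inside IsaacGymEnvs, and all names here are hypothetical.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class GAILDiscriminator:
    """Toy logistic discriminator over transition features.

    Labels: 1 for reference-motion transitions, 0 for policy transitions.
    A real GAIL discriminator would be a neural network; this linear
    version only illustrates the training loop and reward shaping.
    """

    def __init__(self, feat_dim, lr=0.1, seed=0):
        rng = np.random.default_rng(seed)
        self.w = rng.normal(scale=0.01, size=feat_dim)
        self.b = 0.0
        self.lr = lr

    def logits(self, x):
        return x @ self.w + self.b

    def update(self, ref_batch, policy_batch):
        # One gradient step on the binary cross-entropy loss.
        x = np.vstack([ref_batch, policy_batch])
        y = np.concatenate([np.ones(len(ref_batch)),
                            np.zeros(len(policy_batch))])
        p = sigmoid(self.logits(x))
        self.w -= self.lr * (x.T @ (p - y)) / len(x)
        self.b -= self.lr * np.mean(p - y)

    def style_reward(self, x, eps=1e-6):
        # GAIL-style reward: -log(1 - D(x)); it is high when the
        # policy's transitions look like the reference motion.
        d = sigmoid(self.logits(x))
        return -np.log(np.clip(1.0 - d, eps, 1.0))
```

After training, transitions resembling the reference data receive a higher style reward than off-distribution ones, which is the signal that pushes the policy toward natural motion.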
Other bipedal robots:
In this task, we provide the bipedal robot BRUCE with a series of reference locomotion clips, including walking forward, walking leftward, steering, and jogging. BRUCE learns velocity-controllable omnidirectional locomotion from these clips, with a forward velocity range of [-0.5, 1.5] m/s, a lateral velocity range of [-0.5, 0.5] m/s, and a yaw angular velocity range of [-0.8, 0.8] rad/s.
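For velocity-controllable locomotion, each episode typically samples a command within the stated ranges and rewards the policy for tracking it. Below is a small sketch of command sampling and an exponential tracking reward; the reward form and scale are common choices, not necessarily the ones used in this project.

```python
import numpy as np

# Command ranges from the task description: (vx m/s, vy m/s, wz rad/s)
VX_RANGE = (-0.5, 1.5)
VY_RANGE = (-0.5, 0.5)
WZ_RANGE = (-0.8, 0.8)

def sample_command(rng):
    """Uniformly sample a (vx, vy, wz) velocity command for one episode."""
    return np.array([
        rng.uniform(*VX_RANGE),
        rng.uniform(*VY_RANGE),
        rng.uniform(*WZ_RANGE),
    ])

def tracking_reward(cmd, measured, scale=4.0):
    """Exponential velocity-tracking reward (a common shaping choice):
    1.0 when the measured base velocity matches the command exactly,
    decaying toward 0 as the squared error grows."""
    err = np.sum((cmd - measured) ** 2)
    return np.exp(-scale * err)
```

Combined with the GAIL-style style reward, this tracking term lets the user steer the robot while the imitation term keeps the gait natural.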
In this task, we use imitation learning to enable legged robots to integrate multiple natural motion skills into a single reinforcement learning policy from a given set of reference motion clips.
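One common way to fold several skills into a single policy is to condition the network on a skill code appended to the observation, so one set of weights can reproduce several reference behaviors. The sketch below illustrates this with a one-hot skill latent; the skill names and the conditioning scheme are assumptions for illustration, not the project's exact design.

```python
import numpy as np

# Hypothetical skill set; the real project uses its own reference clips.
SKILLS = ["walk", "run", "turn", "stand"]

def skill_onehot(name):
    """Encode a skill as a one-hot latent vector."""
    z = np.zeros(len(SKILLS))
    z[SKILLS.index(name)] = 1.0
    return z

def build_observation(proprio, skill_name):
    """Concatenate proprioceptive state with the skill code so a single
    policy network can switch between motion skills at run time."""
    return np.concatenate([proprio, skill_onehot(skill_name)])
```

At deployment, switching skills then amounts to changing the one-hot code fed to the policy, without retraining.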
Although imitation learning provides a good specification for motion learning in legged robots, we still often observe unnatural phenomena in the trained policies, such as violent jitter and foot sliding (these artifacts also appear in the demos above). In the future, we will continue to investigate these phenomena in depth and try to propose novel methods that allow the robot to learn more natural movements.