1. Field of the Description
The present description relates, in general, to legged robots (e.g., biped humanoid robots or other legged robots such as quadrupeds) that may be implemented as floating-base humanoid robots (e.g., robots with no link or attachment to a support) and to control of the force-controlled joints of such robots. More particularly, the present description relates to methods for controlling floating-base humanoid robots that account for strict contact force constraints (and to robots operating with a controller implementing such control methods).
2. Relevant Background
A biped humanoid robot is a robot with a structure similar to that of the human body. Humanoid robots have been designed to interact with environments, tools, and machines made for humans, and they often are adapted for safely and effectively interacting with human beings. In general, humanoid robots have a torso with a head, two arms, and two legs, each leg with some form of foot such that the robot can walk on planar surfaces, climb steps, and so on (e.g., these humanoid robots are “bipeds” as are humans). Humanoid robots may be formed with many rigid links that are interconnected by joints, and each joint is operated or positioned by applying a force or torque to move and position the robot. Similarly, other legged robots, such as those with three, four, or more legs, also may walk utilizing force-controlled movement of their legs.
In order to interact with human environments, humanoid robots require safe and compliant control of their force-controlled joints. In this regard, each robot is provided with a controller that is programmed to determine desired motions and output forces (contact forces) and, in response, to output joint torques to effectively control movement and positioning of the humanoid robot. However, it has often proven difficult to achieve desired results with force-controlled robots because, while performing a task in a complex environment, the robot may encounter uneven ground, static and dynamic obstacles, and even humans. The robot has to remain balanced as it stands in one location and also as it steps and moves within the environment. Humanoid robots are expected by many to work with humans in home and office environments. Hence, it is desirable for such robots to have human-like motions so that human co-workers can easily infer the robot's intentions and predict its future movements for safe and smooth interactions.
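The controller's basic task described above (mapping a desired joint motion to an output joint torque subject to a strict limit) can be sketched as follows. This is a minimal, hypothetical illustration only; the gain values, function name, and the simple proportional-derivative (PD) law with a torque clamp are assumptions for illustration and are not taken from the description above.

```python
# Hypothetical sketch of a force-controlled joint servo: a PD control law
# converts a desired joint position/velocity into an output torque, and the
# torque is clamped to a strict limit (a stand-in for the kind of actuator
# or contact-force constraint the controller must respect).

def pd_joint_torque(q_des, qd_des, q, qd, kp=50.0, kd=5.0, tau_max=40.0):
    """Return a joint torque driving (q, qd) toward (q_des, qd_des).

    q_des, qd_des: desired joint position (rad) and velocity (rad/s).
    q, qd: measured joint position and velocity.
    kp, kd: illustrative proportional and derivative gains.
    tau_max: strict torque limit; output is clamped to [-tau_max, tau_max].
    """
    tau = kp * (q_des - q) + kd * (qd_des - qd)
    return max(-tau_max, min(tau_max, tau))
```

For example, with the joint 0.2 rad short of its target and drifting slightly, the law returns a moderate corrective torque; a large tracking error instead saturates at the 40.0 N·m clamp rather than commanding an unsafe torque.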
However, programming of controllers and control methods for humanoid robots has not proven to be straightforward because the robots tend to have complex structures made up of many joints. One approach to addressing this control problem is to teach motions to the robot through human demonstration, which is often referred to as learning from demonstration or imitation learning. This approach allows a programmer to simply demonstrate the motion with a human actor or model while the robot observes the motion. A learning algorithm takes the observed motions of the human actor and then makes adjustments to the motion so that the robot can achieve the task using its own body (with its joints, drivers/actuators, and the like).
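The adjustment step described above (mapping an observed human motion onto the robot's own body) can be illustrated with a deliberately simple kinematic retargeting sketch. The function name and the clamp-to-joint-limits strategy are hypothetical simplifications; real imitation-learning pipelines also account for the robot's dynamics and, as noted below, balance, which this sketch ignores.

```python
# Hypothetical sketch of a kinematic "learning from demonstration"
# adjustment: a demonstrated human joint-angle trajectory is mapped onto a
# robot whose joints have different (here, narrower) motion ranges by
# clamping each joint sample to the robot's joint limits.

def retarget_trajectory(human_traj, joint_limits):
    """Clamp each joint sample of a demonstrated trajectory to robot limits.

    human_traj: list of frames, each frame a list of joint angles (rad).
    joint_limits: list of (lo, hi) tuples, one per joint.
    Returns the adjusted trajectory the robot can execute kinematically.
    """
    robot_traj = []
    for frame in human_traj:
        robot_traj.append([
            max(lo, min(hi, angle))
            for angle, (lo, hi) in zip(frame, joint_limits)
        ])
    return robot_traj
```

A frame whose angles already lie within the robot's ranges passes through unchanged, while out-of-range samples are pulled to the nearest limit, producing a motion that is feasible for the robot's own joints even though it was demonstrated by a human.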
Unfortunately, most of the work based on this training approach has only considered the kinematics of motions and, therefore, cannot be directly applied to robots and motions that require balancing. In other words, these human-motion-tracking approaches are not directly useful for controlling robots, such as floating-base humanoid robots, to stand or to walk because these motions require balancing over a support surface (e.g., the ground over which the robot is walking while being controlled).