The artificial intelligence built into ANYmal allows the robot to maintain its balance in risky situations, such as when it gets kicked.
One of the most complex aspects of robots inspired by biological forms is motor control. A car, for example, is a machine built to move like a machine: it is designed so that its construction stays simple, and the vehicle adapts to the engineer's idea. In a humanoid or dog-like robot, it is the engineer who has to find a way to adapt to a given morphology.
Animal movements are the product of millions of years of evolution, and so are the tissues that produce them. Reproducing similar schemes with materials such as plastic or metal is complicated. That is why Boston Dynamics gave its line of humanoid robots a twist: one of them was put on wheels instead of feet, and its mobility seemed far superior to that of the other models.
But imitating nature has many advantages, so it remains a branch under active exploration. One of the latest milestones belongs to ANYbotics, a spin-off of the Federal Institute of Technology in Zurich. Its achievement is to have created a robotic dog that can stay on its feet when kicked.
Stated that way, the feat may sound trivial, but behind it lies work of enormous precision. The company's researchers started by recreating the possible scenarios in a simulation environment. The simulated robotic dog was subjected to all kinds of kicks, from the side and from behind, intended to knock it down, and on a variety of terrains.
Using a machine learning technique, reinforcement learning, they got the simulated robot to sharpen its balance and its ability to stay upright after a hit. The robot tried to keep its balance over and over again, learning from every action.
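As a rough illustration of that trial-and-error loop, here is a minimal sketch in Python. Everything in it is an assumption made for the example: the single tilt-angle state, the kick and friction model, and the placeholder Controller class stand in for the actual ANYmal simulator and its learned neural-network policy.

```python
import numpy as np

rng = np.random.default_rng(0)

class KickedBalanceEnv:
    """Toy stand-in for the physics simulation: the state is a single tilt angle,
    random 'kicks' perturb it, and each episode uses a different ground friction.
    None of this reflects the real ANYmal simulator."""
    def reset(self):
        self.tilt = 0.0
        self.friction = rng.uniform(0.5, 1.0)     # crude proxy for "various terrains"
        return self.tilt

    def step(self, action):
        kick = rng.normal(0.0, 0.3)               # disturbance from the side or behind
        self.tilt += kick - 0.1 * self.friction * action
        fallen = abs(self.tilt) > 1.0
        reward = -1.0 if fallen else 1.0          # staying upright is rewarded
        return self.tilt, reward, fallen

class Controller:
    """Placeholder for the learned policy; the real robot uses a neural network
    trained with reinforcement learning."""
    def act(self, tilt):
        return np.sign(tilt)                      # naively push back against the tilt
    def update(self, tilt, action, reward, next_tilt):
        pass                                      # the learning step would go here

env, policy = KickedBalanceEnv(), Controller()
for episode in range(100):                        # try to keep balance over and over
    tilt, done = env.reset(), False
    while not done:
        action = policy.act(tilt)
        next_tilt, reward, done = env.step(action)
        policy.update(tilt, action, reward, next_tilt)   # learn from every action
        tilt = next_tilt
```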
Reinforcement learning
The technique the researchers used in simulation actually comes from psychology. When a person does something right, that action is positively reinforced with a reward; when they do it wrong, the action is discouraged so that it is avoided in the future. It is a simple scheme, though not so easy to apply, but it is the one the researchers used to make the simulated robot learn to stay upright.
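To make that reward-and-penalty scheme concrete, here is a minimal sketch of tabular Q-learning, a standard reinforcement learning algorithm, applied to a toy, discretized balance problem. The states, actions, reward values and hyperparameters are invented for illustration; this is not the method ANYbotics actually used.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy problem: 5 discrete tilt states (0 = fallen left, 4 = fallen right), 2 actions
# (push left, push right). Staying near the centre earns +1, falling earns -1.
N_STATES, N_ACTIONS, CENTER = 5, 2, 2
q_table = np.zeros((N_STATES, N_ACTIONS))
alpha, gamma, epsilon = 0.1, 0.9, 0.1            # learning rate, discount, exploration

def step(state, action):
    push = -1 if action == 0 else 1
    kick = rng.choice([-1, 0, 1])                # random disturbance
    next_state = int(np.clip(state + push + kick, 0, N_STATES - 1))
    fallen = next_state in (0, N_STATES - 1)
    reward = -1.0 if fallen else 1.0             # penalise falling, reward balance
    return next_state, reward, fallen

for episode in range(2000):
    state, done = CENTER, False
    while not done:
        # epsilon-greedy: mostly exploit the best known action, sometimes explore
        if rng.random() < epsilon:
            action = int(rng.integers(N_ACTIONS))
        else:
            action = int(np.argmax(q_table[state]))
        next_state, reward, done = step(state, action)
        # Q-learning update: rewarded actions are reinforced, penalised ones discouraged
        target = reward + gamma * np.max(q_table[next_state]) * (not done)
        q_table[state, action] += alpha * (target - q_table[state, action])
        state = next_state

print(q_table)
```

After training, the table assigns higher values to actions that push the tilt back toward the centre, which is the learned analogue of reinforcing the right move and discouraging the wrong one.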
Now considered an area of machine learning in its own right, reinforcement learning makes it possible to polish specific behaviors. The advantage ANYmal has had for learning is the simulation environment: the accuracy of this software means that when the knowledge is transferred to the physical robot, it carries over well.
A robot's resilience is an important issue, since machines will inevitably face contingencies. Boston Dynamics, seasoned in the development of humanoids, has already shown how a robot can learn to stand up after falling to the ground.
Images: ANYbotics