The legal status of robots has emerged as a source of debate and controversy in Europe over the proposed Robotics and Artificial Intelligence directive. The controversy centers on an expression contained in a European Parliament report, which speaks of granting robots that can learn on their own an “electronic personality”.
The growing expansion of artificial intelligence brings many changes. One of the most important, already foreseen by the European institutions, is of a legal nature. If robots are increasingly sophisticated, if systems are able to learn on their own, if they are increasingly autonomous, perhaps it is time for the concept of a machine to evolve.
Until now, a machine was simply a product made by a manufacturer. True, it could contain governing software that might not have been developed by that same manufacturer. But the legal position of an industrial machine, a car or a computer was clear.
However, the introduction of artificial intelligence means that some machines can make decisions autonomously. In the future, these actions are likely to be carried out without human supervision. Moreover, these systems are now able to learn by themselves: machine learning and deep learning are popularizing technologies capable of training and improving over time.
These two aspects represent a significant leap in autonomy compared with what machines could do until now. And both capabilities, applied to robotics (robots are increasingly able to take on new jobs), have sown a dilemma among experts: should robots be given legal personality?
The European Commission is preparing a robotics and artificial intelligence directive that has drawn strong criticism. The controversy lies in an expression contained in a report by the European Parliament, which speaks of granting robots that can learn on their own an “electronic personality”.
This terminology would encapsulate the entire legal status of robots: recognizing an “electronic personality” means recognizing an individuality and, consequently, a legal responsibility. However, in a letter sent to the president of the European Commission, Jean-Claude Juncker, more than 200 specialists in the field from 14 European countries have expressed their opposition to this measure.
The legal status of robots, formulated in this way, would mean that an autonomous robot could be held individually liable for any harm it causes. In essence, it would equate autonomous robots with the legal personality that companies also have: they may be held legally liable if they cause damage or commit misconduct in the course of their activity.
Experts in artificial intelligence, on the other hand, stress that this formula of electronic personality would void manufacturers’ liability. They point out that this legal status is not about guaranteeing rights but about managing obligations.
Images: Ryan Somma, mikecogh