Brussels wants to better protect victims of damage linked to artificial intelligence

Posted Sep 28, 2022, 4:25 PM. Updated Sep 28, 2022, 5:01 PM.

A drone crashing into the roof of a house. A self-driving car that knocks over a pedestrian. Who is responsible when a technology causes damage, and what compensation can injured parties claim? The European Commission intends to answer these complex questions with a proposed directive establishing new liability rules for artificial intelligence (AI), unveiled on Wednesday.

Having become a key technology in recent years, AI has entered every corner of our lives. It appears in every field, from transport to credit approval, by way of fitness programs, health care, and nutrition.

However, there is currently no specific legal framework governing liability for AI. European citizens are therefore poorly equipped to obtain compensation when damage occurs. And such cases are multiplying, as illustrated by the emblematic incident this summer in Russia, where a chess-playing robot broke the finger of a seven-year-old child.

“New technologies such as drones, for example, can only work if the consumer feels protected and safe, which the current rules cannot ensure,” explained Didier Reynders, the Commissioner for Justice, adding that the directive will cover “all types of damage” and “make it easier for victims to initiate proceedings”. The Commission is clear: European citizens must enjoy the same level of compensation and protection as in cases not involving AI.

Lightening the burden of proof

Concretely, the text simplifies the legal process so that a victim can more easily prove that a person's fault caused the damage. Without that evidence, the victim cannot obtain compensation, and it is not always easy to establish responsibility for an accident when a connected object or a robot is involved.

The text also lightens the victim's burden of proof by introducing a “presumption of causation” in cases of fault where the causal link with the AI's performance “seems reasonably probable”. “The objective is to avoid the victim having to describe and demonstrate the malfunction of the AI, that is, what happens inside the black box,” explains a senior European official. This could be useful, for example, in cases of discrimination during a recruitment process that uses AI technology.

Right of access to company and supplier information

And that’s not all. The new directive also grants victims a right of access to information held by companies and suppliers about high-risk artificial intelligence systems, to help determine where liability lies in the event of damage: whether it stems from a fault or an omission by, for example, the supplier, the developer, or a user of the AI.

“If a sidewalk-cleaning robot hits a stroller and injures a baby, the parents can demand that the manufacturer shed full light on how the robot was designed and on the safety precautions taken,” illustrates a senior official. “If it does not provide this information, faulty behavior will be presumed and it will be held liable for the damage.”

“Companies’ confidential commercial data will be protected,” the directive’s drafters assure. In the future, victims should thus be able to bring claims against “all the actors in the supply chain and not just the manufacturers,” Didier Reynders specified.

This AI directive – which still needs to be adopted by the European Parliament and the Council – complements another directive, dating from 1985 and now being modernized, on manufacturers’ strict liability for defective products (garden chairs, medicines, cars, AI-based products).

With these new rules, the Commission also hopes to encourage companies to comply with stricter safety rules, thereby strengthening public confidence in AI and promoting its development.
