As legislation currently stands, the principle of liability is the following: laws govern only the conduct of humans, along with the machines they use and the pets they own. How, then, is the law to govern Artificial Intelligence (AI), which mimics human behavior? Who is responsible when AI violates the law?

Although this question arose at least 30 years ago in the case United States v. Athlone Industries, whether AI can be held liable for violations of the law is still debated. In that case, the court held that “robots cannot be sued” and discussed how the manufacturer of a defective batting practice pitching machine was liable for civil penalties for the machine’s defect.

However, AI has developed in far more complex ways since the 1980s and is now part of our everyday lives. People have smartwatches, Roombas, connected fish tanks… The new hot item is the autonomous car. Although some countries, Germany and the United Kingdom for example, have published guidelines regarding autonomous cars and their use, there is no legislation specifically applicable to AI technology. Who is responsible if an autonomous car is involved in an accident? How can one prove who was driving?

Mathias Avocats presents the various solutions being considered and applied.

Should AI be considered a good?

Traditionally, AI is treated as a good and, as such, the owner or user is responsible for any wrongdoing or harm it causes. The same goes for any defect arising while the machine or technology is in use. However, if a malfunction is due to a coding error, the programmer could be held liable. In practice, this will be hard to prove, particularly where the software is open source. One solution could be to publish guidelines for programmers and to clearly define the contractual duties and obligations of each party (owner, user, programmer, manufacturer…).

In this context, another question emerges: if several individuals can be held liable, are they jointly liable or jointly and severally liable? And what becomes of the owner’s or user’s liability? One could consider that AI comes with inherent risks and that the user or owner may not be able to recover the full amount of the damage if he or she is partly responsible. Tort law offers solutions, as rules could be drawn from those governing defective products.

It must be underlined that there is no unanimous answer. Indeed, each country will have to determine the rules applicable to AI according to its existing legislation. Moreover, the issues addressed in this article are still under debate. On February 16th, 2017, the European Parliament issued a resolution with recommendations to the Commission on civil law rules on robotics.

The issue becomes more complex when considering deep learning AI.

An independent legal identity for AI?

AI technology is not uniform. Indeed, some AI technologies are capable of deep learning, meaning that the machine or technology can develop new skills by learning innovative ways to parse data.

Most of the issues arise with machines capable of deep learning, since machines have traditionally been treated as inanimate objects. If a machine can undertake a new task that was not requested by its owner or user, who is responsible for the resulting harm or damage?

If AI were deemed to have a separate legal identity, vicarious liability could apply. For example, pet owners are responsible for the harm caused by their pets, and animals, like AI, act autonomously. However, granting AI a separate legal identity would require specific legislation to be drafted. New rules were once created for corporations; the same could be done for AI.

What is the next step?

There is no legal vacuum regarding AI’s liability. Current laws and case law can be applied. However, liability is still assessed on a case-by-case basis, and existing law does not offer a general framework. Given the constant evolution of AI, specific legislation will most likely be needed. Many countries are currently working on drafting coherent legislation. The European Union, for instance, has published a charter on robotics. Mathias Avocats will keep you informed of any future developments.