Court rules AI chatbot not responsible for its own actions

In a recent tribunal case, Air Canada tried to claim it wasn't responsible for giving inaccurate advice to a customer because the advice came from an AI chatbot.

Scott Bicheno

February 19, 2024

While this isn't a telecoms story in itself, AI-powered chatbots are an increasingly important part of the customer service mix, which represents a major overhead for operators. There must be great temptation to replace people with AI as much as possible in customer service interactions, as the potential cost savings are significant. But those savings can easily be wiped out by negative outcomes, which is what makes this story noteworthy.

This specific case concerns Air Canada's policy of offering special reduced 'bereavement fares' for people travelling to funerals. According to court documents, Jake Moffatt was advised by Air Canada's chatbot that he could book and pay for full-price flights, then apply for the bereavement reduction retrospectively. It turns out the AI got that one wrong, however, and Moffatt's subsequent claim was rejected.

He was understandably unhappy with that outcome and took Air Canada to small claims court to pursue his discount. Remarkably, Air Canada's main defence was that 'it cannot be held liable for the information provided by the chatbot', and on the back of that defence it lost the case. Here's the part of the court document that encapsulates Air Canada's argument and how the tribunal interpreted it:

'Air Canada argues it cannot be held liable for information provided by one of its agents, servants, or representatives – including a chatbot. It does not explain why it believes that is the case. In effect, Air Canada suggests the chatbot is a separate legal entity that is responsible for its own actions. This is a remarkable submission. While a chatbot has an interactive component, it is still just a part of Air Canada's website. It should be obvious to Air Canada that it is responsible for all the information on its website. It makes no difference whether the information comes from a static page or a chatbot.'

Had the court accepted the claim that an AI chatbot is a separate legal entity responsible for its own actions, the precedent would have been truly dystopian. Not only would that completely exculpate the chatbot's owner, it would also deny claimants any legal recourse, since chatbots have no means to pay compensation or be subjected to any form of punishment. What next, AI prisons?

By choosing such an absurd and delusional defence strategy, Air Canada may have done the world a favour. As things currently stand, AI bots are still just sophisticated search engines. But even if we eventually achieve general AI that can actually 'think' for itself, it will still be programmed and (hopefully) controlled by people. Those people must always be accountable for everything it does, and any attempt to avoid that responsibility should continue to be laughed out of court.

About the Author

Scott Bicheno

As the Editorial Director of Telecoms.com, Scott oversees all editorial activity on the site and also manages the Telecoms.com Intelligence arm, which focuses on analysis and bespoke content.
Scott has been covering the mobile phone and broader technology industries for over ten years. Prior to Telecoms.com, Scott was the primary smartphone specialist at industry analyst Strategy Analytics. Before that, he was a technology journalist, covering the PC and telecoms sectors from a business perspective.
Follow him @scottbicheno
