Construction and Infrastructure Law in Canada Blog

Chatbots: who could be liable for the accuracy of the output?

Mar 1, 2024
Authors
Emma Smith

Associate, Disputes, Toronto

Paul Ivanoff

Partner, Disputes, Toronto


The use of artificial intelligence (AI) continues to grow rapidly in many spheres of our day-to-day lives. Whether for logistics, customer service, or the design and construction of a complex project, AI is being adopted at an incredible pace. This rapidly evolving field presents unique legal challenges, including questions of liability.

Despite the growing application of AI, there is a dearth of Canadian case law pertaining to the use of AI and any attendant liabilities. A recent decision of the British Columbia Civil Resolution Tribunal (Small Claims), released on February 14, 2024, addressed the liability of a company for the output of a chatbot on its website.

In Moffatt v. Air Canada,[1] a passenger and Air Canada disputed a refund for a bereavement fare. After the passing of his grandmother, Mr. Moffatt booked flights with Air Canada, relying on information from a chatbot on Air Canada’s website stating that he could apply for a reduced bereavement rate retroactively. After purchasing the tickets, Mr. Moffatt applied for the retroactive bereavement rate. Air Canada denied his request, stating that the chatbot had inaccurately informed him that the bereavement policy allowed retroactive applications. Mr. Moffatt filed a claim with the Civil Resolution Tribunal (the Tribunal) seeking a partial refund based on the difference in price between the regular fare and the alleged bereavement fare.[2]

The issue before the Tribunal was whether Air Canada negligently misrepresented the procedure for claiming bereavement fares.[3]

Air Canada argued it could not be held liable for information provided by one of its agents, servants, or representatives, including a chatbot. Air Canada further argued that the chatbot is a separate legal entity responsible for its own actions.[4] The Tribunal expressly rejected these arguments, stating: “[w]hile a chatbot has an interactive component, it is still just a part of Air Canada’s website. It should be obvious to Air Canada that it is responsible for all the information on its website. It makes no difference whether the information comes from a static page or a chatbot.”[5]

Given their commercial relationship as service provider and consumer, the Tribunal found that Air Canada owed the passenger, Mr. Moffatt, a duty of care. The Tribunal noted that the applicable standard of care requires a company to take reasonable care to ensure its representations are accurate and not misleading.[6] The Tribunal found that Air Canada breached this duty, as the airline failed to exercise reasonable care to ensure its chatbot was accurate.[7]

The Tribunal also rejected Air Canada’s submission that the correct information was available elsewhere on its website, holding that customers should not have to double-check information found on one part of a website against another.[8] Ultimately, the Tribunal concluded that the company was liable for negligent misrepresentation in the circumstances.[9]

Though this decision stems from the British Columbia Civil Resolution Tribunal (Small Claims), the case raises interesting issues regarding the use of AI. As participants in the design and construction industry expand their use of AI, courts across the country will increasingly be called upon to determine the myriad issues surrounding it.


[1] Moffatt v. Air Canada, 2024 BCCRT 149 [Moffatt].

[2] Moffatt, para. 3.

[3] Moffatt, para. 11.

[4] Moffatt, para. 27.

[5] Moffatt, para. 27.

[6] Moffatt, para. 26.

[7] Moffatt, para. 28.

[8] Moffatt, para. 28.

[9] Moffatt, para. 32.