Air Canada Ordered to Honor AI Chatbot's Made-Up Bereavement Fare Refund Policy. Court: 'The Chatbot Is Part of the Website.' First Case of Airline Legally Liable for Its Hallucinating Bot.

In February 2024, British Columbia's Civil Resolution Tribunal ruled that Air Canada had to compensate Jake Moffatt, whose grandmother had just died, after its support chatbot invented a bereavement-fare policy that didn't exist — telling him he could book a full-price ticket and claim a bereavement refund within 90 days. When Moffatt applied, Air Canada refused and argued that the chatbot was 'a separate legal entity responsible for its own actions.' The tribunal called that 'a remarkable submission,' held that the airline is responsible for all information on its website, chatbot included, and ordered it to pay C$812.02. It's the first clean precedent that an AI chatbot's hallucinations are the company's problem, not the customer's.

Chatbot Fail · Hallucination · Legal · Lawsuit · Viral · Source
Parody site. Not affiliated with any government agency.
🦅 EST. 2024 · PUBLIC RECORD · DEPT. OF AI WEIRDNESS
U.S. Department of Artificial Intelligence Weirdness
Report #457

Filed by @ymqytower · Tool: Chatbot · [original source ↗]

Weirdness Classification
8/10 — Significantly weird
Know something weirder?

Submit your own AI incident report to the public record.

File a Report