In February 2024, British Columbia's Civil Resolution Tribunal ruled that Air Canada had to compensate Jake Moffatt, who was travelling after his grandmother's death, after the airline's support chatbot invented a bereavement-fare policy that didn't exist — telling him he could book a full-price ticket and claim a partial refund within 90 days. When Moffatt applied for the refund, Air Canada refused and argued that the chatbot was "a separate legal entity responsible for its own actions." The tribunal called that "a remarkable submission," held that the airline is responsible for all information on its website, chatbot included, and ordered it to pay C$812.02. The decision is widely cited as the first clear precedent that an AI chatbot's hallucinations are the company's problem, not the customer's.