In a landmark decision, a Canadian tribunal has ordered Air Canada to pay a passenger hundreds of dollars after he received incorrect information from the airline’s online chatbot. The case highlights growing concerns about the reliability of artificial intelligence in customer service and the legal responsibilities of the companies that deploy it.
Jake Moffatt took Air Canada to a small-claims tribunal after the airline refused to refund his Vancouver-to-Toronto flight tickets. Moffatt booked the trip following the death of his grandmother in November last year, and he asked about Air Canada’s bereavement fares, a discounted rate for travelers who have lost an immediate family member. The chatbot on the airline’s official website inaccurately advised him that he could claim the bereavement discount within 90 days of purchase.
Relying on this guidance, Moffatt booked his flights, which totaled over CA$1,640. But when he followed the chatbot’s advice and submitted the required documentation within the specified period, Air Canada denied his refund request, citing its policy that bereavement rates cannot be applied retroactively after tickets are purchased, in stark contradiction of the information the chatbot had provided.
Moffatt took the dispute to the tribunal, arguing that Air Canada’s negligence and misinformation caused him unnecessary financial strain. Tribunal member Christopher Rivers scrutinized Air Canada’s defense that it should not be held accountable for the chatbot’s misinformation. His judgment emphasized that the airline is responsible for all content on its website, dismissing the notion that a chatbot could be considered a separate legal entity from the airline.
Air Canada also argued that the chatbot had provided a link to more detailed fare information, but Rivers rejected this defense as well, noting that customers cannot reasonably be expected to verify the chatbot’s answers against other sections of the same website. The tribunal concluded that Air Canada had failed to ensure the accuracy of its chatbot and ordered the airline to pay Moffatt CA$812.02, including damages.
This case serves as a cautionary tale about the reliance on automated chatbots for critical customer service functions. It underscores the necessity for companies to rigorously test and monitor AI tools to prevent misinformation. As AI continues to integrate into customer service, ensuring the accuracy of these automated systems is paramount to maintain trust and avoid legal repercussions.
Air Canada has agreed to comply with the tribunal’s decision, closing the case. The ruling reinforces that companies deploying AI-powered customer service remain responsible for the information those systems provide, and that failing to keep that information reliable can carry real legal consequences.