Air Canada must honor refund policy invented by airline’s chatbot

Key Points:

  • Air Canada was forced to give a partial refund to a passenger due to misleading information provided by its chatbot regarding bereavement travel policy.
  • The passenger, Jake Moffatt, filed a small claims complaint, leading to a ruling in his favor for a partial refund and additional damages.
  • Tribunal member Christopher Rivers criticized Air Canada’s defense, stating that the airline should take responsibility for all information on its website, including that provided by its chatbot.


Summary:

Air Canada was compelled to offer a partial refund to a grieving passenger who received erroneous information from the airline’s chatbot regarding bereavement travel policies. The passenger, Jake Moffatt, sought clarification on Air Canada’s bereavement rates after his grandmother’s death and was incorrectly advised by the chatbot that he could book a flight at the regular fare and then request a partial refund under the bereavement policy within 90 days.


Although Moffatt followed the chatbot’s guidance, Air Canada refused the refund and offered only a $200 coupon for future travel. Moffatt declined the offer and filed a small claims complaint with Canada’s Civil Resolution Tribunal. Air Canada attempted to absolve itself of responsibility, arguing that the chatbot should be considered a separate legal entity accountable for its own actions.


In a groundbreaking decision, the Tribunal ruled in favor of Moffatt, deeming Air Canada liable for the chatbot’s inaccuracies. Tribunal member Christopher Rivers criticized Air Canada’s defense, highlighting the inconsistency in the airline’s information across different platforms. Rivers ordered a refund of $650.88 CAD and additional compensation for interest and tribunal fees.


Following the ruling, Air Canada confirmed it would comply, signaling the resolution of the matter. Notably, its chatbot support appeared to have been disabled, suggesting a possible reassessment of the airline’s AI experiment. The chatbot was initially introduced to streamline customer service, particularly during flight disruptions, with ambitions to handle more complex issues over time.


Despite Air Canada’s strategic investment in AI for cost savings and customer experience improvements, the Moffatt case underscored the need for such platforms to provide accurate information. Experts noted that Air Canada might have mitigated its liability if the chatbot had warned users that its responses could be inaccurate. Rivers emphasized that Air Canada is obligated to ensure the accuracy of all information on its platforms, regardless of the source.


The Moffatt case serves as a pivotal precedent in raising accountability standards for companies using AI-driven systems, underscoring the importance of accurate and transparent customer interactions in the digital age.
