Air Canada had to refund a ticket after its chatbot invented a policy

Air Canada was forced to issue a partial refund to a passenger who was misled by the airline’s chatbot, which incorrectly explained the refund policy for travel prompted by a death in the family.

When Jake Moffatt’s grandmother died, he immediately went to Air Canada’s website to book a flight from Vancouver to Toronto. Unsure how Air Canada’s bereavement fares worked, Moffatt asked the airline’s chatbot for an explanation.

The chatbot provided inaccurate information, encouraging Moffatt to book a flight immediately and then request a refund within 90 days. In fact, Air Canada’s policy states that the airline will not refund bereavement travel after the flight has been booked. Moffatt followed the chatbot’s advice and requested a refund, but was surprised when the request was rejected.

For months, Moffatt tried to convince Air Canada that he was owed a refund, sharing a screenshot of the chatbot conversation that clearly stated:

If you need to travel immediately or have already traveled and would like to submit your ticket for a discounted bereavement fare, please do so within 90 days of the date your ticket was issued by completing our ticket refund request form.

Air Canada argued that because the chatbot’s response linked elsewhere to a page with the actual bereavement travel policy, Moffatt should have known that bereavement fares cannot be claimed retroactively. Instead of a refund, the best Air Canada offered was to update the chatbot and give Moffatt a $200 voucher for a future flight.

Unsatisfied with this offer, Moffatt rejected the voucher and filed a complaint with British Columbia’s Civil Resolution Tribunal.

According to Air Canada, Moffatt should never have trusted the chatbot, and the airline should not be held liable for the chatbot’s misleading information because, as the tribunal’s decision puts it, Air Canada essentially argued that “the chatbot is a separate legal entity that is responsible for its own actions.”

Tribunal member Christopher Rivers, who ruled in Moffatt’s favor, explained his decision this way:

Air Canada argues that it cannot be held liable for information provided by any of its agents, servants, or representatives, including a chatbot. It does not explain why it believes that is the case, or why the webpage titled “Bereavement Travel” would be inherently more trustworthy than its chatbot.

Additionally, Rivers found that Moffatt had “no reason” to believe that one part of Air Canada’s website was accurate and another was not.

Ultimately, Rivers ruled that Moffatt was entitled to a partial refund of 651 Canadian dollars off the original fare of $1,640 Canadian, plus additional damages to cover interest on the airfare and Moffatt’s tribunal fees.

Air Canada’s support chatbot currently appears to be unavailable, suggesting that the airline has disabled it.

Experts believe Air Canada might have avoided liability in the Moffatt case if its chatbot had warned customers that the information it provided could be inaccurate. Since Air Canada apparently did not take that step, Rivers ruled that “Air Canada did not take reasonable care to ensure its chatbot was accurate.”

“It should be obvious to Air Canada that it is responsible for all the information on its website,” Rivers wrote. “It makes no difference whether the information comes from a static page or a chatbot.”
