Air Canada was condemned for incorrect Chatbot information

Air Canada had to pay compensation to a grieving grandson who claimed the airline's chatbot misled them into purchasing full-price airline tickets.

The Civil Resolution Tribunal (CRT) of British Columbia held the airline responsible for the poor advice given by a chatbot on the company's website, which left a passenger unable to claim a bereavement fare.

In an argument that surprised a small claims judge in British Columbia, the airline tried to distance itself from the bad advice of its own chatbot, claiming the online tool was “a separate legal entity responsible for its own actions.”

“This is a remarkable submission,” wrote Christopher Rivers, a member of the Civil Resolution Tribunal (CRT).

“While a chatbot has an interactive component, it is still just one part of the Air Canada website. It should be obvious to Air Canada that it is responsible for all information on its website. It makes no difference whether the information comes from a static page or a chatbot.”

‘Misleading words’

In a ruling released this week, Rivers ordered Air Canada to pay Jake Moffatt $812 to cover the difference between the airline’s bereavement fare and the $1,630.36 they paid for full-price round-trip tickets to Toronto purchased after their grandmother passed away.

Support chatbot

Moffatt’s grandmother died on Remembrance Day 2022. Moffatt visited the Air Canada website the same day.

Jake Moffatt claimed to have purchased full-price tickets to Toronto and back based on advice from a chatbot that they could retroactively make a bereavement claim. “While using the Air Canada website, they interacted with a support chatbot,” the decision says.

Moffatt provided CRT with a screenshot of the chatbot’s words:

“If you need to travel immediately or have already traveled and would like to submit your ticket for a reduced bereavement fare, please do so within 90 days of the date your ticket was issued by completing our Ticket Refund Application form.”

Based on this assurance, Moffatt claimed to have booked full-price round-trip tickets to Toronto.

But when they contacted Air Canada to get their money back, they were told that bereavement fares do not apply to completed trips – something explained elsewhere on the airline’s website.

Moffatt sent a copy of the screenshot to Air Canada – pointing out the chatbot’s advice to the contrary.

“An Air Canada representative responded and admitted that the chatbot had provided ‘misleading words,’” wrote Rivers.

“The representative highlighted the chatbot link to the Bereavement Travel webpage and said that Air Canada had noted the issue so they could update the chatbot.”

Apparently, Moffatt found little comfort in this – and chose to sue.

Misleading chatbot: no ‘reasonable care’ taken to ensure accuracy

According to the ruling, Air Canada argued that it cannot be held responsible for information provided by one of its “agents, servants or representatives – including a chatbot.”

But Rivers noted that the airline “does not explain why it believes this to be the case.”

“I conclude that Air Canada did not take reasonable care to ensure that its chatbot was accurate,” Rivers wrote.

Air Canada argued that Moffatt could have found the correct information about bereavement fares elsewhere on the airline’s website.

But as Rivers pointed out, this “doesn’t explain why the webpage titled ‘Bereavement Travel’ was inherently more trustworthy than its chatbot.”

“There is no reason why Mr. Moffatt should know that one section of the Air Canada website is accurate and another is not,” wrote Rivers.

A search of the Canadian Legal Information Institute – which maintains a database of Canadian legal decisions – turns up no prior cases involving bad advice from chatbots; Moffatt’s appears to be the first.

Source: Atrevida
