Can a Chatbot Make an Error?

This is a case out of British Columbia where an automated response system (a Chatbot) made an error:

The grandmother of a British Columbia resident passed away in Ontario. The bereaved asked Air Canada's Chatbot whether he could apply for bereavement fares retroactively. The Chatbot said yes.

With this information in hand, he booked flights at regular rates and submitted an application for a partial refund at bereavement rates. Air Canada rejected his submission, referring him to the Bereavement Travel policy on its website, which indicated that bereavement fares could not be applied retroactively. The individual argued with the airline for months. Air Canada initially offered a $200 flight voucher and a promise to update the Chatbot. This was not acceptable to the complainant, and he filed a small claims complaint with British Columbia's Civil Resolution Tribunal.

To succeed in a claim of negligent misrepresentation, the plaintiff (the bereaved person) had to prove several things:

  • that the defendant (Air Canada in this case) owed the plaintiff a duty of care as a service provider;
  • that the defendant’s representation via the Chatbot was untrue, inaccurate or misleading;
  • that the defendant made the representation negligently;
  • that the plaintiff reasonably relied on the representation; and
  • that the plaintiff’s reliance resulted in damages.

Air Canada argued that the Chatbot was a separate legal entity. The Tribunal disagreed, indicating that the Chatbot is simply a part of Air Canada's website and that Air Canada “is responsible for all the information on its website”.

The Chatbot did provide a link to the Bereavement Travel policy webpage, but the Tribunal found that customers should not be required to double-check information found on one part of a company’s website against another part of the same website. It also asked why the Bereavement Travel policy webpage should be “inherently more trustworthy than its Chatbot.”

The Tribunal ordered Air Canada to honour the bereavement fare rates for the plaintiff retroactively. Air Canada was also ordered to pay pre-judgment and post-judgment interest and to reimburse the plaintiff’s fees in relation to the Tribunal proceedings.

It appears that Air Canada did not have disclaimers or terms and conditions on its website that might have helped its side of the argument. This is particularly important given that Chatbots may provide incorrect information and, from time to time, have “hallucinations”. In the world of Artificial Intelligence, Large Language Models and Chatbots, a hallucination refers to a Chatbot generating fabricated information in response to a user’s prompt and presenting that information as if it were factual and correct.

In summary:

The unhappy story for Air Canada is that its Chatbot was not programmed to provide the right information to customers.

We can expect more cases like this in the future. Companies can be held liable for output from their automated Chatbots, computer systems and websites. They can also be held liable for providing inaccurate information to their employees.

When asking a Chatbot HR questions, please proceed with caution. Verify that AI-generated information is timely, accurate and applicable to Canada, and specifically to Ontario. It is critically important to understand the overall context and background of the topic being queried. Large language models are helpful to our businesses, but there is nothing like the nuanced understanding of a human!

If I can help at all by clarifying information that you receive from an AI Chatbot, please reach out to me.

Contact Ford Keast LLP’s HR Consulting Group in London to Discuss an AI Policy for Your Company

Contact our expert HR Consultants if you require an AI policy for your business and your team members. If you have any human-resources-related queries, please contact our Senior HR Professional, Jann Danyluk, or reach out by telephone at 519-679-9330 ext. 401; we are here to support you and your business.

Ford Keast Fast Facts is made available for informational purposes only. It is intended to provide a general understanding of HR matters and is not a substitute for advice on specific employment situations. For specific advice, please get in touch with Ford Keast Human Resources directly.
Resources: Cassels Brock & Blackwell LLP; McCarthy Tétrault LLP; Moffatt v. Air Canada, 2024 BCCRT 149; Simple Flying
February 2024

