Blog entry by Les Bell

by Les Bell - Monday, 26 February 2024, 9:01 AM

A humanoid robot with a colour screen on its chest.

A recent news story reveals that artificial intelligence can pose yet another risk that many will not have anticipated.

Many companies have switched to using chatbots to provide online customer support via their web sites - a chatbot will work 24/7 without breaks and, even better, without payment or complaints about working conditions. And of course, these chatbots have inevitably progressed from simple keyword-recognising rules engines to much more sophisticated conversationalists based on large language models. Ideally, these models would hardly need training - point them at the company's web site, feed them a product catalogue, pricing information and the related terms and conditions, and turn them loose to solve all your customer support problems.

However, a recent decision by the Civil Resolution Tribunal of British Columbia (Moffatt v. Air Canada) suggests that it's not quite that simple. Back in 2022, Jake Moffatt suffered a family bereavement and needed to travel to Toronto to attend his grandmother's funeral. Some airlines offer concessions, such as reduced fares, for passengers travelling due to the death of an immediate family member (you learn something new every day). And so, Mr. Moffatt went to the Air Canada web site to book his travel.

While on the site, he used the customer support chatbot and asked it about bereavement fares. The reply (which Mr. Moffatt wisely captured as a screenshot) says, in part:

Air Canada offers reduced bereavement fares if you need to travel because of an imminent death or a death in your immediate family.

If you need to travel immediately or have already travelled and would like to submit your ticket for a reduced bereavement rate, kindly do so within 90 days of the date your ticket was issued by completing our Ticket Refund Application form. (emphasis in original)

Now, the words "bereavement fares" were a highlighted and underlined link to Air Canada's specific web page dealing with bereavement travel - which says, in part, that the bereavement policy does not apply to requests for bereavement consideration after travel has been completed. Hence, the web page says one thing, but the chatbot says quite another.

In a previous telephone conversation with an Air Canada agent, Mr. Moffatt had been told that the bereavement fare would be approximately $C380. However, presumably relying on the information provided by the chatbot (though this is not explicitly stated), he went ahead and booked the Vancouver-Toronto flight, and a few days later booked the Toronto-Vancouver flight, for fares of $C794.98 and $C845.38, respectively.

He then set about applying for a partial refund of the fares, to bring them into line with the bereavement fare. Despite email exchanges between December 2022 and February 2023, Air Canada's position remained that although the chatbot had used "misleading words", it had provided the link to the bereavement travel web page; the airline said only that it would update the chatbot.

Before the Tribunal, Air Canada argued that it cannot be held liable for information provided by one of its agents, servants or representatives - in effect, that the chatbot is a separate legal entity responsible for its own actions - a position the Tribunal found to be "remarkable". Holding that the chatbot is still just a part of the Air Canada web site, and that the company is responsible for all the information on that site, the Tribunal upheld Mr. Moffatt's claim of negligent misrepresentation and awarded damages.

A few key takeaways:

  • It is reasonable for a customer (or other user) to rely upon the information provided by a chatbot.
  • Although the chatbot may provide a link to a page containing additional information, there is no reason why a customer should know that one part of the site is accurate and another is not.
  • A chatbot effectively acts as an agent of the company which operates it. It really is not sensible to argue that it has independent agency in its own right.

Cecco, Leyland, Air Canada ordered to pay customer who was misled by airline’s chatbot, The Guardian, 17 February 2024. Available online at https://www.theguardian.com/world/2024/feb/16/air-canada-chatbot-lawsuit.

Moffatt v. Air Canada, 2024 BCCRT 149, CanLII, 14 February 2024. Available online at https://canlii.ca/t/k2spq.


About this Blog

I produce this blog while updating the course notes for our various courses. Links within a story mostly lead to further details in those course notes, and will only be accessible if you are enrolled in the corresponding course. This is a shallow ploy to encourage ongoing study by our students. However, each item ends with a link to the original source.

These blog posts are collected at https://www.lesbell.com.au/blog/index.php?user=3. If you would prefer an RSS feed for your reader, the feed can be found at https://www.lesbell.com.au/rss/file.php/1/dd977d83ae51998b0b79799c822ac0a1/blog/user/3/rss.xml.

Copyright to linked articles is held by their individual authors or publishers. Our commentary is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License and is labeled TLP:CLEAR.