From: Jack Enman-Beech <jenmanbeech@gmail.com>
Sent: Tuesday 20 February 2024 15:59
To: Matthew Hoyle
Cc: Kelvin F.K. Low; Steve Hedley; obligations@uwo.ca
Subject: Re: Moffatt v. Air Canada, 2024 BCCRT 149 - negligent misrep by a chatbot

 

Question for the group: given that Air Canada seeks to disclaim any responsibility for the chatbot's representations, could this be fraud? Air Canada seems to imply non-belief in the truth of the chatbot's statements, rather than mere carelessness with respect to them.

In Canada, airlines are federally regulated. There are rules promulgated by the Canadian Transportation Agency that I will not go into, on the assumption that they would exceed the interest of this discussion group. But Air Canada's case seems weak even on the basis of general (Canadian common law) contract. The relevant contracts are probably the website's terms of use (K1, available here) and its Domestic Tariff that applies to carriage sold (K2, available here). In brief, Moffatt may have argued that, on the basis of the misrepresentation made in the course of K1, they were induced to enter into K2 without making a prior application for bereavement fares, leading to the loss of the difference in fares. They might also argue that the fine print in K2 that conflicts with the express representation of the chatbot as to the terms (i.e. rule 110D of K2) is not incorporated (following Tilden Rent-A-Car v Clendenning).

 

Yours truly &c.,

Jack

 

On Tue, Feb 20, 2024 at 12:26PM Matthew Hoyle <MHoyle@oeclaw.co.uk> wrote:

This must be right – I don’t see any difference between a ‘chatbot’ feature on a website which gives incorrect answers, and an ‘FAQ’ or other information page which displays incorrect information due to a software error.

 

Even if the chatbot is a separate legal person (which it is not, as far as I am aware), how can Air Canada possibly not be liable for the representations? By placing it on its website, it gives that ‘person’ apparent authority to answer customer inquiries, just as if there was a ‘live chat’ function through which I spoke to an agent via text.

 

It’s not an answer to Kelvin given such laws are idiosyncratic, but in English law there would be at least three lines of attack on such a term:

  • It is inconsistent with the private rights under Part 4A Consumer Protection from Unfair Trading Regulations 2008, Part 2 of which prohibits ‘unfair omissions’ and Reg. 27J of which gives a right to damages where the misleading omission is not made with due diligence.
  • The term would likely be unfair under s.62 Consumer Rights Act 2015, given the impunity it would give to the trader to tell untruths to consumers.
  • The representation would automatically be a term of the contract under s.50 Consumer Rights Act 2015, and s.57(2) prohibits contracting out of s.50 in all circumstances.

 

Matthew Hoyle

Barrister

One Essex Court

 

This message is confidential and may be privileged. If you believe you have received it in error please delete it immediately and inform the sender immediately.

 

Regulated by the Bar Standards Board.

 

From: Kelvin F.K. Low <kelvin.low@gmail.com>
Sent: Monday, February 19, 2024 11:39 PM
To: Steve Hedley <S.Hedley@ucc.ie>; obligations@uwo.ca
Subject: Re: Moffatt v. Air Canada, 2024 BCCRT 149 - negligent misrep by a chatbot

 

The two key passages (for me at any rate) are paragraphs 27 and 31.

 

"27.   Air Canada argues it cannot be held liable for information provided by one of its agents, servants, or representatives – including a chatbot. It does not explain why it believes that is the case. In effect, Air Canada suggests the chatbot is a separate legal entity that is responsible for its own actions. This is a remarkable submission. While a chatbot has an interactive component, it is still just a part of Air Canada’s website. It should be obvious to Air Canada that it is responsible for all the information on its website. It makes no difference whether the information comes from a static page or a chatbot."

 

This must be correct. You can use whatever technology you wish to make representations (deterministic or stochastic) but expect to be liable if those turn out to be misrepresentations. 

 

"31. To the extent Air Canada argues it is not liable due to certain terms or conditions of its tariff, I note it did not provide a copy of the relevant portion of the tariff. It only included submissions about what the tariff allegedly says. Air Canada is a sophisticated litigant that should know it is not enough in a legal process to assert that a contract says something without actually providing the contract. The CRT also tells all parties to provide all relevant evidence. I find that if Air Canada wanted to raise a contractual defence, it needed to provide the relevant portions of the contract. It did not, so it has not proven a contractual defence."

 

Does anyone know what the alleged conditions are? An entire agreement clause, perhaps? Would it survive scrutiny in Canada against a consumer, which Moffatt is?

 

Cheers, 

 

Kelvin 

 


From: Steve Hedley <S.Hedley@ucc.ie>
Sent: Monday, February 19, 2024 11:52:45 PM
To: obligations@uwo.ca <obligations@uwo.ca>
Subject: Moffatt v. Air Canada, 2024 BCCRT 149 - negligent misrep by a chatbot

 

The CyberProf list is currently going full tilt at this one, unsurprisingly.  The ODG will I imagine be more restrained, as it is a low-level decision and the problem is only rather superficially analyzed.  However, I imagine this will be the first of many similar instances, and so may be worth discussion.

 

Moffatt v. Air Canada, 2024 BCCRT 149

https://www.canlii.org/en/bc/bccrt/doc/2024/2024bccrt149/2024bccrt149.html

 

“2.      In November 2022, following the death of their grandmother, Jake Moffatt booked a flight with Air Canada. While researching flights, Mr. Moffatt used a chatbot on Air Canada’s website. The chatbot suggested Mr. Moffatt could apply for bereavement fares retroactively. Mr. Moffatt later learned from Air Canada employees that Air Canada did not permit retroactive applications.

 

“3.      Mr. Moffatt says Air Canada must provide them with a partial refund of the ticket price, as they relied upon the chatbot’s advice. They claim $880 for what they say is the difference in price between the regular and alleged bereavement fares.”

 

Liability was found, on the basis of a negligent misrepresentation for which Air Canada were responsible.

 

 

 

Steve Hedley

9thlevel.ie

s.hedley@ucc.ie

private-law-theory.org

 

 

 
