From: Matthew Hoyle <MHoyle@oeclaw.co.uk>
Sent: Tuesday 20 February 2024 12:26
To: 'Kelvin F.K. Low'; Steve Hedley; obligations@uwo.ca
Subject: RE: Moffatt v. Air Canada, 2024 BCCRT 149 - negligent misrep by a chatbot

 

This must be right – I don’t see any difference between a ‘chatbot’ feature on a website which gives incorrect answers, and an ‘FAQ’ or other information page which displays incorrect information due to a software error.

 

Even if the chatbot is a separate legal person (which it is not, as far as I am aware), how can Air Canada possibly not be liable for the representations? By placing it on its website, it gives that ‘person’ apparent authority to answer customer inquiries, just as if there were a ‘live chat’ function through which I spoke to an agent via text.

 

It’s not an answer to Kelvin, given such laws are idiosyncratic, but in English law there would be at least two lines of attack on such a term:

 

Matthew Hoyle

Barrister

One Essex Court

 

This message is confidential and may be privileged. If you believe you have received it in error, please delete it and inform the sender immediately.

 

Regulated by the Bar Standards Board.

 

From: Kelvin F.K. Low <kelvin.low@gmail.com>
Sent: Monday, February 19, 2024 11:39 PM
To: Steve Hedley <S.Hedley@ucc.ie>; obligations@uwo.ca
Subject: Re: Moffatt v. Air Canada, 2024 BCCRT 149 - negligent misrep by a chatbot

 

The two key passages (for me at any rate) are paragraphs 27 and 31.

 

"27.   Air Canada argues it cannot be held liable for information provided by one of its agents, servants, or representatives – including a chatbot. It does not explain why it believes that is the case. In effect, Air Canada suggests the chatbot is a separate legal entity that is responsible for its own actions. This is a remarkable submission. While a chatbot has an interactive component, it is still just a part of Air Canada’s website. It should be obvious to Air Canada that it is responsible for all the information on its website. It makes no difference whether the information comes from a static page or a chatbot."

 

This must be correct. You can use whatever technology you wish to make representations (deterministic or stochastic), but expect to be liable if they turn out to be misrepresentations.

 

"31. To the extent Air Canada argues it is not liable due to certain terms or conditions of its tariff, I note it did not provide a copy of the relevant portion of the tariff. It only included submissions about what the tariff allegedly says. Air Canada is a sophisticated litigant that should know it is not enough in a legal process to assert that a contract says something without actually providing the contract. The CRT also tells all parties are told to provide all relevant evidence. I find that if Air Canada wanted to a raise a contractual defense, it needed to provide the relevant portions of the contract. It did not, so it has not proven a contractual defence."

 

Does anyone know what the alleged conditions are? An entire agreement clause perhaps? Would it survive scrutiny in Canada against a consumer, which Moffatt is?

 

Cheers, 

 

Kelvin 

 


From: Steve Hedley <S.Hedley@ucc.ie>
Sent: Monday, February 19, 2024 11:52:45 PM
To: obligations@uwo.ca <obligations@uwo.ca>
Subject: Moffatt v. Air Canada, 2024 BCCRT 149 - negligent misrep by a chatbot

 

The CyberProf list is currently going full tilt at this one, unsurprisingly. The ODG will, I imagine, be more restrained, as it is a low-level decision and the problem is only rather superficially analyzed. However, I imagine this will be the first of many similar instances, and so may be worth discussion.

 

Moffatt v. Air Canada, 2024 BCCRT 149

https://www.canlii.org/en/bc/bccrt/doc/2024/2024bccrt149/2024bccrt149.html

 

“2.      In November 2022, following the death of their grandmother, Jake Moffatt booked a flight with Air Canada. While researching flights, Mr. Moffatt used a chatbot on Air Canada’s website. The chatbot suggested Mr. Moffatt could apply for bereavement fares retroactively. Mr. Moffatt later learned from Air Canada employees that Air Canada did not permit retroactive applications.

 

“3.      Mr. Moffatt says Air Canada must provide them with a partial refund of the ticket price, as they relied upon the chatbot’s advice. They claim $880 for what they say is the difference in price between the regular and alleged bereavement fares.”

 

Liability was found, on the basis of a negligent misrepresentation for which Air Canada were responsible.

 

 

 

Steve Hedley

9thlevel.ie

s.hedley@ucc.ie

private-law-theory.org

 

 

 

Disclaimer

The information contained in this communication from the sender is confidential. It is intended solely for use by the recipient and others authorized to receive it. If you are not the recipient, you are hereby notified that any disclosure, copying, distribution or taking action in relation to the contents of this information is strictly prohibited and may be unlawful.
