Air Canada ordered to pay customer who was misled by airline’s chatbot

Company claimed its chatbot ‘was responsible for its own actions’ when giving wrong information about bereavement fare

Canada’s largest airline has been ordered to pay compensation after its chatbot gave a customer inaccurate information, misleading him into buying a full-price ticket.

Air Canada came under further criticism for later attempting to distance itself from the error by claiming that the bot was “responsible for its own actions”.

Amid a broader push by companies to automate services, the case – the first of its kind in Canada – raises questions about the level of oversight companies have over the chat tools.

Gork,

A “separate legal entity” that the airline completely controls, and whose programming it has since updated.

Nurse_Robot,

“we’re not responsible for the actions of our employees, even when we literally programmed them”

What a fucking joke, it’s like they’re saying the quiet part out loud

LufyCZ,

Well it’s true, to a certain extent.

If an employee (or a chatbot, for that matter) promised an egregious sum for no reason, I don’t think the company should be liable either.

Imagine getting hired to do support, having a friend open a chat, and promising to give him a million dollars. Makes no sense.

But being misled about ticket pricing, and the company then refusing to refund at least the part of the fare they promised would not be charged, is absolutely something they should be liable for.

And lawyer fees plus some pocket money for wasting people’s time, if getting a refund entails more than an email or two.

hitmyspot,

Yes, that it was reasonable to believe was the point. What’s also interesting is that the bot referred the customer to the correct information, which was part of the defence. However, the ruling said that since both were provided by the company, the customer had no reason to believe one part of the website was more accurate than another.

Deestan,

It sounds like they are leaning on the chatbot being recognized as a sentient being. That’s a pretty long shot.

SlopppyEngineer,

In that case the chatbot is an employee and must receive a wage, or this is considered slavery.

SonicBlue03,

This chatbot no longer works for the airline according to its LinkedIn profile.
