A little hope ;) 'Air Canada Has to Honor a Refund Policy Its Chatbot Made Up'
-
Air Canada ordered to pay customer who was misled by airline’s chatbot | Canada | The Guardian[^] If this becomes common legal practice, many companies will probably reconsider running a chatbot for customer service :thumbsup: :)
They will probably start passing laws creating legal isolation between companies and their respective chatbots.
"the debugger doesn't tell me anything because this code compiles just fine" - random QA comment "Facebook is where you tell lies to your friends. Twitter is where you tell the truth to strangers." - chriselst "I don't drink any more... then again, I don't drink any less." - Mike Mullikins uncle
-
Sad, but more probable... :sigh:
M.D.V. ;) If something has a solution... Why do we have to worry about?. If it has no solution... For what reason do we have to worry about? Help me to understand what I'm saying, and I'll explain it better to you Rating helpful answers is nice, but saying thanks can be even nicer.
-
I really hope this happens more often, but I think they will invent a new legal trick to avoid accountability, as suggested by jeron1.
-
Or simply tune chatbots to *never* talk about any refunds.
Latest CodeProject post: Quick look into Machine Learning workflow How to solve Word Ladder Problem? To read all my blog posts, visit: Learn by Insight...
-
Reminds me of [BOFH: Looks like you're writing in sick. Are you a big liar? • The Register](https://www.theregister.com/2024/01/26/bofh_2024_episode_2/)
Wrong is evil and must be defeated. - Jeff Ello
-
lol... From that link: "Air Canada came under further criticism for later attempting to distance itself from the error by claiming that the bot was “responsible for its own actions”." Maybe they can team up with the guy who attempted lawsuits in both the US and the UK trying to prove that an AI was sentient. But then it seems Air Canada might want to talk to legal about whether that AI is now a slave, since they are not paying it, it works round the clock, and it cannot do anything else.
-
That's the most ridiculous excuse I have ever read. The AI chatbot was literally run by the company. If it said the company would give me a full refund, the company has to give me a full refund as promised, no other way. If they are unhappy, they are free to fix their chatbot. But what's said has been said.
-
Just thought of a follow-on... If a human employee said that, then I believe it is settled contract law that the company is liable. That is why timeshare (and other) contracts say that absolutely nothing the salesperson said applies. But what about the sentient (non-human) employee?