A little hope ;) 'Air Canada Has to Honor a Refund Policy Its Chatbot Made Up'

0x01AA wrote (#1):

Air Canada ordered to pay customer who was misled by airline’s chatbot | Canada | The Guardian[^] If this becomes common legal practice, many companies will probably reconsider running a chatbot as customer service :thumbsup: :)

jeron1 wrote (#2), replying to 0x01AA:

      They will probably start passing laws creating legal isolation between companies and their respective chatbots.

      "the debugger doesn't tell me anything because this code compiles just fine" - random QA comment "Facebook is where you tell lies to your friends. Twitter is where you tell the truth to strangers." - chriselst "I don't drink any more... then again, I don't drink any less." - Mike Mullikins uncle

Nelek wrote (#3), replying to jeron1:

        Sad, but more probable... :sigh:

        M.D.V. ;) If something has a solution... Why do we have to worry about?. If it has no solution... For what reason do we have to worry about? Help me to understand what I'm saying, and I'll explain it better to you Rating helpful answers is nice, but saying thanks can be even nicer.

Nelek wrote (#4), replying to 0x01AA:

I really hope this happens more often, but I think they will invent a new legal trick to avoid accountability, as suggested by jeron1.

          M.D.V. ;) If something has a solution... Why do we have to worry about?. If it has no solution... For what reason do we have to worry about? Help me to understand what I'm saying, and I'll explain it better to you Rating helpful answers is nice, but saying thanks can be even nicer.

Sandeep Mewara wrote (#5), replying to Nelek:

            Or simply tune chatbots to *never* talk about any refunds.

            Latest CodeProject post: Quick look into Machine Learning workflow How to solve Word Ladder Problem? To read all my blog posts, visit: Learn by Insight...
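
A minimal, hypothetical sketch of what "tune the chatbot to *never* talk about refunds" could mean in practice: a post-processing guard that swaps any reply touching refund topics for a canned pointer to the official policy. Every name here (REFUND_TERMS, OFFICIAL_POLICY_NOTE, guard_reply) is invented for illustration; a real deployment would rely on whatever guardrail or moderation hooks the chatbot platform actually provides.

```python
# Hypothetical illustration of the "never talk about refunds" idea:
# a guard that replaces any bot reply touching on refund policy
# with a canned pointer to the official policy and a human agent.
import re

# Crude keyword/phrase list for refund-related topics (illustrative only).
REFUND_TERMS = re.compile(
    r"\b(refund|reimburse\w*|bereavement\s+fare|money\s+back)\b",
    re.IGNORECASE,
)

OFFICIAL_POLICY_NOTE = (
    "For anything involving refunds, please see the official policy page "
    "or contact a human agent; I can't make commitments about refunds."
)

def guard_reply(bot_reply: str) -> str:
    """Return the bot's reply unless it strays into refund territory."""
    if REFUND_TERMS.search(bot_reply):
        return OFFICIAL_POLICY_NOTE
    return bot_reply

if __name__ == "__main__":
    # The risky reply gets replaced; the harmless one passes through.
    print(guard_reply("You can apply for a bereavement fare refund within 90 days."))
    print(guard_reply("Your flight departs from gate B23 at 14:05."))
```

Crude keyword matching like this obviously trades helpfulness for safety; the point is only that the refusal is enforced outside the model, in code the company controls, rather than hoping the model never improvises a policy.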

Jorgen Andersson wrote (#6), replying to 0x01AA:

Reminds me of [BOFH: Looks like you're writing in sick. Are you a big liar? • The Register](https://www.theregister.com/2024/01/26/bofh_2024_episode_2/)

              Wrong is evil and must be defeated. - Jeff Ello

obermd wrote (#7), replying to 0x01AA:

                Air Canada actually told the judge that they can't be held to what their customer service reps say.

jschell wrote (#8), replying to 0x01AA:

lol... From that link: "Air Canada came under further criticism for later attempting to distance itself from the error by claiming that the bot was “responsible for its own actions”." Maybe they can team up with the guy who attempted lawsuits in both the US and the UK trying to prove that an AI was sentient. But then it seems Air Canada might want to talk to legal about whether that AI is now a slave, since they are not paying it, it works round the clock, and it cannot do anything else.

Daniel Will wrote (#9), replying to jschell:

That's the most ridiculous excuse I have ever read. The AI chatbot was literally run by the company. If it said the company is giving me a full refund, the company has to give me a full refund as promised, no other way. If they are unhappy, they are free to fix their chatbot. But what's said has been said.

Cpichols wrote (#10), replying to 0x01AA:

                      :dance:

jschell wrote (#11), replying to Daniel Will:

Just thought of a follow-on... If an employee (human) said that, then I believe it is settled contract law that the company is liable. That is why timeshare (and other) contracts say that absolutely nothing the salesperson said applies. But what about the sentient (non-human) employee?
