Robotic age poses ethical dilemma

Code Project · The Back Room · tagged: question, announcement · 70 posts · 25 posters
  • In reply to led mike:

    Is your point that smart bombs are robots and therefore have rights? I disagree, that's why we need Jeffersonian principles running the US. The founders never intended a bunch of morally deficient liberals to elevate robots into citizenry. Jefferson never considered robots human equals... they were just for doinking.

    led mike

    #34 · Dan Neely:

    Was this intended to be a reply to me? I got it in email but don't know where it broke. If so, I think the whole idea absurd and was mocking it.

    -- Rules of thumb should not be taken for the whole hand.

    • In reply to Lost User (the original post):

      An ethical code (Robot Ethics Charter) to prevent humans abusing robots, and vice versa, is being drawn up by South Korea. [^] What would CP members like to see in this Charter? What legal rights should robots have?

      #35 · oilFactotum:

      Isaac Asimov's 3 Laws of Robotics http://www.auburn.edu/~vestmon/robotics.html[^]
      1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
      2. A robot must obey orders given it by human beings except where such orders would conflict with the First Law.
      3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
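
      As an aside (not part of the thread), the precedence built into the three laws can be sketched as an ordered decision check. The "action" dict and its fields below are hypothetical placeholders; they exist only to illustrate the priority order, roughly in Python:

          # Toy sketch: Asimov's three laws as checks applied in strict priority order.
          # The "action" dict and its fields are hypothetical placeholders.
          def evaluate(action):
              # First Law always wins: harm to a human vetoes everything else.
              if action["harms_human"]:
                  return "forbidden (First Law)"
              # Second Law: a human order stands unless the First Law vetoed it above.
              if action["ordered_by_human"]:
                  return "required (Second Law)"
              # Third Law: self-preservation counts only once Laws 1 and 2 are silent.
              if action["endangers_self"]:
                  return "forbidden (Third Law)"
              return "permitted"

          # A human order that risks the robot: the Second Law outranks the Third.
          print(evaluate({"harms_human": False, "ordered_by_human": True, "endangers_self": True}))
          # -> required (Second Law)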

      • In reply to Ilion:

        Colin Angus Mackay wrote:

        For example, he has likes and dislikes. He dislikes being called a "robot". He likes being called an "android"

        Correction: For example, he has "likes" and "dislikes." He "dislikes" being called a 'robot.' He "likes" being called an 'android.' "Data" is a human being pretending to be a machine-that-can-think. There will never actually be a machine-that-can-think, because 'computation' is not 'thinking.'

        #36 · Rhys Gravell:

        Ilíon wrote:

        There will never actually be a machine-that-can-think, because 'computation' is not 'thinking.'

        Please then explain, scientifically correctly of course since we're being a pedant like yourself, what exactly is the process of 'thinking'?

        Rhys A cult is a religion with no political power. Tom Wolfe Behind every argument is someone's ignorance. Louis D. Brandeis

        • In reply to oilFactotum (#35 above):

          #37 · Tim Craig:

          oilFactotum wrote:

          2. A robot must obey orders given it by human beings except where such orders would conflict with the First Law.

          This law would make robots subservient to human beings always regardless of the robot's capabilities. At what point do you decide something is intelligent enough to have independent rights? Is intelligence enough? What about being self aware? Of course, we don't really have a good definition for intelligence applied to nonhumans, in my opinion. And while there are tests that show self awareness, I don't think that failing them proves that the entity isn't self aware. It just doesn't display something we recognize as obvious self-awareness.

          The evolution of the human genome is too important to be left to chance idiots like CSS.

          • In reply to Tim Craig (#37 above):

            #38 · oilFactotum:

            Tim Craig wrote:

            This law would make robots subservient to human beings always regardless of the robot's capabilities.

            Yes, it would. Quite appropriately, I think.

            Tim Craig wrote:

            At what point do you decide something is intelligent enough to have independent rights?

            A manufactured entity? Probably never. How would you test for self awareness? Especially a computer that was programmed to mimic self awareness.

            • In reply to Dan Neely (#34 above):

              #39 · led mike:

              dan neely wrote:

              Was this intended to be a reply to me?

              Yes.

              dan neely wrote:

              I got it in email but don't know where it broke.

              It's a known CP bug.

              dan neely wrote:

              I think the whole idea absurd and was mocking it.

              I know... I was just continuing the mocking... and then some :-D

              led mike

              • In reply to oilFactotum (#35 above):

                #40 · led mike:

                That's where we went wrong [slaps forehead], let's get rid of all the lawyers and have Science Fiction Authors craft our Constitution. :-D

                led mike

                • In reply to Ilion (quoted above):

                  #41 · Christian Graus:

                  Perhaps the uncomfortable question is, if it's indistinguishable from 'thinking', then how do we know that there IS a difference ?

                  Christian Graus - Microsoft MVP - C++ Metal Musings - Rex and my new metal blog "I am working on a project that will convert a FORTRAN code to corresponding C++ code.I am not aware of FORTRAN syntax" ( spotted in the C++/CLI forum )

                  • In reply to Rhys Gravell (#36 above):

                    #42 · Jorgen Sigvardsson:

                    Rhys666 wrote:

                    Please then explain, scientifically correctly of course since we're being a pedant like yourself, what exactly is the process of 'thinking'?

                    Being an asshole on internet forums? :~

                    • In reply to Christian Graus (#41 above):

                      #43 · Ilion:

                      Christian Graus wrote:

                      Perhaps the uncomfortable question is, if it's indistinguishable from 'thinking', then how do we know that there IS a difference ?

                      Yur'oe tinhknig of 'cpmotunig' in the retsirtecd and dveriaitve secne of the aviittcy wcihh is dnoe on or wtih mreodn eecotirnlc cpumteors. As I alsmot aylwas do, I was unisg 'cpumiotng' in the mroe bsiac sncee of cniountg and the relus by wchih ctnoiung ootpreains are premfored. But tehn, wehn you get dwon to it, taht is all a merdon ecoteirnlc cpmteour deos, it jsut does teshe tihgns ftsater tahn erailer mcaheniacl ctpumeors did. And, of csoure, tshee erailer mcaeniachl ctpueomrs to wchih I rfeer did nnohtig at all, it was the hmaun mnid dinog evyerhintg whcih was dnoe. And, by the smae tekon, meordn ecotirnlec cpumteors do not raelly do aytinhng; it is aaign a hmaun mnid dniog evyerhintg wichh is dnoe. Deos an etlercic psuh-btuton coalctular 'tihnk?' Deos a manccaihel psuh-btuton coalctular 'tihnk?' Does a sidle-rlue 'tnihk?' Deos a picnel and ppear 'tnhik?' Deos a mahncecail acabus 'tihnk?' If the Cenishe had invented a peowred acubas, wihch ddni't ruriqee a hmaun to mvoe the bttouns, wulod you say taht it 'tihnks?' Of crouse not! So, why do you wnat to inagime taht a mreodn eecotirnlc cptumeor can 'tinhk' or taht smoe hopathietcyl furute cptumeor wlil 'tnihk?' 'Ctpmutaioon' is not 'tkinhing.' Is taht ralely so dciiulfft to garsp?

                      • In reply to Lost User (Trollslayer):

                        It was "worker" not "slave" to be picky.

                        The tigress is here :-D

                        #44 · Ilion:

                        Trollslayer wrote:

                        It was "worker" not "slave" to be picky.

                        Damn! How pedantic can a person get? :cool: Even I passed that one by ;)

                        • In reply to oilFactotum (#38 above):

                          #45 · Tim Craig:

                          oilFactotum wrote:

                          A manufactured entity?

                          So "manufactured" is the critical quality? What about "artificial" lifeforms that are more akin the "traditional" wet chemical processes we're used to?

                          oilFactotum wrote:

                          How would you test for self awareness?

                          The only test I currently know of is the famous mirror test. So far the only animals to pass are the great apes and much of that is probably due to their close functional relationship with us, so that we can reasonably interpret the nonverbals given off when the light goes on. As I said, I don't think failing that test proves an animal, or anything else, is not self aware. All it says is that the animal has the analytical faculties to recognize the image in the mirror as itself. A creature capable of communicating could convey whether it is aware of itself.

                          oilFactotum wrote:

                          Especially a computer that was programmed to mimic self awareness.

                          When does "mimicry" become as good as the original? Humans wanted to fly and originally tried it by mimicing birds. When we actually flew, it was by another method. It's still flight. And in some aspects, it's surpasses what birds can do. If you're talking about the crude ELISA attempts at mimicing intelligence, those aren't really what I'm trying to discuss.

                          The evolution of the human genome is too important to be left to chance idiots like CSS.

                          • In reply to Ilion (#43 above):

                            #46 · Chris Kaiser:

                            A computer may not yet be as good as the human mind at pattern matching, but it's ridiculous to say that thinking is not computing. All you displayed here is pattern matching, and with the advent of quantum computers, I'd say wait and see.

                            This statement was never false.
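
                            As an aside (not from the thread), the scrambled post above keeps each word's first and last letters and shuffles the letters in between; it stays readable because the reader pattern-matches on word shape and context. A minimal, hypothetical Python sketch of that transformation:

                              # Keep a word's first and last letters, shuffle the interior.
                              import random
                              import re

                              def scramble_word(word):
                                  if len(word) <= 3:
                                      return word
                                  inner = list(word[1:-1])
                                  random.shuffle(inner)
                                  return word[0] + "".join(inner) + word[-1]

                              def scramble_text(text):
                                  # Scramble only alphabetic runs; punctuation and spacing stay put.
                                  return re.sub(r"[A-Za-z]+", lambda m: scramble_word(m.group(0)), text)

                              print(scramble_text("Computation is not thinking."))
                              # e.g. "Cmuotptaion is not tnihikng."  (output varies per run)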

                            • In reply to Ilion (#44 above):

                              #47 · Chris Kaiser:

                              Rhetorical questions such as this require the mirror to answer. Help yourself.

                              This statement was never false.

                              • In reply to Lost User (the original post, quoted above):

                                #48 · Lost User:

                                That's outrageous. Until we fully understand the human mind and consciousness we will not be able to create intelligent robots that deserve the same rights as a human. We can create a computer program that can seem intelligent and can seem to think and control a mechanical body of some kind, but it is just software running on a computer. If I develop my own robot then I should be able to beat it with a hammer when it makes me mad. This also brings up an important point. What if the robot's software is running on my desktop computer and it's controlling a virtual 3D robot body on my screen? I can tell you I wouldn't be going to jail if I deleted it or made it beat up another virtual robot's body. It's just dumb. It is extremely unlikely software and hardware will ever feel and experience the world no matter how smart it may seem.

                                █▒▒▒▒▒██▒█▒██ █▒█████▒▒▒▒▒█ █▒██████▒█▒██ █▒█████▒▒▒▒▒█ █▒▒▒▒▒██▒█▒██

                                • In reply to Tim Craig (#45 above):

                                  #49 · oilFactotum:

                                  Tim Craig wrote:

                                  So "manufactured" is the critical quality?

                                  Yeah, I would say that is a critical quality. Perhaps if we get our technology to the point of building Star Trek-like Datas or Cylons, that would need reconsideration. Consider the ramifications of immortal artificial life forms more intelligent than any human could ever hope to be: how would they not end up ruling humans?

                                  Tim Craig wrote:

                                  What about "artificial" lifeforms that are more akin the "traditional" wet chemical processes we're used to?

                                  I don't know. Mechanical intelligence seems to be an easier task than creating an intelligent "artificial wet chemical" (biological?) lifeform. That would be a completely different issue in my mind. And that could even be more dangerous, especially if it were self-replicating.

                                  Tim Craig wrote:

                                  So far the only animals to pass are the great apes

                                  There was an elephant in the news recently that seemed to recognize itself in a mirror.

                                  Tim Craig wrote:

                                  Humans wanted to fly and originally tried it by mimicking birds

                                  Humans learned to fly, but we are still not birds. Mimicking self awareness or intelligence doesn't make it intelligent or self aware.

                                  Tim Craig wrote:

                                  If you're talking about the crude ELIZA attempts at mimicking intelligence, those aren't really what I'm trying to discuss.

                                  No, I assume you're talking about technologies that are decades away at the very least.

                                  • In reply to Lost User:

                                    John, at the moment your reply is correct. But what if these robots are permitted to eventually develop "free-will"?

                                    #50 · Anna Jayne Metcalfe:

                                    If a robot (or computer) becomes sentient then everything changes.

                                    Anna :rose: Linting the day away :cool: Anna's Place | Tears and Laughter "If mushy peas are the food of the devil, the stotty cake is the frisbee of God"

                                    • In reply to Chris Kaiser (#46 above):

                                      #51 · originSH:

                                      Pattern matching is coming on leaps and bounds as well. The algorithms are being refined, massive repositories of information are being built up, and calculation speed is always rising. To say that machines will never reach the level of complexity and ability of a human is to ignore the pattern of growth your brain should be perceiving :P The debatable part is whether or not a man-made brain will ever be able to truly "feel emotion" and have "free will", but then again it's debatable whether we ourselves truly have that, or whether we are just responding to our surroundings the way we are programmed to, like any other animal, just in a more complex way.

                                      • In reply to Lost User (quoted above):

                                        #52 · Rich Leyshon:

                                        Ask them to suggest their own charter. Rich:laugh:

                                        • In reply to Red Stateler:

                                          Richard A. Abbott wrote:

                                          Would it or should it have the right to defend itself

                                          Punching bags are meant to be punched (not punch back) for training. What happens if somebody makes a robotic butt-wiper? Would that be considered degrading?

                                          #53 · hairy_hats:

                                          Red Stateler wrote:

                                          What happens if somebody makes a robotic butt-wiper?

                                          What sort of robotic butt would require wiping? You envisage oil leaks or something?
