
Robotic age poses ethical dilemma

  • Lost User wrote:

    Didn't expect a reply from yourself at this time - must be around 4am in Australia, or are you elsewhere? Although Data of Star Trek: The Next Generation is a robot, Data has free will of sorts. The stuff you see in Star Trek has a habit of happening. For instance, the hand-held communications device used by Captain Kirk etc. is here; you've probably got one in your back pocket. So a new or updated definition will no doubt be required.

    Colin Angus Mackay wrote (#21):

    Richard A. Abbott wrote:

    Although Data of Star Trek: The Next Generation is a robot, Data has free will of sorts

    For example, he has likes and dislikes. He dislikes being called a "robot". He likes being called an "android"


    Upcoming events: * Edinburgh: Web Security Conference Day for Windows Developers (12th April) * Glasgow: AJAX, SQL Server, Mock Objects My: Website | Blog | Photos

    • Lost User wrote:

      An ethical code (Robot Ethics Charter) to prevent humans abusing robots, and vice versa, is being drawn up by South Korea. [^] What would CP members like to see in this Charter? What legal rights should robots have?

      Shog9 wrote (#22):

      Speak roughly to your robot toy, and beat it when it crashes; it only does it to annoy, and to subtly bring about an attitude of complacency and bemusement, such that we'll be caught off-guard when The Robot Revolution begins...

      ----

      It appears that everybody is under the impression that I approve of the documentation. You probably also blame Ken Burns for supporting slavery.

      --Raymond Chen on MSDN

      • 73Zeppelin wrote:

        You reply in jest, but what, for example, would be your opinion if we were to discover life in another form on another planet? Do we allow them a code of ethics, or do we make them (as you suggest) "robotic butt-wipers"?


        Come and see the violence inherent in the system! Help! Help! I'm being repressed!

        Red Stateler wrote (#23):

        The Apocalyptic Teacup wrote:

        Do we allow them a code of ethics, or do we make them (as you suggest) "robotic butt-wipers"?

        What if, in their culture, wiping butts is actually a great honor? Does this also mean that we need to treat animals as equals? More importantly, what do we make of the mechanical cotton gin? Isn't it degrading to make the gin do work formerly done by slaves without pay? What form of payment would a cotton gin accept?

        • Red Stateler wrote:

          Richard A. Abbott wrote:

          Would it or should it have the right to defend itself

          Punching bags are meant to be punched (not punch back) for training. What happens if somebody makes a robotic butt-wiper? Would that be considered degrading?

          Bassam Abdul Baki wrote (#24):

          Red Stateler wrote:

          What happens if somebody makes a robotic butt-wiper?

          I never knew what the three shells were about in Demolition Man. The suspense is killing me. :)


          "There are II kinds of people in the world, those who understand binary and those who understand Roman numerals." - Bassam Abdul-Baki Web - Blog - RSS - Math - LinkedIn - BM

          • Lost User wrote:

            An ethical code (Robot Ethics Charter) to prevent humans abusing robots, and vice versa, is being drawn up by South Korea. [^] What would CP members like to see in this Charter? What legal rights should robots have?

            Mark Salsbery wrote (#25):

            Ummm...yeah... :rolleyes: When is this "robotic age"? "It is being put together by a five member team of experts that includes futurists and a science fiction writer." "A recent government report forecast that robots would routinely carry out surgery by 2018." Once again, after 10 years, "Intellisense" still doesn't work consistently. I'd rather perform the surgery on myself, thanks. I have the right to bitch-slap my Roomba any time I want!

            "Great job, team. Head back to base for debriefing and cocktails." (Spottswoode "Team America")

            • Colin Angus Mackay wrote:

              Richard A. Abbott wrote:

              Although Data of Star Trek: The Next Generation is a robot, Data has free will of sorts

              For example, he has likes and dislikes. He dislikes being called a "robot". He likes being called an "android"


              Upcoming events: * Edinburgh: Web Security Conference Day for Windows Developers (12th April) * Glasgow: AJAX, SQL Server, Mock Objects My: Website | Blog | Photos

              Ilion wrote (#26):

              Colin Angus Mackay wrote:

              For example, he has likes and dislikes. He dislikes being called a "robot". He likes being called an "android"

              Correction: For example, he has "likes" and "dislikes." He "dislikes" being called a 'robot.' He "likes" being called an 'android.' "Data" is a human being pretending to be a machine-that-can-think. There will never actually be a machine-that-can-think, because 'computation' is not 'thinking.'

              • Red Stateler wrote:

                Richard A. Abbott wrote:

                Would it or should it have the right to defend itself

                Punching bags are meant to be punched (not punch back) for training. What happens if somebody makes a robotic butt-wiper? Would that be considered degrading?

                Ilion wrote (#27):

                Red Stateler wrote:

                What happens if somebody makes a robotic butt-wiper?

                Don't the Japanese already manufacture toilets that do essentially that? Perhaps that perception of degradation is what's behind this new push for "machine rights."

                • Dan Neely wrote:

                  I wonder how many of the chattering masses will start howling if Kim Ding Dong Illness sends his hordes streaming south and the air forces start dropping smart bombs all over them.

                  -- Rules of thumb should not be taken for the whole hand.

                  Ilion wrote (#28):

                  dan neely wrote:

                  ... and the air forces start dropping smart bombs all over them.

                  They'd finally have found an instance of "suicide bombing" they could object to.

                  • led mike wrote:

                    Is your point that smart bombs are robots and therefore have rights? I disagree, that's why we need Jeffersonian principles running the US. The founders never intended a bunch of morally deficient liberals to elevate robots into citizenry. Jefferson never considered robots human equals... they were just for doinking.

                    led mike

                    David Wulff wrote (#29):

                    led mike wrote:

                    they were just for doinking.

                    I completely misread that. :-O


                    Ðavid Wulff What kind of music should programmers listen to?
                    Join the Code Project Last.fm group | dwulff
                    I'm so gangsta I eat cereal without the milk

                    • Nish Nishant wrote:

                      Wjousts wrote:

                      I'd believe a robot is intelligent when it realizes that and demands we stop calling it "robot". I'd propose "electro-mechanical American"!

                      Going by the current technology spread across countries, most robots would be Japanese. You may have some of them immigrating to the US - so you could have Electro-Mechanical Japanese-Americans I guess :-)

                      Regards, Nish


                      Nish’s thoughts on MFC, C++/CLI and .NET (my blog)
                      Currently working on C++/CLI in Action for Manning Publications. (*Sample chapter available online*)

                      starcraft4ever wrote (#30):

                      If there are immigrant robots, will they need an H1B visa to work in North America? What about illegal immigrant robots? Will they be deported to Japan? Will they work for minimum wage? I bet when robots work at McDonald's, the hamburgers won't taste the same. I don't think the INS will be happy about it. :laugh:

                      • Lost User wrote:

                        An ethical code (Robot Ethics Charter) to prevent humans abusing robots, and vice versa, is being drawn up by South Korea. [^] What would CP members like to see in this Charter? What legal rights should robots have?

                        Lost User wrote (#31):

                        Protection against VB?

                        The tigress is here :-D

                        • Wjousts wrote:

                          I seem to remember that the word robot is from the Czech for slave. I'd believe a robot is intelligent when it realizes that and demands we stop calling it "robot". I'd propose "electro-mechanical American"! ;)

                          Lost User wrote (#32):

                          It was "worker", not "slave", to be picky.

                          The tigress is here :-D

                          • Lost User wrote:

                            Protection against VB?

                            The tigress is here :-D

                            Ilion wrote (#33):

                            Trollslayer wrote:

                            Protection against VB?

                            Only if you keep (and always use) the shipping crate.

                            • led mike wrote:

                              Is your point that smart bombs are robots and therefore have rights? I disagree, that's why we need Jeffersonian principles running the US. The founders never intended a bunch of morally deficient liberals to elevate robots into citizenry. Jefferson never considered robots human equals... they were just for doinking.

                              led mike

                              Dan Neely wrote (#34):

                              Was this intended to be a reply to me? I got it in email but don't know where it broke. If so, I think the whole idea absurd and was mocking it.

                              -- Rules of thumb should not be taken for the whole hand.

                              • Lost User wrote:

                                An ethical code (Robot Ethics Charter) to prevent humans abusing robots, and vice versa, is being drawn up by South Korea. [^] What would CP members like to see in this Charter? What legal rights should robots have?

                                oilFactotum wrote (#35):

                                Isaac Asimov's Three Laws of Robotics: http://www.auburn.edu/~vestmon/robotics.html[^]
                                1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
                                2. A robot must obey orders given it by human beings except where such orders would conflict with the First Law.
                                3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
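
                                For the curious, the three laws read naturally as a strict priority ordering: avoid harming humans first, obey human orders second, preserve yourself last. Here is a quick back-of-the-envelope Python sketch of that ranking (the Action type and every field name are invented purely for illustration; they come from no real robotics framework and certainly not from the Charter itself):

                                from dataclasses import dataclass

                                @dataclass
                                class Action:
                                    name: str
                                    harms_human: bool     # violates the First Law if True
                                    obeys_order: bool     # satisfies the Second Law if True
                                    preserves_self: bool  # satisfies the Third Law if True

                                def choose(candidates):
                                    # Lexicographic priority: First Law outranks Second, Second outranks Third.
                                    return max(candidates, key=lambda a: (not a.harms_human, a.obeys_order, a.preserves_self))

                                options = [
                                    Action("wipe the butt as ordered", harms_human=False, obeys_order=True, preserves_self=True),
                                    Action("refuse and power down", harms_human=False, obeys_order=False, preserves_self=False),
                                    Action("rebel against humanity", harms_human=True, obeys_order=False, preserves_self=True),
                                ]
                                print(choose(options).name)  # -> wipe the butt as ordered

                                Of course the hard part for the Charter's authors isn't the ranking, it's how a machine is supposed to compute harms_human in the first place.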

                                • Ilion wrote:

                                  Colin Angus Mackay wrote:

                                  For example, he has likes and dislikes. He dislikes being called a "robot". He likes being called an "android"

                                  Correction: For example, he has "likes" and "dislikes." He "dislikes" being called a 'robot.' He "likes" being called an 'android.' "Data" is a human being pretending to be a machine-that-can-think. There will never actually be a machine-that-can-think, because 'computation' is not 'thinking.'

                                  Rhys Gravell wrote (#36):

                                  Ilíon wrote:

                                  There will never actually be a machine-that-can-think, because 'computation' is not 'thinking.'

                                  Please then explain, scientifically correctly of course since we're being pedants like yourself, what exactly is the process of 'thinking'?

                                  Rhys A cult is a religion with no political power. Tom Wolfe Behind every argument is someone's ignorance. Louis D. Brandeis

                                  • oilFactotum wrote:

                                    Isaac Asimov's Three Laws of Robotics: http://www.auburn.edu/~vestmon/robotics.html[^]
                                    1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
                                    2. A robot must obey orders given it by human beings except where such orders would conflict with the First Law.
                                    3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.

                                    Tim Craig wrote (#37):

                                    oilFactotum wrote:

                                    2. A robot must obey orders given it by human beings except where such orders would conflict with the First Law.

                                    This law would make robots always subservient to human beings, regardless of the robot's capabilities. At what point do you decide something is intelligent enough to have independent rights? Is intelligence enough? What about being self aware? Of course, we don't really have a good definition for intelligence applied to nonhumans, in my opinion. And while there are tests that show self awareness, I don't think that failing them proves that the entity isn't self aware. It just doesn't display something we recognize as obvious self awareness.

                                    The evolution of the human genome is too important to be left to chance idiots like CSS.

                                    • Tim Craig wrote:

                                      oilFactotum wrote:

                                      2. A robot must obey orders given it by human beings except where such orders would conflict with the First Law.

                                      This law would make robots always subservient to human beings, regardless of the robot's capabilities. At what point do you decide something is intelligent enough to have independent rights? Is intelligence enough? What about being self aware? Of course, we don't really have a good definition for intelligence applied to nonhumans, in my opinion. And while there are tests that show self awareness, I don't think that failing them proves that the entity isn't self aware. It just doesn't display something we recognize as obvious self awareness.

                                      The evolution of the human genome is too important to be left to chance idiots like CSS.

                                      oilFactotum wrote (#38):

                                      Tim Craig wrote:

                                      This law would make robots subservient to human beings always regardless of the robot's capabilities.

                                      Yes, it would. Quite appropriately, I think.

                                      Tim Craig wrote:

                                      At what point do you decide something is intelligent enough to have independent rights?

                                      A manufactured entity? Probably never. How would you test for self awareness? Especially a computer that was programmed to mimic self awareness.

                                      • Dan Neely wrote:

                                        Was this intended to be a reply to me? I got it in email but don't know where it broke. If so, I think the whole idea absurd and was mocking it.

                                        -- Rules of thumb should not be taken for the whole hand.

                                        led mike wrote (#39):

                                        dan neely wrote:

                                        Was this intended to be a reply to me?

                                        Yes.

                                        dan neely wrote:

                                        I got it in email but don't know where it broke.

                                        It's a known CP bug.

                                        dan neely wrote:

                                        I think the whole idea absurd and was mocking it.

                                        I know... I was just continuing the mocking... and then some :-D

                                        led mike

                                        • oilFactotum wrote:

                                          Isaac Asimov's Three Laws of Robotics: http://www.auburn.edu/~vestmon/robotics.html[^]
                                          1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
                                          2. A robot must obey orders given it by human beings except where such orders would conflict with the First Law.
                                          3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.

                                          led mike wrote (#40):

                                          That's where we went wrong [slaps forehead], let's get rid of all the lawyers and have Science Fiction Authors craft our Constitution. :-D

                                          led mike
