Robotic age poses ethical dilemma
-
Richard A. Abbott wrote:
Would it or should it have the right to defend itself
Punching bags are meant to be punched (not to punch back) for training. What happens if somebody makes a robotic butt-wiper? Would that be considered degrading?
You reply in jest, but what, for example, would be your opinion if we were to discover life in another form on another planet? Do we allow them a code of ethics, or do we make them (as you suggest) "robotic butt-wipers"?
Come and see the violence inherent in the system! Help! Help! I'm being repressed!
-
Didn't expect a reply from yourself at this time - must be around 4am in Australia, or are you elsewhere? Although Data of Star Trek: The Next Generation is a robot, Data has free will of sorts. The stuff you see in Star Trek has a habit of happening. For instance, the hand-held communications device used by Captain Kirk etc. is here; you probably have one in your back pocket. So a new or updated definition will no doubt be required.
Richard A. Abbott wrote:
Although Data of Star Trek: The Next Generation is a robot, Data has free will of sorts
For example, he has likes and dislikes. He dislikes being called a "robot". He likes being called an "android"
Upcoming events: * Edinburgh: Web Security Conference Day for Windows Developers (12th April) * Glasgow: AJAX, SQL Server, Mock Objects My: Website | Blog | Photos
-
An ethical code (Robot Ethics Charter) to prevent humans abusing robots, and vice versa, is being drawn up by South Korea. [^] What would CP members like to see in this Charter? What legal rights should robots have?
Speak roughly to your robot toy, and beat it when it crashes; it only does it to annoy, and to subtly bring about an attitude of complacency and bemusement, such that we'll be caught off-guard when The Robot Revolution begins...
----
It appears that everybody is under the impression that I approve of the documentation. You probably also blame Ken Burns for supporting slavery.
--Raymond Chen on MSDN
-
You reply in jest, but what, for example, would be your opinion if we were to discover life in another form on another planet? Do we allow them a code of ethics, or do we make them (as you suggest) "robotic butt-wipers"?
Come and see the violence inherent in the system! Help! Help! I'm being repressed!
The Apocalyptic Teacup wrote:
Do we allow them a code of ethics, or do we make them (as you suggest) "robotic butt-wipers"?
What if, in their culture, wiping butts is actually a great honor? Does this also mean that we need to treat animals as equals? More importantly, what do we make of the mechanical cotton gin? Isn't it degrading to make the gin do work formerly done by slaves without pay? What form of payment would a cotton gin accept?
-
Richard A. Abbott wrote:
Would it or should it have the right to defend itself
Punching bags are meant to be punched (not to punch back) for training. What happens if somebody makes a robotic butt-wiper? Would that be considered degrading?
Red Stateler wrote:
What happens if somebody makes a robotic butt-wiper?
I never knew what the three shells were about in Demolition Man. The suspense is killing me. :)
"There are II kinds of people in the world, those who understand binary and those who understand Roman numerals." - Bassam Abdul-Baki Web - Blog - RSS - Math - LinkedIn - BM
-
An ethical code (Robot Ethics Charter) to prevent humans abusing robots, and vice versa, is being drawn up by South Korea. [^] What would CP members like to see in this Charter? What legal rights should robots have?
Ummm...yeah... :rolleyes: When is this "robotic age"? "It is being put together by a five member team of experts that includes futurists and a science fiction writer." "A recent government report forecast that robots would routinely carry out surgery by 2018." Once again, after 10 years, "Intellisense" still doesn't work consistently. I'd rather perform the surgery on myself, thanks. I have the right to bitch-slap my Roomba any time I want!
"Great job, team. Head back to base for debriefing and cocktails." (Spottswoode "Team America")
-
Richard A. Abbott wrote:
Although Data of Star Trek: The Next Generation is a robot, Data has free will of sorts
For example, he has likes and dislikes. He dislikes being called a "robot". He likes being called an "android"
Upcoming events: * Edinburgh: Web Security Conference Day for Windows Developers (12th April) * Glasgow: AJAX, SQL Server, Mock Objects My: Website | Blog | Photos
Colin Angus Mackay wrote:
For example, he has likes and dislikes. He dislikes being called a "robot". He likes being called an "android"
Correction: For example, he has "likes" and "dislikes." He "dislikes" being called a 'robot.' He "likes" being called an 'android.' "Data" is a human being pretending to be a machine-that-can-think. There will never actually be a machine-that-can-think, because 'computation' is not 'thinking.'
-
Richard A. Abbott wrote:
Would it or should it have the right to defend itself
Punching bags are meant to be punched (not to punch back) for training. What happens if somebody makes a robotic butt-wiper? Would that be considered degrading?
-
I wonder how many of the chattering masses will start howling if Kim Ding Dong Illness sends his hordes streaming south and the air forces start dropping smart bombs all over them.
-- Rules of thumb should not be taken for the whole hand.
-
Is your point that smart bombs are robots and therefore have rights? I disagree; that's why we need Jeffersonian principles running the US. The founders never intended for a bunch of morally deficient liberals to elevate robots into citizenry. Jefferson never considered robots human equals... they were just for doinking.
led mike
led mike wrote:
they were just for doinking.
I completely misread that. :-O
Ðavid Wulff What kind of music should programmers listen to?
Join the Code Project Last.fm group | dwulff
I'm so gangsta I eat cereal without the milk -
Wjousts wrote:
I'd believe a robot is intelligent when it realizes that and demands we stop calling it "robot". I'd propose "electro-mechanical American"!
Going by the current technology spread across countries, most robots would be Japanese. You may have some of them immigrating to the US - so you could have Electro-Mechanical Japanese-Americans I guess :-)
Regards, Nish
Nish’s thoughts on MFC, C++/CLI and .NET (my blog)
Currently working on C++/CLI in Action for Manning Publications. (*Sample chapter available online*)
-
If there are immigrant robots, will they need an H1B visa to work in North America? What about illegal immigrant robots? Will they be deported to Japan? Will they work for minimum wage? I bet when robots work at McDonald's, the hamburgers won't taste the same. I don't think the INS will be happy about it. :laugh:
-
An ethical code (Robot Ethics Charter) to prevent humans abusing robots, and vice versa, is being drawn up by South Korea. [^] What would CP members like to see in this Charter? What legal rights should robots have?
Protection against VB?
-
I seem to remember that the word robot is from the Czech for slave. I'd believe a robot is intelligent when it realizes that and demands we stop calling it "robot". I'd propose "electro-mechanical American"! ;)
It was "worker" not "slave" to be picky.
-
Protection against VB?
-
Is your point that smart bombs are robots and therefore have rights? I disagree; that's why we need Jeffersonian principles running the US. The founders never intended for a bunch of morally deficient liberals to elevate robots into citizenry. Jefferson never considered robots human equals... they were just for doinking.
led mike
-
An ethical code (Robot Ethics Charter) to prevent humans abusing robots, and vice versa, is being drawn up by South Korea. [^] What would CP members like to see in this Charter? What legal rights should robots have?
Isaac Asimov's Three Laws of Robotics http://www.auburn.edu/~vestmon/robotics.html[^]
1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
2. A robot must obey orders given it by human beings except where such orders would conflict with the First Law.
3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
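Incidentally, the three Laws amount to a strict priority ordering: a higher law can veto anything the laws below it would permit. Here's a minimal sketch of that precedence as code (purely illustrative; all the flag names like `harms_human` are made up for this example):

```python
# Asimov's Three Laws as a priority-ordered veto chain (hypothetical sketch).
# Each law is checked in order; a lower-numbered law always outranks the ones
# after it. `action` is a dict of boolean flags describing a proposed action.

def permitted(action):
    """Return True if `action` passes the Three Laws in priority order."""
    # First Law: never harm a human, by action or by inaction.
    if action.get("harms_human") or action.get("inaction_harms_human"):
        return False
    # Second Law: obey human orders, unless obeying would violate the First Law.
    if action.get("disobeys_order") and not action.get("order_would_harm_human"):
        return False
    # Third Law: preserve itself, unless that conflicts with Laws One or Two.
    if action.get("destroys_self") and not (
        action.get("self_sacrifice_saves_human")
        or action.get("self_sacrifice_obeys_order")
    ):
        return False
    return True

# A robot ordered to destroy itself must comply: the Second Law (obedience)
# outranks the Third (self-preservation), so long as no human is harmed.
print(permitted({"destroys_self": True,
                 "self_sacrifice_obeys_order": True}))  # True
```

Note that this is exactly why the Second Law makes robots permanently subservient: obedience is evaluated before self-preservation ever gets a say.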
-
Colin Angus Mackay wrote:
For example, he has likes and dislikes. He dislikes being called a "robot". He likes being called an "android"
Correction: For example, he has "likes" and "dislikes." He "dislikes" being called a 'robot.' He "likes" being called an 'android.' "Data" is a human being pretending to be a machine-that-can-think. There will never actually be a machine-that-can-think, because 'computation' is not 'thinking.'
Ilíon wrote:
There will never actually be a machine-that-can-think, because 'computation' is not 'thinking.'
Please then explain, scientifically correctly of course since we're being pedants like yourself, what exactly is the process of 'thinking'?
Rhys A cult is a religion with no political power. Tom Wolfe Behind every argument is someone's ignorance. Louis D. Brandeis
-
Isaac Asimov's Three Laws of Robotics http://www.auburn.edu/~vestmon/robotics.html[^]
1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
2. A robot must obey orders given it by human beings except where such orders would conflict with the First Law.
3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
oilFactotum wrote:
2. A robot must obey orders given it by human beings except where such orders would conflict with the First Law.
This law would always make robots subservient to human beings, regardless of the robot's capabilities. At what point do you decide something is intelligent enough to have independent rights? Is intelligence enough? What about being self aware? Of course, we don't really have a good definition for intelligence applied to nonhumans, in my opinion. And while there are tests that show self awareness, I don't think that failing them proves that the entity isn't self aware. It just doesn't display something we recognize as obvious self awareness.
The evolution of the human genome is too important to be left to chance idiots like CSS.
-
oilFactotum wrote:
2. A robot must obey orders given it by human beings except where such orders would conflict with the First Law.
This law would always make robots subservient to human beings, regardless of the robot's capabilities. At what point do you decide something is intelligent enough to have independent rights? Is intelligence enough? What about being self aware? Of course, we don't really have a good definition for intelligence applied to nonhumans, in my opinion. And while there are tests that show self awareness, I don't think that failing them proves that the entity isn't self aware. It just doesn't display something we recognize as obvious self awareness.
The evolution of the human genome is too important to be left to chance idiots like CSS.
Tim Craig wrote:
This law would always make robots subservient to human beings, regardless of the robot's capabilities.
Yes, it would. Quite appropriately, I think.
Tim Craig wrote:
At what point do you decide something is intelligent enough to have independent rights?
A manufactured entity? Probably never. How would you test for self awareness? Especially a computer that was programmed to mimic self awareness.
-
Was this intended to be a reply to me? I got it in email but don't know where it broke. If so, I think the whole idea absurd and was mocking it.
-- Rules of thumb should not be taken for the whole hand.
dan neely wrote:
Was this intended to be a reply to me?
Yes.
dan neely wrote:
I got it in email but don't know where it broke.
It's a known CP bug
dan neely wrote:
I think the whole idea absurd and was mocking it.
I know... I was just continuing the mocking... and then some :-D
led mike