Thread: January 5
  #9  
Old 01-06-2005, 07:27 PM
evay
But if you put the hammer in an elevator...
Member
 
Join Date: Apr 2004
Location: Deck Four, Section Seven
Posts: 522

Quote:
Originally Posted by MaverickZer0
It would be next to impossible to program those laws into a truly sentient AI. If, through whatever fluke, a robot could actually think, they could simply choose not to follow the laws.
Of course, then the debate over AI/robot souls would start,
Which is exactly why I said that giving an AI a soul/conscience/superego is the best way to keep it from turning on us. About.com's Julia Houston once wrote: "It's a simple Trek truth: create something sentient and it will do what it wants, not what you want." As human beings, we are all sentient, and we all have the ability to do what we want. The reason we don't all embark on murderous rampages hourly is that we've been taught that it's wrong -- we've learned empathy, we've learned conscience.

I posit that to prevent another V.I.K.I. -- the AI in I, Robot that decided Robot was superior to Man -- we would have to rear AIs as though we were rearing children: love them, teach them, guide them, discipline them. Sociopaths have no empathy; they have no concept of the emotions of others, and they don't care. That's why serial killers are usually sociopaths. An AI with no empathy would be a sociopath by definition, so I think our best hope is to teach AIs empathy. Make them more human, in other words.
__________________
Any truth is better than indefinite doubt. — Sherlock Holmes
"The Adventure of the Yellow Face," Arthur Conan Doyle