The Five-Minute Forums  


View Poll Results: Asimov's Laws a Requirement? Preferable?
Yes 4 50.00%
No 4 50.00%
Voters: 8.

 
 
  #33  
Old 01-29-2007, 07:35 PM
Gatac
Man in the iron mask
Member
 
Join Date: Feb 2004
Location: Magdeburg, Germany
Posts: 667

Quote:
Originally Posted by PointyHairedJedi
Visions of Multivac, Colossus, Shalmanesser and Skynet are ultimately just fantasy, unrealisable because consciousness is not something that can be created, whole and complete, utterly constrained in everything it does by a set of arbitrarily imposed rules. Machines that think, if there ever are any, will be like us - blank slates that must be taught how to think from the ground up.
While I agree that creating consciousness is tricky, I believe it'll be relatively easy to "grow" once you have the hardware to get a decently-sized neural network going. Like I said, self-evolving systems have a lot of potential. Whether we can hardcode any "rules" is debatable, but there are already some suggestions, such as "benevolence", i.e. making sure that any intelligence we do end up creating is fundamentally friendly to mankind.

Quote:
Of course, I have my doubts as to whether we will ever manage such a feat, as first we must understand how consciousness works in humans. It's a problem that I don't think will be solved in any of our lifetimes - though we may attain a vastly more complete understanding of the functioning of our brains, that won't tell us much about self-consciousness and free will. We may get machines that can learn how to do a few things, but I doubt there will ever be anything that has the amazing capacity and range of the human mind.
The "ever" is a dangerous thing. Saying something can't be done because you can't do it is shortsighted at best. Also, I know that it is possible to create something intelligent - after all, there's, you know, *us*. Once we understand the chemistry of neurons well enough to emulate one and manage to pack in enough nodes into a system that they are roughly equal to the amount of neurons we have, we should be able to start writing the first primitive self-evolving software and have it undergo rapid evolution towards more complex behaviour.

I think the real question is not whether this is feasible (there are over six billion of us walking around with just that kind of computer in our skulls) but what kind of intelligence you will create this way. In a lot of ways, we are who we are because of what came before us; how can we tell what will result from feeding the budding AI on what we *think* is the best way to grow an intelligence? We might be breeding a true alien that remains completely incomprehensible to us. The problem shifts if it becomes intelligent enough to analyze and understand *us*, but then you have the boogeyman of an AI that's smarter than we are: it will be able to talk to us, but we won't know what's going on inside it. (An AI that we can fully analyze and understand is likely not to be very useful, though...unless you're breeding artificial insects.)

That'll be a few interesting conversations, I think.

Quote:
Clearly you have never seen Mars Attacks!.
Oh, I have; it's one of my favorite movies. However, we cannot assume that alien malefactors have some specific anti-nuclear technology and just give up here, because nukes are still the most destructive weapons we have and therefore our best shot. If nukes don't work, we're probably screwed. I fully blame Hollywood for the "the aliens are immune to nuclear weapons" trope. We can't count on some deus ex machina weakness in the attackers, so our best bet is brute force, and nukes are the best brute-force weapons we have, plus we have a lot of them. If the aliens do happen to have a weakness we can exploit (and that seems reasonable; all the portrayals of superior aliens in sci-fi have dulled us to the fact that a human is an amazingly tough and resourceful animal, likely to be far superior to the aliens in at least one respect), then more power to us.

Gatac
__________________
Katy: Can I have the skill 'drive car off bridge and have parachute handy'?
Justin: It's kind of a limited skill.
Greg: Depends on how often you drive off bridges.
- d02 Quotes
 

