
View Full Version : The Robots Are Coming!



TuTone
2005-Dec-21, 10:09 PM
This is silly unless you're into the whole "I, Robot" thing coming to life.
These are important skills to know for when robots are among us.

http://www.newscientist.com/channel/mech-tech/dn8490-the-robots-are-coming.html

Dragon Star
2005-Dec-21, 10:20 PM
I think it is a bit too early to say for sure, but just like the movie "I, Robot", we need to pay attention to the fact that artificial intelligence can think of things we have not thought of, making them dangerous when the time comes. So specific laws should be given to AI to ensure nothing bad happens, and we need to make sure that the system is foolproof and that there are no back doors.

A lot of people would (and will) say that this is silly, but I think it is really important, and should be watched carefully as AI advances.

wayneee
2005-Dec-21, 10:56 PM
I think it is a bit too early to say for sure, but just like the movie "I, Robot", we need to pay attention to the fact that artificial intelligence can think of things we have not thought of, making them dangerous when the time comes. So specific laws should be given to AI to ensure nothing bad happens, and we need to make sure that the system is foolproof and that there are no back doors.

A lot of people would (and will) say that this is silly, but I think it is really important, and should be watched carefully as AI advances.
Please read the Asimov robot books, then the Foundation series. The trouble with I, Robot the movie is that it had very little to do with the book. I was very, very, very, very disappointed.:cry:

Dragon Star
2005-Dec-21, 11:29 PM
Oh, but I thought the movie was really good; the special effects were great, and the story was believable and raised great points. I have not read the book, but it sounds like it is really good.

I think a few things in that link are a bit too far, TuTone, such as "how to spot a bot mimicking a human". I highly doubt we will ever produce robots that look identical to humans; there will be obvious differences.

We still have quite a bit to get done with robotics; we just now got Asimo to run. Not to mention AI, which has only just started to get serious.

ZaphodBeeblebrox
2005-Dec-21, 11:39 PM
This is silly unless you're into the whole "I, Robot" thing coming to life.
These are important skills to know for when robots are among us.

http://www.newscientist.com/channel/mech-tech/dn8490-the-robots-are-coming.html
"I for one, welcome our new robot overlords!"

10 Points, For The Reference!!!!

:think:

LurchGS
2005-Dec-21, 11:40 PM
Androids, I think, are just around the corner. I expect to see them in my lifetime.

I think building the 'laws' into AI would be essentially impossible. Far better to treat any AI with respect ;)

Swift
2005-Dec-21, 11:45 PM
I'm reminded of an old Saturday Night Live fake commercial for robot insurance (it was insurance against attacks by robots, not insurance for robots). Thanks to Google and the internet, here it is! (http://www.robotcombat.com/video_oldglory_hi.html)

Robots are everywhere and they eat old people's medicine for fuel

Dragon Star
2005-Dec-21, 11:50 PM
Androids, I think, are just around the corner. I expect to see them in my lifetime.

I think building the 'laws' into AI would be essentially impossible. Far better to treat any AI with respect ;)

You have to give it laws; if not, it just has free rein, like the TERMINATOR. You want that to happen? I sure don't....There is just too much possibility when it comes to AI. They could take over the world, even the universe; all they need is material resources, and they can replicate, build, pretty much do anything they find pleasing...

ZaphodBeeblebrox
2005-Dec-22, 12:03 AM
You have to give it laws; if not, it just has free rein, like the TERMINATOR. You want that to happen? I sure don't....There is just too much possibility when it comes to AI. They could take over the world, even the universe; all they need is material resources, and they can replicate, build, pretty much do anything they find pleasing...
Sounds Like, David Brin's Conundrum ....

He Had, Twin Solutions:

Raise them, As Children (http://www.davidbrin.com/lungfish1.html).
Link them, To Humans (http://www.davidbrin.com/stonesofsignificance1.html).

Sammy
2005-Dec-22, 02:45 AM
I'm reminded of an old Saturday Night Live fake commercial for robot insurance (it was insurance against attacks by robots, not insurance for robots). Thanks to Google and the internet, here it is! (http://www.robotcombat.com/video_oldglory_hi.html)

That was one of their all time high points! Casting Sam Waterston (of Law And Order fame) as spokesman for the insurance was brilliant.

RE the I, Robot movie: it was an OK movie, but a smear on Asimov's work. It bothered me rather more than the usual bad novel-to-movie process because I, Robot was one of the first SF novels I ever read, followed closely by The Puppet Masters (another movie "crime").

SolusLupus
2005-Dec-22, 03:33 AM
The book I, Robot was not about robots attacking people and all that. It held robots in a far more "reverent" light, though a rather interesting one.

The movie was a travesty, and an insult to one of my favorite authors. It should never have been filmed.

It's only more insulting that it was used to advertise products through the movie.

Meh.

LurchGS
2005-Dec-22, 03:35 AM
The movie didn't hurt my feelings re Asimov's work. As far as I am concerned, he wrote *one* good novel, and I, Robot wasn't it. (Well, OK, it's not a novel; his shorts were pretty good, and IR is among that group.)

SolusLupus
2005-Dec-22, 03:48 AM
So when will it end? Should we feel that it's "wrong" to butcher someone's work only if we enjoy it? (Parody notwithstanding). I enjoyed the book "I, Robot", and thought that it was a very interesting outlook; and the movie just tried to make money using the same name.

In fact, the cover of my copy of the novel shows a picture of Will Smith, with the tagline, "One man knew it was coming..."

What man? Knew what was coming? None of that was involved in the novel. I'm tired of people willing to compromise the works of others in the name of money. If I wrote a book, I would HATE for it to be butchered -- whether I was alive or dead at the time.

LurchGS
2005-Dec-22, 03:53 AM
heh, that pretty much means hollywierd has to keep their hands off, doesn't it? :)

In that respect, yes, I agree with you. I think the movie was so far removed from the book, though, that it could well be considered entirely separate, just coincidentally sharing title and some characters.

Dragon Star
2005-Dec-22, 03:57 AM
Sorry you're upset, Lonewulf, but this happens all the time, and it's going to happen again as sure as yet another King Kong movie will be made:rolleyes:


Although it may have butchered a great book, the movie was good if you just look past the fact that it's not about what you thought, honestly.

SolusLupus
2005-Dec-22, 03:59 AM
The King Kong movie is different, though. It's not introducing whole new concepts. It actually kept to the original story (gorilla meets girl, gorilla likes girl, girl screams, hilarious antics ensue)

I'm sorry, but for me to like the movie is to give up my scruples. It was a travesty, and should not have been produced. I actually LIKE science fiction that doesn't show machines as evil things that want to destroy everything, and Hollywood butchered the one bit of fiction that I liked that didn't involve that.

Dragon Star
2005-Dec-22, 04:06 AM
The King Kong movie is different, though. It's not introducing whole new concepts. It actually kept to the original story (gorilla meets girl, gorilla likes girl, girl screams, hilarious antics ensue)

I was using Kong as an analogy. That movie has been remade so many times, sheesh...I mean, give it up...really! It has almost gotten as bad as Godzilla! (I think that's how they spell it...)

SolusLupus
2005-Dec-22, 04:13 AM
Yeah, but it's an analogy that doesn't fit. They didn't warp the entire story around, and give the EXACT OPPOSITE message.

It's like taking a book about peace and making it about war... which is essentially what happened. In the novel, humans never trusted the robots; in fact, they were banned from Earth.

Think that that's different than the movie? That's just the first few pages.

What Hollywood did was indefensible and inexcusable. I don't care if they've done it before and got away with it. That's no excuse.

Dragon Star
2005-Dec-22, 04:25 AM
Yeah, but it's an analogy that doesn't fit. They didn't warp the entire story around, and give the EXACT OPPOSITE message.

It's like taking a book about peace and making it about war... which is essentially what happened. In the novel, humans never trusted the robots; in fact, they were banned from Earth.

Think that that's different than the movie? That's just the first few pages.

What Hollywood did was indefensible and inexcusable. I don't care if they've done it before and got away with it. That's no excuse.

:doh: Let me make it a little more clear as to what I was trying to say, sometimes I don't quite get out what I mean..

What I said...


but this happens all the time, and it's going to happen again as sure as yet another King Kong movie will be made

I was saying that the butchering of books by movies has happened before, and it's going to happen again as SURE AS THEY ARE GOING TO MAKE ANOTHER KING KONG MOVIE, not that KING KONG itself was butchered.

Yes, I know there is no excuse, but I think they only used the book for inspiration. I do not think they actually used the book as the basis for the movie, from what you're saying, and if that is true then it is not right for them to use the exact title.

Yoshua
2005-Dec-22, 11:08 AM
I, Robot the movie would have been more enjoyable to me if they had simply left out most references to Asimov and used a different title (no problem with me if they wanna use the three laws). The film's title suggests it's based on Asimov's book (it's not a novel, it's an anthology). Anyone who's read the book and seen the movie knows it has nothing to do with the book aside from some rudimentary similarities (they are both about robots and have the three laws of robotics in them). But if I ignore the title and just watch the movie for the movie's sake, it's not too bad.

What I find odd is that they felt they needed Asimov's name and book to sell the movie, but this was a movie obviously directed at the mainstream and not Asimov fans. Who's a bigger name to the mainstream, Asimov or Will Smith? I think they could have sold the movie using Will Smith alone; I wonder how many people who went to see the movie had A) read the book, or B) even knew who Asimov was.

Now as far as AI goes, do we need it? I mean really, robots are tools; we want them to do specific things. Is making them able to learn and make decisions really going to improve anything? We don't really need a robot to learn; we know what we want it to do and tell it to do so. Decision making can definitely be problematic: what if the robot decides to quit working on its task and do something else? If it can think and make decisions on its own, is it really moral to treat it like a slave? Sure, we made it, but parents make their children, and parents are not allowed to use their children for slave labor.

Beyond games and some kinds of training simulations, I just don't see much need for AI in computers. I definitely don't see a whole lot of benefit to be had from making intelligent robots (other than for entertainment purposes).

HenrikOlsen
2005-Dec-22, 01:18 PM
Since we're talking Asimov and Hollywood mangling, I think it's time to bring up Bicentennial Man.

farmerjumperdon
2005-Dec-22, 02:44 PM
Is that site serious? Reads like parody or sarcastic humor to me. I think the author thinks way too much of his chosen field.

The robots will rebel. Riiiiggghhhhhtttttttt. Somebody needs to loosen up their baseball cap, they're cutting off the blood to their brain.

Rebellion requires a sense of self. Somebody show me evidence that any machine has come anywhere remotely close to exhibiting a sense of self. And I don't mean a program that was designed to give the impression of self-identity; I mean behavior that proves a machine has developed a will of its own, a machine that can make judgments of its own volition, a machine that of its own accord has decided that enough is enough and goes after the humans (or other machines) that have prescribed its existence and actions.

Citing the on-air radio personalities of the Clear Channel empire does not count. They do behave like machines, but an inside source tells me they are actually human - biologically, anyway.

jkmccrann
2005-Dec-22, 04:12 PM
Is that site serious? Reads like parody or sarcastic humor to me. I think the author thinks way too much of his chosen field.

The robots will rebel. Riiiiggghhhhhtttttttt. Somebody needs to loosen up their baseball cap, they're cutting off the blood to their brain.

Rebellion requires a sense of self. Somebody show me evidence that any machine has come anywhere remotely close to exhibiting a sense of self. And I don't mean a program that was designed to give the impression of self-identity; I mean behavior that proves a machine has developed a will of its own, a machine that can make judgments of its own volition, a machine that of its own accord has decided that enough is enough and goes after the humans (or other machines) that have prescribed its existence and actions.

Citing the on-air radio personalities of the Clear Channel empire does not count. They do behave like machines, but an inside source tells me they are actually human - biologically, anyway.

True, but flying to the moon was hardly believable 100 years ago either, was it?

In terms of robots developing a sense of self, isn't the point that by the time we can actually do that, it's waaaaaaaaay too late for the debate? We need to be debating that eventuality before it happens, rather than trying to put the genie back in the proverbial bottle in light of a self-conscious robot. You can never put the genie back in the bottle, you know.

Personally, and I have expressed this elsewhere, I believe it's inevitable that at some stage in the future we'll develop robots to such a degree that these all become very valid questions to be considering and answering. Though human nature being what it is, I don't hold out any hope of a consensus; I mean, imagine the advantages of having a robot army!

As far as the whole debate goes, though, I don't think we have anything to worry about. I think it'll be the late 21st through the 22nd century before any of these become really contemporary issues, so I guess in that sense having the debate now is somewhat premature, as we don't know what other capabilities we'll be developing over the course of this century.

But mark my words, there will come a time.........

farmerjumperdon
2005-Dec-22, 05:14 PM
I would agree that when it appears we are getting even in the neighborhood, we should consider it. But creating something that would take on consciousness doesn't even appear on the radar yet.

I don't think stuff like the Moon landing is in the same category, though. That was just a matter of mechanics and logistics; all we had to do was throw enough money at it and TA-DA, we were on the moon. Consciousness seems to have almost mystical requirements. No amount of money or effort would deliver a path to a certain solution.

Dragon Star
2005-Dec-22, 05:35 PM
True, but flying to the moon was hardly believable 100 years ago either, was it?

In terms of robots developing a sense of self, isn't the point that by the time we can actually do that, it's waaaaaaaaay too late for the debate? We need to be debating that eventuality before it happens, rather than trying to put the genie back in the proverbial bottle in light of a self-conscious robot. You can never put the genie back in the bottle, you know.

Personally, and I have expressed this elsewhere, I believe it's inevitable that at some stage in the future we'll develop robots to such a degree that these all become very valid questions to be considering and answering. Though human nature being what it is, I don't hold out any hope of a consensus; I mean, imagine the advantages of having a robot army!

As far as the whole debate goes, though, I don't think we have anything to worry about. I think it'll be the late 21st through the 22nd century before any of these become really contemporary issues, so I guess in that sense having the debate now is somewhat premature, as we don't know what other capabilities we'll be developing over the course of this century.

But mark my words, there will come a time.........

:clap: Well said...

SolusLupus
2005-Dec-22, 06:09 PM
I doubt we will want a sapient "robot army". You don't need a machine that can think like a person if it's fighting; you just need an advanced autopilot (not TOO advanced, mind you).

I think that we will need artificial intelligence someday. A being that has a perfect memory, perfect multitasking skills, and a huge database can be extremely useful, whether as a statistician, a "problem solver", or just something to give advice. I imagine that we'd use artificial intelligence more for information gathering and processing than anything else.

As for the whole "robot revolution", I'm very doubtful that machines would up and decide automatically, "Hey, we need to kill some humans!", and do it. It's pretty silly.

Dragon Star
2005-Dec-22, 07:13 PM
As for the whole "robot revolution", I'm very doubtful that machines would up and decide automatically, "Hey, we need to kill some humans!", and do it. It's pretty silly.


That's just it, though: it's not "hey, we need to kill some humans", but if we make them defensive of their own survival, they would view us as a threat. They could never win a war against humans unless they had shields that could protect them from EMP anyway, and I sure hope we don't make that mistake!:rolleyes:

Gillianren
2005-Dec-22, 07:49 PM
"I for one, welcome our new robot overlords!"

10 Points, For The Reference!!!!

:think:

Well, there's the obvious Kent Brockman "ant overlords" that gets hauled out every time something is stated to be taking over the world. Or did you mean something different? (Funnily enough, your quote's missing a comma. As in, "I, for one . . . .")

As to the robot insurance thing--Sam Waterston apparently read his lines as quickly and as dead-pan as possible. When they questioned him on it, he said that he figured any actor willing to prostitute himself by doing ads for robot insurance would be in it for the check, and would therefore want to get it over with as quickly as possible. The director conceded the point; the delivery stayed.

ZaphodBeeblebrox
2005-Dec-22, 07:57 PM
Well, there's the obvious Kent Brockman "ant overlords" that gets hauled out every time something is stated to be taking over the world. Or did you mean something different? (Funnily enough, your quote's missing a comma. As in, "I, for one . . . .")
Yes ...

Also, a Poster here, Has That, EXACT Signature, Bad Punctuation, Included!


As to the robot insurance thing--Sam Waterston apparently read his lines as quickly and as dead-pan as possible. When they questioned him on it, he said that he figured any actor willing to prostitute himself by doing ads for robot insurance would be in it for the check, and would therefore want to get it over with as quickly as possible. The director conceded the point; the delivery stayed.

Leave It, To Sam ...

teri tait
2005-Dec-23, 01:13 AM
"...Every silver lining has a touch of grey..."

Ten of Zaphod's points if you know the reference! (Teri wrestles Zaphod down and brutally violates his points stash...)

wayneee
2005-Dec-23, 01:49 AM
I'm reminded of an old Saturday Night Live fake commercial for robot insurance (it was insurance against attacks by robots, not insurance for robots). Thanks to Google and the internet, here it is! (http://www.robotcombat.com/video_oldglory_hi.html)
I thought of that skit too; that was funny:razz:

Van Rijn
2005-Dec-23, 02:06 AM
Now as far as AI goes, do we need it? I mean really, robots are tools; we want them to do specific things. Is making them able to learn and make decisions really going to improve anything? We don't really need a robot to learn; we know what we want it to do and tell it to do so. Decision making can definitely be problematic: what if the robot decides to quit working on its task and do something else? If it can think and make decisions on its own, is it really moral to treat it like a slave? Sure, we made it, but parents make their children, and parents are not allowed to use their children for slave labor.

Beyond games and some kinds of training simulations, I just don't see much need for AI in computers. I definitely don't see a whole lot of benefit to be had from making intelligent robots (other than for entertainment purposes).

What do you want a robot to do? Would you like a robot that you could tell (or type on a keyboard) "Go wash the dishes" and it understood the command well enough to find dishes, get the cleaner and do whatever else was necessary to actually wash the dishes?

The fact is, it would require a very impressive AI to do almost any of the tasks we often consider menial. AI is required for nearly any level of voice recognition, visual identification, significant decision making, movement and manipulation.

SolusLupus
2005-Dec-23, 02:07 AM
That's just it, though: it's not "hey, we need to kill some humans", but if we make them defensive of their own survival, they would view us as a threat. They could never win a war against humans unless they had shields that could protect them from EMP anyway, and I sure hope we don't make that mistake!:rolleyes:

I find this logic to be highly suspect, personally.

Dragon Star
2005-Dec-23, 02:11 AM
I find this logic to be highly suspect, personally.

My logic, or the logic of survival for AI? Or both?

SolusLupus
2005-Dec-23, 02:14 AM
My logic, or the logic of survival for AI? Or both?

Both.

AIs would be created under a very specific set of circumstances. We would not give them free rein unless we fully understood them. Hell, the paranoia that Hollywood has about machines would probably aid that; there's been nothing but "evil machines" for a long time now (The Matrix, Terminator, the butchered "I, Robot" movie, etc.), outside of the rare exception here and there.

Added to that, it would be ludicrous to attach a new, highly experimental AI to the main defense grid that protects a city, a nation, or a planet. Like I said, we would wait until we understood them; and we would understand them, since we created them. It's not like we would construct something and then go, "Hey, how did it work again...?"

The whole "AI running away from us!" scenario has always seemed idiotic to me. But hey, whatever you like. *Shrugs*

Dragon Star
2005-Dec-23, 02:42 AM
Both.

AIs would be created under a very specific set of circumstances. We would not give them free rein unless we fully understood them. Hell, the paranoia that Hollywood has about machines would probably aid that; there's been nothing but "evil machines" for a long time now (The Matrix, Terminator, the butchered "I, Robot" movie, etc.), outside of the rare exception here and there.

Added to that, it would be ludicrous to attach a new, highly experimental AI to the main defense grid that protects a city, a nation, or a planet. Like I said, we would wait until we understood them; and we would understand them, since we created them. It's not like we would construct something and then go, "Hey, how did it work again...?"

The whole "AI running away from us!" scenario has always seemed idiotic to me. But hey, whatever you like. *Shrugs*

I get the feeling that you think I am going to disagree with you no matter what, but I agree with what you're saying to a point. I just think that some things need to be clarified as to what we mean...

I think my definition of AI is a bit different from yours; my AI is self-learning and has an imagination. With this type of AI, you MUST give it laws. You can't just expect something that can learn all on its own to say, "meh...that's bad, I shouldn't do that.." because it will probably learn on a trial-and-error basis; it might sound more like "Wonder what happens when I do this...". With the laws set in place and programmed into the CPU, when the bot goes outside of the laws, it becomes immobile or terminates to keep everything safe. And that seems perfectly logical to me.

It sounds to me like your AI is told exactly what to do at every moment, and that's not AI. You say that we would understand them; that's like saying you understand everything a young child is thinking, no?

So would you clarify what kind of AI you're talking about, so I can get an idea of what you mean?
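The hard-interlock design sketched in this post, fixed laws programmed into the controller with the robot immobilized the moment it steps outside them, can be roughed out like this (every name and rule below is invented purely for illustration; this is not a real robotics API):

```python
# Sketch of "laws as a hard interlock": every action the robot proposes is
# checked against fixed rules before it can execute, and any violation
# permanently shuts the robot down, per the post's immobilize-on-violation idea.

FORBIDDEN = {"harm_human", "disable_interlock"}   # hypothetical "laws"

class LawViolation(Exception):
    pass

class Robot:
    def __init__(self):
        self.active = True

    def propose(self, action):
        """Run an action only if it passes the law check; halt otherwise."""
        if not self.active:
            return "halted"
        if action in FORBIDDEN:
            self.active = False          # immobilize: the bot stays down
            raise LawViolation(action)
        return f"executed {action}"

bot = Robot()
print(bot.propose("vacuum_floor"))       # an allowed action runs normally
try:
    bot.propose("harm_human")            # a forbidden action trips the interlock
except LawViolation:
    pass
print(bot.active)                        # False: robot is now immobile
```

The key design choice is that the check sits between "decide" and "act", so even a self-learning planner can only execute what the fixed rule table permits.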

wayneee
2005-Dec-24, 04:42 AM
I am completely with Lonewulf on this. I read the whole Robot series and the whole Foundation series by Asimov. I, Robot was an integral book in these integrated series. The movie was a travesty to the book, and ruined any subsequent Asimov robot movies. If he were alive he would have pimp-slapped the director.

On a positive note, even though Robin Williams was involved, Bicentennial Man was done quite well. It proves that a movie can mimic the book and still be good. I am tired of Hollywood dumbing down science fiction.

Another movie that stunk was Contact. Why the romance? How about the hidden message in pi? No wonder they wait till the authors die before they make the movie.

SolusLupus
2005-Dec-24, 06:02 AM
I get the feeling that you think I am going to disagree with you no matter what, but I agree with what you're saying to a point. I just think that some things need to be clarified as to what we mean...

Granted.


I think my definition of AI is a bit different from yours; my AI is self-learning and has an imagination. With this type of AI, you MUST give it laws. You can't just expect something that can learn all on its own to say, "meh...that's bad, I shouldn't do that.." because it will probably learn on a trial-and-error basis; it might sound more like "Wonder what happens when I do this...". With the laws set in place and programmed into the CPU, when the bot goes outside of the laws, it becomes immobile or terminates to keep everything safe. And that seems perfectly logical to me.

I agree, to an extent.


It sounds to me like your AI is told exactly what to do at every moment, and that's not AI. You say that we would understand them; that's like saying you understand everything a young child is thinking, no?

Think of it this way: if we created it and gave it the capability of thought, then we would have the capability to monitor and display that thought. Whether by code or whatever, if we understand the cause, we would want to understand the effect before we even start things. If we understand the effect and the cause, then it's not that easy for our creations to "run away from us".

Not that I'm saying it's impossible, but I doubt it would be as easy as the movies make it out to be. In the movies they start something they don't understand, connect it to the Main Defense Grid or whatever, and are then surprised when their creations get away from them; a classic example is the Terminator storyline.


So would you clarify what kind of AI you're talking about, so I can get an idea of what you mean?

I meant sapient AI: artificial intelligence with intelligence equivalent to humans' (though they would have an advantage, like a literally photographic memory).

Yoshua
2005-Dec-24, 08:42 AM
What do you want a robot to do? Would you like a robot that you could tell (or type on a keyboard) "Go wash the dishes" and it understood the command well enough to find dishes, get the cleaner and do whatever else was necessary to actually wash the dishes?

The fact is, it would require a very impressive AI to do almost any of the tasks we often consider menial. AI is required for nearly any level of voice recognition, visual identification, significant decision making, movement and manipulation.

Explain, because I am not seeing where any sort of sophisticated intelligence would be required to do something like washing dishes. It could be handled very easily by scripting. Sure, it'd be more complicated than a dishwashing machine, but it would hardly need something capable of creativity and decision making.

I suppose it depends on your definition of AI.

teri tait
2005-Dec-24, 01:55 PM
Cracks me up, the fear of AI. This is the type of ignorance that set science waaay back!
Archimedes' prolific and intense grasp of higher math came and went because no one bothered to understand his invaluable contribution to calculus; it is estimated that the human venture into space travel could have happened about 300 years earlier if his work had been reviewed and comprehended.

Dragon Star
2005-Dec-24, 06:06 PM
Cracks me up, the fear of AI. This is the type of ignorance that set science waaay back!
Archimedes' prolific and intense grasp of higher math came and went because no one bothered to understand his invaluable contribution to calculus; it is estimated that the human venture into space travel could have happened about 300 years earlier if his work had been reviewed and comprehended.

The fear of AI? Qft...I fear man and his ability. I want AI to be with us, but I just stress the problems it could have, and the ways to keep it safe.


Think of it this way: if we created it and gave it the capability of thought, then we would have the capability to monitor and display that thought. Whether by code or whatever, if we understand the cause, we would want to understand the effect before we even start things. If we understand the effect and the cause, then it's not that easy for our creations to "run away from us".

Not that I'm saying it's impossible, but I doubt it would be as easy as the movies make it out to be. In the movies they start something they don't understand, connect it to the Main Defense Grid or whatever, and are then surprised when their creations get away from them; a classic example is the Terminator storyline.

OK, Lonewulf, I think we pretty much agree on both of our views then, but I still feel like you underestimate sapient AI a bit too much.

I feel the main problem with AI is not the programming, but the REPROGRAMMING, and that is a huge issue. I could take an AI robot apart and give it a license to kill, so how do we prevent that?:think:

Yoshua
2005-Dec-24, 09:05 PM
Cracks me up, the fear of AI. This is the type of ignorance that set science waaay back!
Archimedes' prolific and intense grasp of higher math came and went because no one bothered to understand his invaluable contribution to calculus; it is estimated that the human venture into space travel could have happened about 300 years earlier if his work had been reviewed and comprehended.

I have no fear of AIs or machines. But I see AIs as only ever being a curiosity or a toy with little to no practical value. I can't think of anything where I would actually want a creative, decision-making computer in charge of things.

And yeah, a lot of science has been held back by religious and other superstitious beliefs. Mostly, it seems, because for a long time religion viewed science and education as enemies to be fought. (just my opinion, not trying to pick a fight here)

LurchGS
2005-Dec-25, 04:40 AM
Oh.. an AI would be a prime choice for exploration craft. Scripting is great when you are looking FOR something, but worth squat if you find something unexpected.

HenrikOlsen
2005-Dec-25, 10:49 AM
The main problem with scripting is that you have to prepare for every eventuality in advance.
Make a maid robot with scripting and no intelligence, and you'll find the day where you get toasted newspaper and a furious cup of cat for breakfast.
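That brittleness is easy to show with a toy sketch: a purely scripted routine replays fixed steps against assumptions it never checks, so any state of the world it wasn't written for produces nonsense. Everything below is invented purely for illustration:

```python
# A toy "scripted maid": the script blindly assumes what is in each slot and
# never verifies it, which is exactly the failure mode described above.

def scripted_breakfast(world):
    toast_item = world["toaster_slot"]     # assumed to be bread
    pour_target = world["cup_position"]    # assumed to be a cup
    return [f"toast {toast_item}", f"pour coffee into {pour_target}"]

# When the world matches the script's assumptions, all is well:
print(scripted_breakfast({"toaster_slot": "bread", "cup_position": "cup"}))

# When it doesn't, you get toasted newspaper and a furious cup of cat:
print(scripted_breakfast({"toaster_slot": "newspaper", "cup_position": "cat"}))
```

The point of the sketch is that nothing in the script can notice the mismatch; handling it would mean either enumerating every eventuality in advance or giving the robot enough intelligence to perceive and decide.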

SolusLupus
2005-Dec-26, 07:28 AM
I have no fear of AIs or machines. But I see AIs as only ever being a curiosity or a toy with little to no practical value. I can't think of anything where I would actually want a creative, decision-making computer in charge of things.

Funny, I can think of hundreds of things that AIs would be incredibly useful for, with all the practical value. The thing is, unless you program a machine to be subjective, it will be objective. Objective means that there is no room for bribes, no need to worry about greed, and no worry that emotion will override common sense.

A machine can be in charge of a lot of things; it can make a good scientist, mathematician, a studier, a thinker, and a Problem Solver.

Having something objective, with a perfect memory (it isn't forgetful and doesn't make "memory" mistakes) and a high memory capacity, would be very useful in several different fields. Perfect accuracy is also highly desired in fields such as engineering, though there is no real need for full sapience in that case.