
Sentient computer vs. sentient computer program



Ilya
2007-Jul-06, 07:47 PM
This post (http://www.bautforum.com/showthread.php?p=1024987#post1024987) of mine (about a sentient computer program that is afraid of being deleted) reminded me of the following:



Good afternoon, gentlemen. I am a HAL 9000 computer. I became operational at the H.A.L. plant in Urbana, Illinois, on the 12th January 1997. My instructor was Mr Langley, and he taught me to sing a song. If you'd like to hear it, I can sing it for you.


When "2001: Space Odyssey" was filmed, hardly anyone in the world understood what software is -- that it is endlessly malleable, easy to replicate, and largely independent of hardware. The above quote meant, without any doubt, that the specific computer was turned on (and presumably started learning) on the 12th January 1997. The idea that once that learning was complete, HAL's memory content could be copied into another HAL simply would not occur to 1969 movie audience -- let alone the idea that a program is sentient, not the "computer".

When did SF writers first become aware of the difference between hardware and software, and when did the concept of "sentient program" arise?

nauthiz
2007-Jul-06, 08:30 PM
I doubt Arthur C. Clarke was unaware of the difference between hardware and software. Stored-program computers had been around for quite a while at that point. I'd guess HAL was presented that way for literary reasons: giving him a birthdate and a childhood makes him more compelling as a character, and it creates some tension between his being a machine and his having a conscious experience that isn't all that different from our own. The scene where he's being switched off wouldn't be anywhere near as powerful if he had come off an assembly line.

Noclevername
2007-Jul-06, 08:31 PM
When did SF writers first become aware of the difference between hardware and software, and when did the concept of "sentient program" arise?

The earliest example I know of is from Time Enough For Love, where Heinlein talks about a sentient computer copying its personality into a new machine. 1973, IIRC.

Ilya
2007-Jul-06, 08:49 PM
I doubt Arthur C. Clarke was unaware of the difference between hardware and software. Stored-program computers had been around for quite a while at that point.

I am sure A.C. Clarke was aware of the difference -- but most movie audiences were not.

selden
2007-Jul-06, 11:17 PM
While it is possible to translate a program from one general-purpose computer architecture to another in a reasonable amount of time, there are circumstances when one would like to be able to create a non-copyable program/hardware combination.

No, I don't mean something to enforce the DMCA :)

What about a self-modifying hardware/software combination?
Consider a specialized hardware platform -- a highly complex one that the software modifies at the lowest component level for improved performance as it learns (based on something like FPLAs, perhaps).

It seems to me that this could be used to explain HAL's being identified as a specific computer.

Certainly if it takes longer to analyze the resulting hardware/software configuration and then duplicate it than it takes the software to modify that combination, there's not much point in trying to replicate the final configuration for mass production of identical units. You might as well make lots of them and teach them in parallel.

It seems to me that this combination of complex self-modifying hardware and software also could be taken as a description of the various organic assemblies interacting here. ;)

nauthiz
2007-Jul-06, 11:30 PM
Might also be the case that HAL couldn't modify his own hardware, but he was constructed in such a way that it was impossible to copy his internal state. While most modern computers are constructed in such a way that it's possible to copy every single bit (with the exception of some internal storage in the CPU and stuff like that) to an external device, that's a feature that has to be consciously built in. It might not have been an economical feature to design into HAL's hardware or something like that.

eburacum45
2007-Jul-07, 06:10 AM
The fact that a hardware/software combination is self-modifying doesn't necessarily mean that it can't copy itself.
Admittedly, our own hardware/software combo in our brains can't duplicate itself, but that is because we don't monitor the individual state of each of our neurons and synapses. It has not been necessary for our brains to evolve that ability over the last few billion years.

But it is conceivable that a self-modifying, learning computer/program combo could keep a running record of its own internal state; if it became necessary to replicate the machine and program at any point, that running record could be downloaded and the state of the machine at that particular instant recreated. Of course, the copy would immediately start to diverge from the original as it proceeded to modify itself further.

If the machine concerned were a quantum computer, I think there might be problems with the no-cloning theorem, making it impossible for certain information to be replicated exactly. But I might be quite wrong here.
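
The "running record" idea is easy to sketch in code. Here's a minimal Python toy (every name and number is invented for illustration, not taken from any real system): a self-modifying machine snapshots its complete internal state, a replica is restored from the snapshot, and the two immediately diverge as each keeps modifying itself.

```python
import copy
import random

class SelfModifyingMachine:
    """A toy learner whose 'hardware state' is just a table of weights
    that it keeps rewriting as it runs."""

    def __init__(self, seed):
        self.rng = random.Random(seed)
        self.state = {"weights": [0.0] * 8, "steps": 0}

    def learn_step(self):
        # Self-modification: nudge one randomly chosen weight.
        i = self.rng.randrange(len(self.state["weights"]))
        self.state["weights"][i] += self.rng.uniform(-1, 1)
        self.state["steps"] += 1

    def checkpoint(self):
        # The "running record": a complete snapshot of internal state.
        return copy.deepcopy(self.state)

    @classmethod
    def restore(cls, snapshot, seed):
        machine = cls(seed)
        machine.state = copy.deepcopy(snapshot)
        return machine

original = SelfModifyingMachine(seed=1)
for _ in range(100):
    original.learn_step()

# Replicate the machine from its running record...
replica = SelfModifyingMachine.restore(original.checkpoint(), seed=2)

# ...and the copy immediately starts to diverge as both keep learning.
for _ in range(100):
    original.learn_step()
    replica.learn_step()

print(original.state["weights"])
print(replica.state["weights"])  # same first 100 steps, different ever after
```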

Roy Batty
2007-Jul-09, 10:48 AM
John Brunner wrote a sci-fi novel called The Jagged Orbit in 1969. Trying not to give away too many spoilers, but in it there is a principal character that is an AI computer (semi) copy inside a person that's time.... no, that's too much :)

phunk
2007-Jul-09, 03:38 PM
The fact that a hardware/software combination is self-modifying doesn't necessarily mean that it can't copy itself.
Admittedly, our own hardware/software combo in our brains can't duplicate itself, but that is because we don't monitor the individual state of each of our neurons and synapses. It has not been necessary for our brains to evolve that ability over the last few billion years.

But it is conceivable that a self-modifying, learning computer/program combo could keep a running record of its own internal state; if it became necessary to replicate the machine and program at any point, that running record could be downloaded and the state of the machine at that particular instant recreated. Of course, the copy would immediately start to diverge from the original as it proceeded to modify itself further.

If the machine concerned were a quantum computer, I think there might be problems with the no-cloning theorem, making it impossible for certain information to be replicated exactly. But I might be quite wrong here.

Actually, in some cases, copying the state isn't enough. In some examples of evolutionary algorithms run on reconfigurable hardware, the resulting configuration can end up exploiting subtle defects of the specific hardware it was running on. For example, I remember one case I read about where they were simply trying to get an FPGA circuit to recognize tones using an evolutionary algorithm. They got it working perfectly... until they copied it to a second FPGA, and it failed miserably. The end result of the 'evolution' was a weird analog feedback loop that only worked on the original circuit.
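
That dynamic is easy to reproduce in miniature. Below is a toy Python sketch (not the actual experiment; the per-chip "quirk" vector is an invented stand-in for each device's unique analog behaviour): a genetic algorithm evolves a configuration whose fitness depends on the quirks of the chip it evolved on, so the champion scores well there and poorly on a nominally identical chip.

```python
import random

rng = random.Random(0)

def make_device(seed):
    """Each 'chip' gets hidden analog quirks: a random response vector
    that determines how any given configuration behaves on it."""
    dev_rng = random.Random(seed)
    return [dev_rng.uniform(-1, 1) for _ in range(16)]

def fitness(config, device):
    # Score depends on how the bitstring lines up with this chip's quirks --
    # the device-specific behaviour the evolution ends up exploiting.
    return sum(quirk for bit, quirk in zip(config, device) if bit)

def evolve(device, generations=200, pop_size=30):
    pop = [[rng.randint(0, 1) for _ in range(16)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda c: fitness(c, device), reverse=True)
        survivors = pop[: pop_size // 2]      # keep the fitter half
        children = []
        for parent in survivors:
            child = parent[:]
            child[rng.randrange(16)] ^= 1     # one-bit mutation
            children.append(child)
        pop = survivors + children
    return max(pop, key=lambda c: fitness(c, device))

chip_a = make_device(seed=42)
chip_b = make_device(seed=43)   # "identical" part number, different defects

champion = evolve(chip_a)
print("on chip A:", round(fitness(champion, chip_a), 2))  # high score
print("on chip B:", round(fitness(champion, chip_b), 2))  # usually far worse
```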

Doodler
2007-Jul-10, 01:12 PM
Could also be that part of HAL wasn't software, but firmware.

BigDon
2007-Jul-11, 12:57 AM
I am the only person I know who cries during the shutting off of the higher functions scene.

I can't help it because I know exactly what it feels like.

I have epilepsy, and the freakin' witch doctors at the VA gave me all kinds of stuff to control my seizures, back when I trusted them. As long as you aren't seizing, the docs don't care if you end up with the mind of a four-year-old. Get this: I would seize, go to the hospital (until I learned better), and they would up the dosage without consulting each other. (Oh no! That never happens!)

Get this. I ended up taking the maximum dose of Dilantin combined with two and a half times more phenobarbital than the doctor who fixed all this had ever heard of anybody taking. And he taught neurology at Stanford. (Not a VA doctor.) I was on that regimen for years.

I missed most of the '80s on that crap and was nearly consigned to a nursing home. And Dilantin is a big one for that. You can actually feel your IQ drain away over the initial few days until you don't care anymore. And that's pretty low, BTW -- not caring that you are losing your intellect.

And folks wonder why I won't take anti-seizure meds or won't let them open my head. I don't trust them. Not a drop, no confidence whatsoever. And not even for the reasons listed above. The final straw was too painful/personal to retell on a world wide forum.

So yeah, I cry during that scene. Can't help it.

BD

randycat99
2007-Jul-11, 03:52 AM
Wow! That is quite a dramatic reveal! :o

SkepticJ
2007-Jul-11, 06:08 AM
The above quote meant, without any doubt, that the specific computer was turned on (and presumably started learning) on the 12th January 1997.

Just a nitpick, I think the year was 1992, not 1997.

ASEI
2007-Jul-11, 10:14 AM
At the time, though, software's characteristics were probably much less obvious than they are today with our highly general-purpose computers: the ratio of hardware to software was higher, and the software was usually machine-specific -- written to operate on particular devices, and slow to run, much less to copy or transmit. Back then there was much less state, and much more wire and solder.

Ilya
2007-Jul-11, 05:05 PM
Just a nitpick, I think the year was 1992, not 1997.

In the movie it is 1992. In the book, 1997.

Ilya
2007-Jul-11, 05:07 PM
At the time, though, software's characteristics were probably much less obvious than they are today with our highly general-purpose computers: the ratio of hardware to software was higher, and the software was usually machine-specific -- written to operate on particular devices, and slow to run, much less to copy or transmit. Back then there was much less state, and much more wire and solder.

Of course. But that's part of the reason very few non-specialists were familiar with the concept of "software."

PhantomWolf
2007-Jul-12, 04:00 AM
I think that Skynet is a good example.

Skynet was brought online on August 4th, 1997, and was given control over the U.S. strategic nuclear arsenal for reasons of efficiency, and programmed with a directive of defending the United States against all possible enemies. It started to learn at a geometric rate, and soon concluded that its greatest threat was humanity itself. It then decided mankind's fate in a microsecond: extermination. It launched a nuclear war which destroyed most of the human population, and initiated a program of genocide against the survivors.

Initially it was indicated that Skynet was an AI processor of some sort -- that it was an actual computer with the AI logic built into it, like HAL. Later (in T3) it was shown that Skynet was in fact software with no actual core system, which instead spread itself over a vast network of computers.

HenrikOlsen
2008-Aug-31, 09:41 PM
The earliest example I know of is from Time Enough For Love, where Heinlein talks about a sentient computer copying its personality into a new machine. 1973, IIRC.


John Brunner wrote a sci-fi novel called The Jagged Orbit in 1969. Trying not to give away too many spoilers, but in it there is a principal character that is an AI computer (semi) copy inside a person that's time.... no, that's too much :)
Sorry about the bump, but a recent reference to this thread made me think of another, even earlier example: Zelazny's For a Breath I Tarry (http://www.kulichki.com/moshkow/ZELQZNY/forbreat.txt), from 1966, though the central idea there was that moving the software to very different hardware resulted in different behavior. (That should be a nicely unspoilerish description; the full novelette is at the link.)

jokergirl
2008-Sep-01, 08:30 AM
I wonder.

There are a few examples of computers in literature that spontaneously become sentient by virtue of having more hardware (transistors, memory, "neurons") added to their system. The Bank in The Dark Side of the Sun and the computer in The Moon Is a Harsh Mistress are examples.

Personally, I don't think this is very likely, but I do know there are some interesting self-learning algorithms around, and a system that is programmed to expand when it sees the possibility to grow could plausibly take over new hardware and make it learn.
A while ago, researchers tried to design a new circuit by using an evolutionary algorithm to change the actual physical makeup of the circuit board: it would be re-etched, then the new board was run through the program again, until the desired behaviour was achieved. It turned out that the software was designing the board very unlike a human would have -- it made use of material impurities and leakage current that were very hard for the engineers to understand.
Such a program would probably not be transferable even if you copied the hardware and software makeup precisely (barring transporter-malfunction clones).

In this case, I could imagine HAL being built on an artificial neural network system -- he'd be theoretically copyable, but he does show a "personality" in things like choosing his own voice (SAL mentions that in 2010).
If his consciousness has "grown" by itself like that, even copying the network bit by bit could result in a different personality later on (just like identical twins grow up different) -- see the sketch below.

;)
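
Here's a toy Python illustration of that last point (everything invented for the example; a tiny perceptron stands in for HAL's network): two bit-identical copies of the same "personality" accumulate overlapping but not identical experience, and their weights drift apart.

```python
import random

def train(weights, examples, rng, epochs=50, lr=0.1):
    """Online perceptron learning; the weight vector is the 'personality'."""
    w = weights[:]
    for _ in range(epochs):
        for x, target in rng.sample(examples, len(examples)):
            out = 1 if sum(wi * xi for wi, xi in zip(w, x)) > 0 else 0
            for i, xi in enumerate(x):
                w[i] += lr * (target - out) * xi
    return w

# Two bit-identical copies of the same small network...
seed_rng = random.Random(0)
base = [seed_rng.uniform(-1, 1) for _ in range(3)]

# ...which then get overlapping but not identical "life experience",
# like identical twins growing up in slightly different environments.
shared = [((1, 0, 1), 1), ((0, 1, 1), 0), ((1, 1, 1), 1)]
twin_a = train(base, shared + [((0, 0, 1), 0)], random.Random(1))
twin_b = train(base, shared + [((1, 0, 0), 1)], random.Random(2))

print("twin A:", [round(w, 2) for w in twin_a])
print("twin B:", [round(w, 2) for w in twin_b])  # same origin, different minds
```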

hhEb09'1
2008-Sep-01, 08:53 AM
Such a program would probably not be transferable even if you copied the hardware and software makeup precisely (barring transporter-malfunction clones).

And it would not be transferable if the computer were one-of-a-kind. There have been such animals.

And, if BigDon is hanging around this old thread, I'd like to know how to account for Boo if he missed the 80's. No, wait, don't tell me...

Roy Batty
2008-Sep-01, 09:46 AM
Sorry about the bump, but a recent reference to this thread made me think of another, even earlier example: Zelazny's For a Breath I Tarry (http://www.kulichki.com/moshkow/ZELQZNY/forbreat.txt), from 1966, though the central idea there was that moving the software to very different hardware resulted in different behavior. (That should be a nicely unspoilerish description; the full novelette is at the link.)

What a great story, thanks for that! :cool:

ravens_cry
2008-Sep-01, 11:34 PM
Sorry about the bump, but a recent reference to this thread made me think of another, even earlier example: Zelazny's For a Breath I Tarry (http://www.kulichki.com/moshkow/ZELQZNY/forbreat.txt), from 1966, though the central idea there was that moving the software to very different hardware resulted in different behavior. (That should be a nicely unspoilerish description; the full novelette is at the link.)

I agree with Roy Batty, thank you. There were a few errors in the transcription, but none that hindered the enjoyment. The story itself was wonderful. Strong hints of the Book of Job.

novaderrik
2008-Sep-02, 08:16 AM
Initially it was indicated that Skynet was an AI processor of some sort -- that it was an actual computer with the AI logic built into it, like HAL. Later (in T3) it was shown that Skynet was in fact software with no actual core system, which instead spread itself over a vast network of computers.

By the time T3 came out, everyone was well aware of what the Internet was and had a basic grasp of how it operated. That wasn't the case with the original -- when the Apple IIe (was it even out yet??) was the high-tech mainstream machine -- or with T2, when the 286 processor ruled the world and the only people who were "online" were the geeks and nerds who had access in their college computer labs.

stutefish
2008-Sep-02, 05:38 PM
What if a critical component of AI is special purpose hardware?

Maybe the "software" necessary to enable or emulate AI exploits specific properties of specific kinds of circuitry.

Implementing the human brain in transistors -- even today's microminiaturized specimens -- seems pretty far-fetched.

Seems to me like a brilliant piece of adaptive software, running on a superfast bus on a next-generation, custom-built supercomputer cluster, might theoretically "port" itself to the Internet, but would experience such a drastic slowdown in cognition time that its overall functionality would fall apart.
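
To put rough numbers on that slowdown (both figures below are assumptions, purely for illustration): if each exchange between the AI's components takes on the order of 100 ns over a local backplane but 50 ms as an Internet round trip, every communication-bound "thought" gets about half a million times slower.

```python
# Back-of-the-envelope only; both latencies are assumed, not measured.
local_hop = 100e-9    # ~100 ns: signal crossing a local backplane/bus
internet_hop = 50e-3  # ~50 ms: round trip between distant Internet hosts

slowdown = internet_hop / local_hop
print(f"each internal exchange becomes ~{slowdown:,.0f}x slower")  # ~500,000x
```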

Maybe you're either writing the software for the hardware, or you're building the hardware for the software.

Even if the supercomputer-specific AI managed, through its supreme cleverness and adaptability, to "port" itself to the Internet, it would probably have to make so many changes, in order to accommodate the extreme differences in its operating environment, as to be a totally different kind of intelligence from its progenitor.

ravens_cry
2008-Sep-02, 09:36 PM
Another problem I had with the T3 Skynet split: sure, if it could somehow keep itself together (distributed computing has been done), then it would be 'invincible' -- unless civilization ended, which would surely knock out almost all of its 'cells'. And what does it do once it has distributed itself like this? It wipes out civilization. The stupidity rates right up there with the Moon hoax.

stutefish
2008-Sep-02, 10:33 PM
Well, Skynet did have access to future designs for highly adaptive, self-replicating machines.

Once human civilization is gone, it can rebuild the world in its own image, at its leisure, so long as it leaves a certain minimum amount of vital infrastructure intact.

I'm sure that part of what future-Skynet sent back to its juvenile self was the results of a detailed analysis of exactly how much of civilization, and which bits, in which order, needed to be destroyed in order to optimally balance the elimination of humanity as a threat against the minimum infrastructure necessary for the Machine Age To Come.

Roy Batty
2008-Sep-02, 10:53 PM
And all I have to say is, 'Recursion: See recursion' :D

ravens_cry
2008-Sep-02, 10:54 PM
Yes, but the idiot copied itself onto the INTERNET. The electromagnetic pulse ALONE would have done it in.