In Space No One Can Hear You Overclock



Tuckerfan
2004-May-25, 11:11 PM
A buddy and I were talking today about various methods of cooling PCs, from air cooling to submerging them in liquid nitrogen (which evidently gives one heck of a performance boost), and I happened to think of something. Objects in space which aren't illuminated by the sun quickly reach a temperature near absolute zero. Ping! goes the lightbulb above my head. What would happen if you stuck a PC in the shadow of something in space? Would its temp drop down to near absolute zero and stay reasonably close to that temp with the PC fired up and running, or would the vacuum of space prevent the PC from cooling rapidly enough to stay below the melting point?

JustAGuy
2004-May-25, 11:14 PM
I imagine that the heat radiated by the processor would be sufficient to cool it, but you might have to take the motherboard out of the case to prevent the case from radiating the heat right back at it. All this, of course, assumes that the motherboard could operate in a vacuum.

The CRT is another story, however.

Bob
2004-May-25, 11:36 PM
1. The semiconductors in your computer will turn into inert rocks long, long before you get to absolute zero.
2. Where is the performance boost with reduced temperature? The clock speed, amount of memory, and other factors defining performance are constant or nearly so as a function of temperature.
3. With ideal shielding you could cool your computer to no lower than 2.73 K, the CMB temperature.

Tuckerfan
2004-May-25, 11:49 PM
1. The semiconductors in your computer will turn into inert rocks long, long before you get to absolute zero.
2. Where is the performance boost with reduced temperature? The clock speed, amount of memory, and other factors defining performance are constant or nearly so as a function of temperature.
3. With ideal shielding you could cool your computer to no lower than 2.73 K, the CMB temperature.

Well, these guys cooled an Intel 2.2 GHz processor using liquid nitrogen (http://www.muropaketti.com/artikkelit/cpu/northwood2200/ln2/) and got a bit of a performance boost:

The highest STABLE CPU clock frequency we were able to reach was 3630MHz (FSB 165MHz). At 3650MHz we were able to run heavy benchmark programs such as SuperPi and Pifast successfully although the VCore was quite high (2.12V). It seems that Pentium 4 can handle it without any conflicts.

Other folks have done it as well, but I can't find a site for them at the moment.

ToSeek
2004-May-26, 12:13 AM
I would think that since you're in a vacuum, you would not lose heat very quickly at all. (Keep in mind that thermos bottles work basically on this principle.) Since the CPU chip is generating heat at a goodly rate, you'd probably melt rather than freeze.

JustAGuy
2004-May-26, 12:17 AM
Ah, but a thermos is designed to do the exact opposite of a cpu: prevent the loss of heat via radiation... hence all the reflective foil, and the insulation in addition to the vacuum layer.

Of course, this is all moot. Assuming you had a cpu in space, it would be far, far, far, far (enough fars?) cheaper to cool it via existing, proven technologies, and then worry about cooling the spaceship environment (also through existing, proven technologies).

(Hmm... now that I think about it, the CPU is designed to maximize loss of heat via conduction to the heatsink... oh well)

Tuckerfan
2004-May-26, 12:29 AM
Ah, but a thermos is designed to do the exact opposite of a cpu: prevent the loss of heat via radiation... hence all the reflective foil, and the insulation in addition to the vacuum layer.

Of course, this is all moot. Assuming you had a cpu in space, it would be far, far, far, far (enough fars?) cheaper to cool it via existing, proven technologies, and then worry about cooling the spaceship environment (also through existing, proven technologies).

(Hmm... now that I think about it, the CPU is designed to maximize loss of heat via conduction to the heatsink... oh well)

Well, that's a "maybe." The temps of processors are rapidly increasing to the point where water cooling is going to be a necessity within a few years if they don't find a way to reduce the heat generated (I've seen a news report which claimed that within five years they'll produce as much heat as a nuclear reactor), so if you've got something like space, which is a huge heat sink that will slowly suck the heat away from you, it might be just as economical to stick the PC on the outside of the spacecraft.

The question is: How fast will space suck heat with no sunlight? I'm sure that NASA has the formulas for figuring such things out. Perhaps someone here might know of them?

Patrator
2004-May-26, 12:50 AM
Well, these guys cooled an Intel 2.2 GHz processor using liquid nitrogen (http://www.muropaketti.com/artikkelit/cpu/northwood2200/ln2/) and got a bit of a performance boost
The passive components on that board were not cooled, and I bet a fair few of them would not function correctly (if at all) were they cooled to -196C. Having done a fair bit of research into inductor and transformer materials in a previous job, I can definitely say that these would not work at such low temperatures (at least not the way they're supposed to). What about the resistors, capacitors, diodes, etc.?
So I think it's safe to say a computer in space would cease to function (if its temperature dropped to 2.7 K).

How long will that take? Would it happen before something melted? Don't know.

Andreas
2004-May-26, 01:29 AM
Where is the performance boost with reduced temperature? The clock speed, amount of memory, and other factors defining performance are constant or nearly so as a function of temperature.
Lower temperatures allow semiconductors to run at higher frequencies. The computer won't get faster if you don't use that to overclock it, of course.


The temps of processors are rapidly increasing to the point where water cooling is going to be a necessity within a few years if they don't find a way to reduce the heat generated
I'd bet it becomes a necessity. I remember Usenet posts from many years back mocking the Alpha processor for its energy hunger and heat production -- it even needed its own fan! :wink:


so if you've got something like space, which is a huge heat sink that will slowly suck the heat away from you, it might be just as economical to stick the PC on the outside of the spacecraft.
Nope. Thermal resistance and heat transport are very critical. Current high end CPUs require heat sinks where at least the base plate is made of copper to transport the heat off the die fast enough. Then there's the rest of the big heat sink on top of that, with a fan that keeps lots of cool air flowing over it.

Should the fan stop and the heat sink rely on simple convection, the CPU will overheat in less than a minute. Now with vacuum there's only radiation to get the heat away, so obviously standard cooling equipment won't do here. The heat sink would have to be big enough to constantly radiate 60 to 80 Watts (or whatever it currently is) of heat into space while keeping the CPU within operating range (less than 60 to 90 C, depending on type).

That you can plug into formulas (assume a single point source of heat of a given power that is to be kept below a given temperature); I just don't know them. :) I'd bet you'd get a pretty big heatsink just to avoid overheating, and an even bigger one to overcool it enough for serious overclocking.

Fortis
2004-May-26, 02:26 AM
What you need is the Stefan-Boltzmann law.

The power radiated per unit surface area for a surface with emissivity, e, at a temperature T, is given by,

Power=Sigma*e*T^4 ,

where Sigma is Stefan's constant, 5.67x10^-8 W m^-2 K^-4

Now the difference between the power radiated, and the power absorbed from the CMB at 2.7 K, is the power that you're generating in your CPU that you want to dissipate.

If we assume that the radiator is a blackbody (e=1), and that the steady state temperature of the thing is 90 degC (it would probably be a bit colder than that, but hey, we're doing "back of the envelope" here ;) ), and it is dissipating 100 W into space, then it would need to have a radiating area of

~0.1 m^2

(A website giving an on-line calculation is here,
http://hyperphysics.phy-astr.gsu.edu/hbase/thermo/stefan.html#c3 )

Which doesn't seem too bad. (It's still an expensive way to do it. Thermal control on satellites is an absolute nightmare :eek: Thank God for MLI ;) )

(I'm probably going to have embarrassed myself by committing numbers to a post that are completely incorrect. ;) )
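
For anyone who wants to plug in different numbers, here's a minimal Python sketch of the calculation above, using the same assumptions (blackbody radiator, 90 degC, 100 W, 2.7 K background):

# Back-of-the-envelope radiator sizing via the Stefan-Boltzmann law.
# Assumptions, all from the post above: blackbody radiator (e = 1),
# steady-state radiator temperature of 90 degC, 100 W to dissipate,
# background at the 2.7 K CMB temperature.

SIGMA = 5.67e-8          # Stefan's constant, W m^-2 K^-4
e = 1.0                  # emissivity (blackbody)
T_rad = 90.0 + 273.15    # radiator temperature, K
T_cmb = 2.7              # background temperature, K
P_cpu = 100.0            # heat to dissipate, W

# Net flux radiated = sigma * e * (T_rad^4 - T_cmb^4), in W/m^2
flux = SIGMA * e * (T_rad**4 - T_cmb**4)
area = P_cpu / flux

print("net flux: %.0f W/m^2" % flux)      # ~986 W/m^2
print("required area: %.2f m^2" % area)   # ~0.10 m^2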

ToSeek
2004-May-26, 02:33 AM
then it would need to have a radiating area of

~0.1 m^2

Which doesn't seem too bad. (It's still an expensive way to do it. Thermal control on satellites is an absolute nightmare :eek: )

That's a tenth of a square meter, right? So we're talking a 10cmx50cm (since it has two sides) heat sink. Seems kind of big to me for a chip smaller than my thumb.

Tuckerfan
2004-May-26, 02:39 AM
Interesting. I cross-posted this on another message board (http://boards.straightdope.com/sdmb/showthread.php?p=4897424) and a poster who claimed to work on satellites chimed in and stated that quite often they have to heat the electronics to keep them operating at a standardized temperature. Now, he didn't say which electronics they had to keep warm, so it might be that the processor isn't one of them. Of course, a lot depends upon what type of processor you're running, since as Andreas pointed out older processors don't run nearly as hot as newer ones. The Hubble is running a 486; I don't know what anything else is running.

Tuckerfan
2004-May-26, 02:48 AM
then it would need to have a radiating area of

~0.1 m^2

Which doesn't seem too bad. (It's still an expensive way to do it. Thermal control on satellites is an absolute nightmare :eek: )

That's a tenth of a square meter, right? So we're talking a 10cmx50cm (since it has two sides) heat sink. Seems kind of big to me for a chip smaller than my thumb.

Yeah, though that depends upon how you measure everything. I don't know what the surface area of a standard PC heatsink is (unless I misunderstand Fortis, he's talking about the total surface area for the heat sink, and not simply that a heatsink would have to have a footprint that size), but I'd imagine that the total radiating area of a typical watercooled PC (tubes, radiator, etc.) probably comes in around that.

New breakthroughs in nanotubes (http://www.technologyreview.com/articles/rnb_051404.asp?trk=nl) and water cooling (http://www.cooligy.com/technology.html) might change all that, however.

Fortis
2004-May-26, 02:49 AM
It just shows how inefficient radiation is as a means of cooling.

Swift
2004-May-26, 02:46 PM
Interesting. I cross-posted this on another message board (http://boards.straightdope.com/sdmb/showthread.php?p=4897424) and a poster who claimed to work on satellites chimed in and stated that quite often they have to heat the electronics to keep them operating at a standardized temperature. Now, he didn't say which electronics they had to keep warm, so it might be that the processor isn't one of them. Of course, a lot depends upon what type of processor you're running, since as Andreas pointed out older processors don't run nearly as hot as newer ones. The Hubble is running a 486; I don't know what anything else is running.
One piece of electronics that is often heated is the clock. It is usually a piece of quartz that oscillates at a set frequency. The frequency it oscillates at has a temperature dependence (that dependence can be either very strong or very tiny, depending on the crystallographic orientation of the quartz). To keep it at a constant temperature, it is often put into an "oven". It's just a tiny metal box with a tiny piece of heater wire, a temperature sensor, and a controller to keep it all at a setpoint. The temperature is usually just a little above room temperature.
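
Just to illustrate the control idea (this is a toy sketch, nothing like real flight hardware, and the 40 degC setpoint is a made-up number): a simple on/off thermostat loop in Python.

# Toy illustration of the "oven" above: a heater, a temperature sensor
# and a controller holding a quartz crystal near a setpoint a bit above
# room temperature.  Real crystal ovens use analog or PI control rather
# than this bang-bang loop; the numbers here are assumptions.

SETPOINT = 40.0    # target crystal temperature, degC (assumed)
BAND = 0.5         # hysteresis band, degC, so the heater doesn't chatter

def heater_command(measured_temp, heater_was_on):
    """Bang-bang thermostat: heat below the band, coast above it,
    and keep the previous state while inside the band."""
    if measured_temp < SETPOINT - BAND:
        return True
    if measured_temp > SETPOINT + BAND:
        return False
    return heater_was_on

# e.g. heater_command(38.0, False) -> True  (heater on)
#      heater_command(41.0, True)  -> False (heater off)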

parejkoj
2004-May-26, 05:51 PM
I know that basically all instruments on Cassini have replacement heaters to keep them warm when they aren't operating, and the designs of the instruments are not based on getting rid of heat, but more on keeping a portion of it around. However, some of the bus units have slats that will open to allow more heat to radiate away if they get too warm.

Also, rad-hardened processors aren't nearly as dense as commercial processors, so their heat output is much lower. Radiation is a much bigger worry with space bound equipment, not heating! A P4 wouldn't last more than a minute or two in space, and it would probably start getting single-event-upsets and data corruption within seconds.

Avatar28
2004-May-26, 06:05 PM
Also, rad-hardened processors aren't nearly as dense as commercial processors, so their heat output is much lower. Radiation is a much bigger worry with space bound equipment, not heating! A P4 wouldn't last more than a minute or two in space, and it would probably start getting single-event-upsets and data corruption within seconds.

Not true. Shuttle and ISS astronauts frequently have modern laptops on board, and they don't seem to suffer any problems. Granted, they're not doing anything mission critical on them, and they've got some shielding provided by the station/shuttle itself. But apparently they do work just fine.



Well, that's a "maybe." The temps of processors are rapidly increasing to the point where water cooling is going to be a necessity within a few years if they don't find a way to reduce the heat generated (I've seen a news report which claimed that within five years they'll produce as much heat as a nuclear reactor), so if you've got something like space, which is a huge heat sink that will slowly suck the heat away from you, it might be just as economical to stick the PC on the outside of the spacecraft.


Actually, I believe they already are. Not quite to the temp of rocket exhaust yet, though. Note that we're talking heat per unit area here, not absolute heat. A typical high end desktop CPU consumes between 75 and 100 watts at most, usually a bit less. Obviously there's no way you can approach the heat of a nuclear reactor on that little power in terms of absolute output.

Demigrog
2004-May-26, 06:43 PM
Not true. Shuttle and ISS astronauts frequently have modern laptops on board, and they don't seem to suffer any problems. Granted, they're not doing anything mission critical on them, and they've got some shielding provided by the station/shuttle itself. But apparently they do work just fine.

One of the commercial companies (TransOrbital I think) trying to send a craft to lunar orbit is including a normal HP PocketPC. The press release is vague, but they may actually be using it for the main CPU.

NASA is a bit more picky about their hardware reliability in mission critical apps; the RAD6000 they are using on most projects these days is impressively radiation hardened: http://parts.jpl.nasa.gov/mrqw/mrqw_presentations/Keynote1_haddad.pdf




Well, that's a "maybe." The temps of processors are rapidly increasing to the point where water cooling is going to be a necessity within a few years if they don't find a way to reduce the heat generated (I've seen a news report which claimed that within five years they'll produce as much heat as a nuclear reactor), so if you've got something like space, which is a huge heat sink that will slowly suck the heat away from you, it might be just as economical to stick the PC on the outside of the spacecraft.


Actually, I believe they already are. Not quite to the temp of rocket exhaust yet, though. Note that we're talking heat per unit area here, not absolute heat. A typical high end desktop CPU consumes between 75 and 100 watts at most, usually a bit less. Obviously there's no way you can approach the heat of a nuclear reactor on that little power in terms of absolute output.

We use water cooling frequently for industrial computers, particularly when we have an 80C operating temperature requirement. :) We also use heat pipes in some situations.

As for comparing heat output to a nuclear reactor... maybe the article was comparing to an RTG, not a reactor. Otherwise, you'd need a nuclear reactor to power your computer. :)

parejkoj
2004-May-26, 06:52 PM
Also, rad-hardened processors aren't nearly as dense as commercial processors, so their heat output is much lower. Radiation is a much bigger worry with space bound equipment, not heating! A P4 wouldn't last more than a minute or two in space, and it would probably start getting single-event-upsets and data corruption within seconds.

Not true. Shuttle and ISS astronauts frequently have modern laptops on board, and they don't seem to suffer any problems. Granted, they're not doing anything mission critical on them, and they've got some shielding provided by the station/shuttle itself. But apparently they do work just fine.


Hmm... Inside the Shuttle or ISS would be fairly safe, but you certainly wouldn't want to use it for anything mission critical. I remember that they had a bunch of laptops on the ISS (IBM, I think...), though those two craft stay in orbits low enough to be protected by Earth's magnetosphere most of the time. I think they might also replace them every several months or something, but I'm not sure. I wonder what their rates of single-event upsets are.... Do they release that data? I bet the laptop manufacturers would love to have it. :-k

I guess I was thinking more along the lines of just putting the P4 out there, as was suggested above (for cooling). Unless you use some long heat pipes, it will be close enough to the "outside" to get fried.

zebo-the-fat
2004-May-26, 09:49 PM
The Hubble is running a 486, I don't know what anything else is running.

One reason for using an "old" 486 is that it is less likely to be damaged by radiation than a modern P4 because the spacing of the circuitry on the chip is much wider.

JustAGuy
2004-May-26, 09:55 PM
The Hubble is running a 486, I don't know what anything else is running.

I believe the MERs are running a hardened version of an early PowerPC chip, but I'm not 100% sure on the model #.

Tuckerfan
2004-May-26, 09:59 PM
The Hubble is running a 486, I don't know what anything else is running.

One reason for using an "old" 486 is that it is less likely to be damaged by radiation than a modern P4 because the spacing of the circuitry on the chip is much wider.

Another reason is (and the only official version I've heard, not that I doubt what you're saying) that in order to get the thing ready in time for launch, they had to "freeze" the design at some point, and when they did, the 486 was the processor then available. Also, NASA has been buying a lot of old and obsolete gear off of eBay simply because it's a known quantity. If you've got a mission critical system, you want hardware with as long a track record as possible. That way, if it fails, you have a shorter list of possible reasons for the failure, plus how to correct or compensate for them, and you're not wasting valuable time trying to figure out what died, why, and how to fix it.

Demigrog, it was a standard press service story, not a technical brief, so I've no idea what type of reactor the writer really meant.

swansont
2004-May-26, 10:24 PM
Another reason is (and the only official version I've heard, not that I doubt what you're saying) that in order to get the thing ready in time for launch, they had to "freeze" the design at some point, and when they did, the 486 was the processor then available.

From what I've heard there's also a tendency for space-qualified hardware to lag the rest of the field, simply because of the work and time required to space-qualify something.

Andreas
2004-May-26, 11:55 PM
What you need is the Stefan-Boltzmann law.

The power radiated per unit surface area for a surface with emissivity, e, at a temperature T, is given by,

Power=Sigma*e*T^4 ,

where Sigma is Stefan's constant, 5.67x10^-8 W m^-2 K^-4

Now the difference between the power radiated, and the power absorbed from the CMB at 2.7 K, is the power that you're generating in your CPU that you want to dissipate.

If we assume that the radiator is a blackbody (e=1), and that the steady state temperature of the thing is 90 degC (it would probably be a bit colder than that, but hey, we're doing "back of the envelope" here ;) ), and it is dissipating 100 W into space, then it would need to have a radiating area of

~0.1 m^2
Thanks, nice, but you'd be burning your CPU then. :wink: All the heat is generated on the tiny die of the processor and has to be transported off before it can build up. That means the radiator would have to be kept much cooler than the hot spot, so that the temperature gradient is steep enough given the material's thermal conductivity. That isn't as easy to plug into a formula since we don't have a uniform temperature.

And then Tuckerfan asked if it helps for serious overclocking, so actually we would want to keep the processor below normal operating temperature (like, say, below freezing). All this would increase the size of the radiating area, but I'm too lazy to guesstimate now.
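
To put a rough number on that, here's a minimal 1-D conduction sketch (Fourier's law) with made-up but plausible dimensions for a solid copper path from the die to an external radiator; the point is only that even copper needs a sizeable temperature drop to carry ~100 W over any real distance.

# Rough 1-D conduction estimate, Q = k * A * dT / L, with assumed
# dimensions (the 5 cm length and 2 cm x 2 cm cross-section are guesses,
# not figures from the thread).

P = 100.0          # heat carried by the path, W
k_copper = 400.0   # thermal conductivity of copper, W/(m*K)
L = 0.05           # path length from die to radiator, m (assumed)
A = 4e-4           # cross-section of the path, m^2 (2 cm x 2 cm, assumed)

delta_T = P * L / (k_copper * A)
print("temperature drop along the copper path: %.0f K" % delta_T)  # ~31 K

So with those assumptions the radiator would indeed have to sit a few tens of degrees cooler than the die, before you even start talking about overcooling it for overclocking.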

ToSeek
2004-May-27, 12:18 AM
The Hubble is running a 486, I don't know what anything else is running.

I believe the MERs are running a hardened version of early PowerPC chip, but I'm not 100% on the model #.

The MERs use RAD6000s (http://www.iews.na.baesystems.com/space/rad6000/rad6000_sbsc.html), a rad-hardened version of the RS/6000, a predecessor chip (it says) to the PowerPC series. The RAD6000s and RAD750s are used a lot in spacecraft.

Tuckerfan
2004-May-27, 12:33 AM
Here's the site I was looking for on the LN2-cooled PC. (http://www6.tomshardware.com/cpu/20031230/) They got a P4 up to 5 GHz, and it was a pretty intensive operation. They had parts of the heatsink custom made, and spent a lot of time selecting components and adjusting them. No screencaps of the onboard temp readings that I could find, however.

Fortis
2004-May-27, 01:01 AM
What you need is the Stefan-Boltzmann law.

The power radiated per unit surface area for a surface with emissivity, e, at a temperature T, is given by,

Power=Sigma*e*T^4 ,

where Sigma is Stefan's constant, 5.67x10^-8 W m^-2 K^-4

Now the difference between the power radiated, and the power absorbed from the CMB at 2.7 K, is the power that you're generating in your CPU that you want to dissipate.

If we assume that the radiator is a blackbody (e=1), and that the steady state temperature of the thing is 90 degC (it would probably be a bit colder than that, but hey, we're doing "back of the envelope" here ;) ), and it is dissipating 100 W into space, then it would need to have a radiating area of

~0.1 m^2
Thanks, nice, but you'd be burning your CPU then. :wink: All the heat is generated on the tiny die of the processor and has to be transported off before it can build up. That means the radiator would have to be kept much cooler than the hot spot, so that the temperature gradient is steep enough given the material's thermal conductivity. That isn't as easy to plug into a formula since we don't have a uniform temperature.
Ah ha! You see, I cunningly included a "cop-out" clause: "it would probably be a bit colder than that, but hey, we're doing 'back of the envelope' here." ;)

It gives a nice order-of-magnitude estimate though. (I wouldn't want to model the thermal conduction process out of the chip and into the radiator without a more detailed physical model.) :)

In reality there would also be an additional complexity. I've assumed that the radiator is a blackbody. If you know that the radiator is going to catch a glimpse of the sun (a bad thing for a blackbody, but depending on the overall design of your space-based, overclocked PC, it may happen), you can break the rule that the emissivity and absorptivity are one and the same. This sounds a bit naughty but is just a realisation that the emissivity is really a spectral quantity and is wavelength dependent. The Sun radiates predominantly at the shorter wavelength end of the spectrum, whereas a room temperature blackbody radiates mostly in the vicinity of ~10 microns. What you can do, then, is coat your radiator so that it has an emissivity of one in the region around 10 microns, but an emissivity of zero (i.e. totally reflective) in the vicinity of the peak in the solar spectrum.

In a way this is vaguely analogous (though only vaguely) to how the greenhouse effect works. The atmosphere is transparent in the visible region around the solar peak, so the earth's surface heats up. Unfortunately, atmospheric "greenhouse gases" are fairly opaque in the region that a room temperature earth tends to emit in, so we keep the heat in...
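
A minimal sketch of why that selective coating matters, in Python: the equilibrium temperature of a sunlit plate balancing absorbed sunlight against thermal emission from the same face. The alpha and eps values are generic textbook-style numbers, not anything from the thread or a real satellite coating.

# Equilibrium temperature of a sunlit flat plate: absorbed solar flux
# alpha*S balances thermal emission eps*sigma*T^4 from the same face.
# alpha = solar absorptivity, eps = infrared emissivity (assumed values).

SIGMA = 5.67e-8    # Stefan's constant, W m^-2 K^-4
S = 1361.0         # solar constant near Earth, W/m^2

def equilibrium_temp(alpha, eps):
    """Solve alpha*S = eps*SIGMA*T**4 for T (in kelvin)."""
    return (alpha * S / (eps * SIGMA)) ** 0.25

print(equilibrium_temp(1.0, 1.0))   # plain blackbody: ~394 K (~121 degC)
print(equilibrium_temp(0.1, 0.9))   # solar-reflective, IR-black coating: ~227 K (~-46 degC)

In other words, under this simple single-face balance a plain blackbody radiator in full sun ends up hotter than the CPU you're trying to cool, while a coating that reflects sunlight but still emits well around 10 microns stays comfortably cold.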

Avatar28
2004-May-27, 04:42 PM
We use water cooling frequently for industrial computers, particularly when we have an 80C operating temperature requirement. :) We also use heat pipes in some situations.

As for comparing heat output to a nuclear reactor... maybe the article was comparing to an RTG, not a reactor. Otherwise, you'd need a nuclear reactor to power your computer. :)

I use water cooling on my PC at home. :-) Heat pipe heatsinks are also getting pretty common and are even being used in some name brand PCs, especially on laptops.

I agree you'd need a reactor to power your computer if that were its heat output. But, again, what we're talking about is heat output per square centimeter. I REALLY wish I could find the chart, but because of power leakage as CPU speeds scale up, the P4 at the high end now outputs more heat per square centimeter than a nuclear reactor, though it's still a ways off from rocket nozzles.
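
For what it's worth, here's the per-area arithmetic behind that comparison, with an assumed ballpark die size (the 130 mm^2 figure is a guess, not something from the chart):

# Heat flux of a high-end P4, using the ~100 W figure quoted above and an
# assumed die area of about 130 mm^2.

P = 100.0              # W, top end of the power range quoted above
die_area_mm2 = 130.0   # mm^2, assumed ballpark die size
die_area_cm2 = die_area_mm2 / 100.0

print("die heat flux: %.0f W/cm^2" % (P / die_area_cm2))  # ~77 W/cm^2

That's the same order of magnitude as the average core heat flux figures usually quoted for power reactors (a few tens of W/cm^2), which is presumably the comparison the chart was drawing.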