Entropy...



Durakken
2009-Jun-12, 05:42 AM
I don't fully get entropy, but from what I understand the gist is that energy will eventually come to a stable spread, which means that over time, while energy isn't disappearing, it is simply stabilizing? Or something to that effect?

So once all the energy in the universe is stable, it will be impossible to use that energy, or something like that...

I don't really care about that part, but that, along with my last thread "the end of the universe" and membrane theory, gave me the thought...

What if, as the universe is expanding, we are running into other membranes that impart energy to our universe? What would happen then? Is there a way to disprove this? Could this not be happening outside our visible range?

PraedSt
2009-Jun-12, 10:36 AM
What if, as the universe is expanding, we are running into other membranes that impart energy to our universe? What would happen then? Is there a way to disprove this? Could this not be happening outside our visible range?
If you regard our universe as an open system, then its entropy can be lowered through contact with another universe. Of course, the entropy of the other universe would increase by a larger amount, so that the sum of the two would increase. Er...assuming our rules hold over there. :)
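
A rough worked example, with made-up numbers and assuming ordinary thermodynamics applies across the boundary: if a bit of heat Q = 100 J leaks out of our universe at a temperature of 300 K into a colder neighbour at 100 K, our entropy changes by -Q/T = -100/300 ≈ -0.33 J/K, while the neighbour's changes by +100/100 = +1 J/K. Ours drops, but the sum, about +0.67 J/K, still goes up.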

What would happen then? That depends on what's being transferred. What sort of energy?

Is there a way to disprove this? I have no idea. I presume someone would have to make a falsifiable prediction; then we could see.

Could this not be happening outside our visible range? I have no idea, sorry. Might be, might not be.

GOURDHEAD
2009-Jun-12, 12:45 PM
If I remember correctly, the precise definition of entropy limits its application to a closed, finite system. It is not clear that the universe, however many mutually exclusive domains it may include, is such a system. As currently perceived, the universe is expanding at an accelerating rate, but we can't be sure this is a permanent condition. The force of gravity, to a limited extent, opposes the thermodynamic forces that grow during proto-stellar cloud collapse. These thermodynamic forces, along with the conservation of angular momentum, tend to propel the constituents of the proto-stellar cloud away from the collapsing center. Supernovae are culminating examples of this thermodynamic dispersion.

In at least one case, the flow of energy caused by these opposing forces has generated a biosphere which has produced technically competent critters who are driven to live and have their being. If these or other like driven critters learn to manage the forces of the universe to the extent of unravelling black holes and configuring galactic groups to their liking, they may achieve "managed entropy" (Kardashev type VI civilization?). Think in terms of an extrapolation from Asimov's Foundation series of novels.

Ken G
2009-Jun-12, 04:00 PM
Entropy is actually a much easier concept than you might think. The way it works is, first we take a system that can have many (many many) different configurations, and we start grouping those configurations together into subsets that we are going to treat as having no need to distinguish. By that I mean, if the differences don't matter to us for some particular application, we will call them not distinguished. (An example is, the location of all the molecules in the air around you-- you don't care where they are in detail, you only care that some be inside your lungs and some be outside, and you don't care how fast each one is moving, only that they not be moving so fast on average that you feel hot, or so slow that you feel cold.) Note that we always distinguish configurations that have different energy, because we find it very useful to track what energy is doing.

So, where then does entropy come in? Well, once we've decided what configurations we are not distinguishing, we have general classes of configurations. We need to characterize their energy, because we may know how much total energy the system should have, and then we can limit our thinking to the groups whose member configurations have that energy. Now comes the easy part-- which group of configurations will the system actually exhibit? Most likely, it will exhibit whichever group contains the largest number of possible configurations. That's it, that's entropy-- entropy means counting the number of configurations in each group (and taking the natural log and multiplying by the Boltzmann constant k, if you want the details). Then the only "law" you need to say that entropy will increase is that systems will always evolve from less likely groups (i.e., groups with fewer "ways to happen", fewer configurations in the group) to more likely groups, and the more likely groups contain more configurations (more "members") so have a higher entropy.
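
To make the counting concrete, here's a minimal sketch in Python (my own illustration, not anything from the thread): N molecules that can each sit in the left or right half of a box, where the only thing we distinguish is how many are on the left. The group of configurations near the 50/50 split is vastly bigger than the lopsided groups, so it has the highest entropy and is the one you actually see.

import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def entropy(n_total, n_left):
    # Count the configurations ("ways to happen") in the group where exactly
    # n_left of the n_total molecules are in the left half, then S = k * ln(omega).
    omega = math.comb(n_total, n_left)
    return k_B * math.log(omega), omega

N = 100
for n_left in (0, 10, 25, 50):
    S, omega = entropy(N, n_left)
    print(f"{n_left:3d} of {N} on the left: {omega:.3e} configurations, S = {S:.2e} J/K")

For N = 100 the 50/50 group has about 1e29 configurations versus exactly 1 for the everything-on-one-side group, which is the second law in miniature.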

Put like that, the second law of thermodynamics is quite trivial: it is the statement that more likely things happen and less likely things don't. The last piece to put in place is that we are talking about groups of spectacularly many possible configurations, so the most likely thing pretty much always happens. It's like if you went to Las Vegas, and placed one bet on the roulette wheel, you might win or you might lose and no one can predict it. But if you stay for a year and play roulette constantly, your staggering losses will be highly predictable, with very little uncertainty.
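
To put the Las Vegas point in numbers, here's a quick throwaway simulation (again just my own illustration, using an American wheel where an even-money bet wins with probability 18/38):

import random

def roulette_net(n_spins, bet=1.0, seed=0):
    # Net winnings from betting on red every spin: 18 red, 18 black,
    # and 2 green pockets, so each spin wins with probability 18/38.
    rng = random.Random(seed)
    net = 0.0
    for _ in range(n_spins):
        net += bet if rng.random() < 18 / 38 else -bet
    return net

for spins in (1, 100, 100_000):
    print(f"{spins:7,d} spins: net = {roulette_net(spins):+10.1f}")

One spin is essentially a coin toss, but after 100,000 spins the result clings to the expected loss of about 2/38 of everything wagered, roughly 5,300 units here, with comparatively little scatter.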

So the upshot of all this is, entropy can be defined for an open system, it's just that you don't know it has to increase for an open system, because it might be in contact with some other system which has so many ways to happen that it is more likely for the entropy of one system to drop if the other can rise. The meaning of "spontaneous reactions" is that the total entropy has to rise, but it's a good thing for living beings that it is not necessary for the entropy of every subsystem to rise. We can actualize the very unlikely processes that life needs to happen if we simply connect ourselves to other systems that are very likely to find new configurations that allow our system to have the configuration we want-- that's pretty much how all "complex macroscopic" systems work.

mugaliens
2009-Jun-12, 06:14 PM
Put simply, entropy refers to the tendency for any closed system to move from a state of order to disorder (randomness). This applies to all aspects of the system, whether physical order, differences in energy, or chemical distributions, and it extends all the way down to the elements themselves.

George
2009-Jun-12, 10:40 PM
It's like if you went to Las Vegas, and placed one bet on the roulette wheel, you might win or you might lose and no one can predict it. But if you stay for a year and play roulette constantly, your staggering losses will be highly predictable, with very little uncertainty.
Is it true that Vegas does little to encourage conferences in physics? :) [I have actually heard this, somewhere.]

mugaliens
2009-Jun-12, 10:56 PM
It's not only true, but the city has outlawed the unlicensed practice of statistics in places both public and casino!

Seriously, they really don't care. They do bend over backwards to host things like COMDEX, as those large conventions mean tons of money in the casinos' pockets. I'm sure that if there were an international physics convention, large or even quite small (a dozen people), many casinos would love to help your group find the best possible accommodations for your event.