View Full Version : entanglement

Copernicus

2018-Jun-21, 11:51 AM

Are pairs of entangled electrons entangled with another pair of entangled electrons? And those 4 electrons entangled with 4 more electrons, and so forth?

Ken G

2018-Jun-22, 12:23 PM

There are about 10^57 electrons in a white dwarf, and they are all entangled with each other-- that's what holds the white dwarf up. Contrast that with the 10^57 electrons in the Sun-- they are very weakly entangled, to the point of negligible entanglement. So the progress from a Sun to a white dwarf is a story of increasing entanglement (among other things).

George

2018-Jun-22, 01:05 PM

There are about 10^57 electrons in a white dwarf, and they are all entangled with each other-- that's what holds the white dwarf up. Contrast that with the 10^57 electrons in the Sun-- they are very weakly entangled, to the point of negligible entanglement. So the progress from a Sun to a white dwarf is a story of increasing entanglement (among other things).

That's interesting. This hints at an entropy issue for entanglement increase itself. Is this trivial or not (Y/N/maybe so)?

Grey

2018-Jun-22, 02:06 PM

It's also worthwhile noting that there's a certain sense in which any particle is entangled with any other particle that it has ever interacted with (or, really, that it ever will interact with). But generally, when talking about entangled particles, physicists usually mean a specific variety of entanglement, such that the measurement of a specific property of one of those particles (like spin or polarization) is correlated in a specific way with that of the other particle. In that particular case, it's not generally possible to have more than two particles entangled (well, there are some exceptions, but it's not trivial to arrange, and even then isn't quite what you might imagine for multiple particles being entangled), and if the entangled particles interact in a significant way with other particles, the particular correlation is broken.
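The "specific correlation" Grey describes can be illustrated with a toy sketch. This is a purely classical stand-in, not a quantum simulation: it reproduces only the perfect anti-correlation seen when both spins of a singlet pair are measured along the same axis, and nothing of the angle-dependent statistics that make entanglement more than classical correlation.

```python
import random

# Toy model: when both spins of a singlet pair are measured along the SAME
# axis, each individual outcome is 50/50, but the pair is perfectly
# anti-correlated -- finding one "up" guarantees the other is "down".
def measure_singlet_pair():
    a = random.choice(['up', 'down'])     # individually random...
    b = 'down' if a == 'up' else 'up'     # ...but jointly constrained
    return a, b

pairs = [measure_singlet_pair() for _ in range(10_000)]
assert all(a != b for a, b in pairs)      # never the same result
ups = sum(a == 'up' for a, _ in pairs)
print(f"particle A 'up' fraction: {ups / len(pairs):.2f}")  # close to 0.50
```

An interaction with a third particle would amount to replacing `b` with something that no longer depends deterministically on `a`, which is the sense in which the correlation is "broken."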

George

2018-Jun-22, 02:24 PM

It's also worthwhile noting that there's a certain sense in which any particle is entangled with any other particle that it has ever interacted with (or, really, that it ever will interact with). But generally, when talking about entangled particles, physicists usually mean a specific variety of entanglement, such that the measurement of a specific property of one of those particles (like spin or polarization) is correlated in a specific way with that of the other particle. In that particular case, it's not generally possible to have more than two particles entangled (well, there are some exceptions, but it's not trivial to arrange, and even then isn't quite what you might imagine for multiple particles being entangled), and if the entangled particles interact in a significant way with other particles, the particular correlation is broken.

So entanglement comes in "flavors"? I know I'm using excessive hyperbole, but would something as simple as my hitting an 8-ball into the corner pocket (ahead of schedule, typically) be the first type of entanglement interaction? And would the other flavor be a reaction to its entangled partner's behavior, unconstrained by the speed of light?

Shaula

2018-Jun-22, 04:21 PM

In that particular case, it's not generally possible to have more than two particles entangled (well, there are some exceptions, but it's not trivial to arrange, and even then isn't quite what you might imagine for multiple particles being entangled), and if the entangled particles interact in a significant way with other particles, the particular correlation is broken.

The record is, I think, ten pairs of photons entangled. And that took a lot of experimental finesse to arrange.

Copernicus

2018-Jun-22, 04:42 PM

There are about 10^57 electrons in a white dwarf, and they are all entangled with each other-- that's what holds the white dwarf up. Contrast that with the 10^57 electrons in the Sun-- they are very weakly entangled, to the point of negligible entanglement. So the progress from a Sun to a white dwarf is a story of increasing entanglement (among other things).

Does a white dwarf have more or less entropy than a sun like ours? Is there an approximate value of entropy for both systems?

Ken G

2018-Jun-23, 05:31 PM

Does a white dwarf have more or less entropy than a sun like ours?

Much, much less.

Is there an approximate value of entropy for both systems?

Eventually (after infinite time, really) the white dwarf becomes a black dwarf, which is in its ground state, so has an entropy of zero (I'm neglecting the entropy of the rising gravity; that should be included in there somewhere but isn't at issue for the question here). Since this happens spontaneously, it means the entropy of deep space rises even more, due to the starlight. Changes in entropy are the heat lost divided by the prevailing temperature, so heat exchange from a star to the CMB will always result in an entropy rise. This is also why heat goes from hot to cool and not the other way around.
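That entropy bookkeeping can be sketched with the Sun's present numbers. The luminosity and photospheric temperature below are standard values; treating the photosphere as the temperature at which the heat leaves is a simplification.

```python
# Entropy budget of heat radiated by a star into the CMB:
# dS = Q / T for each reservoir, so the star loses Q/T_star while deep
# space gains Q/T_cmb, and T_star >> T_cmb guarantees a net rise.
L_SUN = 3.828e26      # W, heat radiated per second
T_STAR = 5772.0       # K, solar photosphere (where the heat leaves)
T_CMB = 2.725         # K, cosmic microwave background

Q = L_SUN * 1.0                      # heat radiated in one second (J)
dS_star = -Q / T_STAR                # star's entropy drops
dS_cmb = +Q / T_CMB                  # deep space gains far more
dS_total = dS_star + dS_cmb

print(f"star:  {dS_star:+.3e} J/K per second")
print(f"CMB:   {dS_cmb:+.3e} J/K per second")
print(f"total: {dS_total:+.3e} J/K per second")   # positive: 2nd law holds
```

Because the ratio T_STAR / T_CMB is over 2000, the gain by the cold side dwarfs the loss by the hot side, which is the quantitative content of "heat goes from hot to cool."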

Ken G

2018-Jun-23, 05:59 PM

That's interesting. This hints at an entropy issue for entanglement increase itself. Is this trivial or not (Y/N/maybe so)?

Not trivial-- entropy is an excellent entry point for thinking about entanglement and the difference between quantum and classical systems. It gets complicated quickly, but the basic point is that classical systems should exhibit additivity of their entropy: if you have system A with entropy S(A), and a separate system B with entropy S(B), classical thinking gives us that the entropy of the combined system is S(A+B) = S(A) + S(B). Basically, if you need to answer nA yes/no questions to completely characterize system A, and nB for B, then classically you should need the sum of that many questions to completely characterize the combined system. This is true even if the two systems have a history of mutual interaction. But this is not the case in quantum mechanics: whenever there is entanglement, the entropy of the combined system is less than the sum of the entropies of its parts. The whole is less than the sum of its parts, if you will. This is because quantum systems really aren't made of parts; they are holistic systems that can only be regarded as comprised of separate parts if they are not entangled.

This is true regardless of the number of particles, but it is certainly an easier concept when there are two particles. For example, the ground state of a hydrogen atom cannot be said to be comprised of a proton and an electron, because of the implication that the proton and electron would then have their own states. In particular, it would be wrong to say you can make a ground-state hydrogen atom by taking an electron with spin up and binding it to a proton with spin down; the system I just described would not be the ground state of hydrogen. Instead, the spin state of both particles is indeterminate, and the system is only in a definite state when you consider the proton and electron together as a single system. So we can still say there is a proton and an electron "in there somewhere," but I would not say the ground-state hydrogen is "comprised of" a proton and an electron, because to me that language implies each of those particles has a state of its own, and the whole is a kind of combination of those two particle states. That's why the ground state is an entangled state: it has zero entropy, but the proton and electron that are "in there" each have a state of nonzero entropy, because they are in indeterminate spin states. So you must choose your poison-- either preserve the idea that you can combine parts into a whole, but lose the ability to add the entropy of the parts; or maintain additivity of entropy whenever parts are combined, but accept that you cannot regard the ground-state hydrogen atom as a combination of parts. Which do you like more, parts or adding entropies? Must choose.
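The claim that the whole can have less entropy than its parts can be checked directly with a few lines of linear algebra. Here a two-spin singlet state stands in for the spin sector of ground-state hydrogen, and entropies are in nats (setting k = 1).

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho ln rho); zero eigenvalues contribute nothing."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]
    return float(-np.sum(evals * np.log(evals)))

# Singlet state (|up,down> - |down,up>) / sqrt(2) of two spin-1/2 particles.
psi = np.array([0.0, 1.0, -1.0, 0.0]) / np.sqrt(2.0)
rho_total = np.outer(psi, psi)       # density matrix of the whole (pure state)

# Reduced density matrix of one spin: trace out its partner.
rho_one = np.einsum('ijkj->ik', rho_total.reshape(2, 2, 2, 2))

S_whole = von_neumann_entropy(rho_total)  # 0: the whole is in a definite state
S_part = von_neumann_entropy(rho_one)     # ln 2: each part is fully indeterminate
print(S_whole, S_part)
```

So S(A+B) = 0 while S(A) + S(B) = 2 ln 2: the entangled whole has strictly less entropy than the sum of its parts, exactly the failure of additivity described above.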

Copernicus

2018-Jun-24, 06:37 PM

Is entropy relative? Can one be in a system where it looks like entropy is ginormous, while someone else looks at a contained system and sees that system has zero entropy?

Ken G

2018-Jun-24, 09:55 PM

Is entropy relative? Can one be in a system where it looks like entropy is ginormous, while someone else looks at a contained system and sees that system has zero entropy?

Yes, and a perfect example is 100 coins that have just been flipped but are under a sheet. The entropy is k times the natural log of 2^100, since you would need to ask 100 yes/no questions to know the exact configuration of all the coins. Or just lift the sheet-- and see what the configuration was "all along." Was it that way before you lifted the sheet, or did lifting the sheet resolve the entropy to being zero? The distinction is unnecessary to make in physics, where we always have to deal with what we know, never "what is."
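The coin example is a direct application of S = k ln Ω with Ω = 2^100 equally likely configurations; a quick check of the numbers:

```python
import math

k_B = 1.380649e-23                   # Boltzmann constant, J/K

# 100 flipped coins under the sheet: 2**100 equally likely configurations.
omega = 2 ** 100
S_hidden = k_B * math.log(omega)     # S = k ln(Omega) = 100 * k * ln 2
questions = math.log2(omega)         # bits of missing information

print(f"S = {S_hidden:.3e} J/K")     # about 9.6e-22 J/K
print(f"{questions:.0f} yes/no questions")

# Lift the sheet: one known configuration, Omega = 1, so S = k ln 1 = 0.
print(k_B * math.log(1))             # 0.0
```

Note how tiny the entropy is in thermodynamic units: informational entropy only becomes thermodynamically significant when the number of binary questions is of order Avogadro's number.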

So entropy is relative when it is framed in terms of the information of the person using it. In classical systems, we can say that every system is in exactly one state, so all systems have zero entropy in some kind of "absolute" sense. But when we do thermodynamics, we choose to lump together huge (and I do mean huge) classes of systems into a single bin, and we say that we are not interested in any of the differences within that bin, only differences with other bins and whether or not our system is in one bin or another. Then we are accepting that we are missing a vast amount of information, and the entropy is related to how much missing information is in the "bin" that our system is in. So entropy is very much a tool of the user. However, once you decide how you are making your bins and what information you are choosing not to care about, then the entropy becomes absolute-- it is absolute to the set of choices you are making. In particular, a state of zero entropy is a "bin" with only one state in it, and so the "zero" of the entropy scale is also absolute. What's more, it takes an infinite time (or an infinite number of events) for a system to evolve with certainty into such a bin of just one state, so it is impossible to do an exact measurement.

This carries over into quantum mechanics, except there it is less clear that every system is "actually" in a pure state of zero entropy. There is such a thing as a "mixed" state, and it is unclear if a mixed state is what it is purely because of missing information on our part, or if it is possible that a system "really is" in a mixed state of nonzero entropy. But either way, we never actually do physics on the states that "really are," we do physics on our information, so the distinction does not have any ramifications, and we end up doing quantum statistical mechanics the same way we do it classically-- we create bins of information that we are not distinguishing, which involve the probabilities of the various measurable outcomes. We don't care to specify the exact state of our measuring device, only that it has registered a given outcome. If one wanted to track the increase in entropy in all interactions, one would need to account for the entropy of the measuring device, which would not be fun. Entropy is not always an easy concept to use, but at some level, we could safely say that everything that happens, happens because it increases entropy-- that is, it moves systems from smaller bins of possible states to larger bins that are therefore more likely. You can say everything that happens is thus random, or you could say that everything happens for a reason, if the reason is "because it was the larger bin."

Copernicus

2018-Jun-25, 09:45 PM

Yes, and a perfect example is 100 coins that have just been flipped but are under a sheet. The entropy is k times the natural log of 2^100, since you would need to ask 100 yes/no questions to know the exact configuration of all the coins. Or just lift the sheet-- and see what the configuration was "all along." Was it that way before you lifted the sheet, or did lifting the sheet resolve the entropy to being zero? The distinction is unnecessary to make in physics, where we always have to deal with what we know, never "what is."

So entropy is relative when it is framed in terms of the information of the person using it. In classical systems, we can say that every system is in exactly one state, so all systems have zero entropy in some kind of "absolute" sense. But when we do thermodynamics, we choose to lump together huge (and I do mean huge) classes of systems into a single bin, and we say that we are not interested in any of the differences within that bin, only differences with other bins and whether or not our system is in one bin or another. Then we are accepting that we are missing a vast amount of information, and the entropy is related to how much missing information is in the "bin" that our system is in. So entropy is very much a tool of the user. However, once you decide how you are making your bins and what information you are choosing not to care about, then the entropy becomes absolute-- it is absolute to the set of choices you are making. In particular, a state of zero entropy is a "bin" with only one state in it, and so the "zero" of the entropy scale is also absolute. What's more, it takes an infinite time (or an infinite number of events) for a system to evolve with certainty into such a bin of just one state, so it is impossible to do an exact measurement.

This carries over into quantum mechanics, except there it is less clear that every system is "actually" in a pure state of zero entropy. There is such a thing as a "mixed" state, and it is unclear if a mixed state is what it is purely because of missing information on our parts, or if it is possible that a system "really is" in a mixed state of nonzero entropy. But either way, we never actually do physics on the states that "really are," we do physics on our information, so the distinction does not have any ramifications and we end up doing quantum statistical mechanics the same way we do it classically-- we create bins of information that we are not distinguishing, which involve the probabilities of the various measurable outcomes. We don't care to specify the exact state of our measuring device, only that it has registered a given outcome. If one wanted to track the increase in entropy in all interactions, one would need to account for the entropy of the measuring device, which would not be fun. Entropy is not always an easy concept to use, but at some level, we could safely say that everything that happens happens because it increases entropy, that is, it moves systems from smaller bins of possible states to larger boxes that are therefore more likely. You can say everything that happens is thus random, or you could say that everything happens for a reason if the reason is "because it was the larger box."

Thanks KenG.

Are there stability or instability nodes of minimum or maximum entanglement and/or entropy?

Ken G

2018-Jun-26, 08:18 PM

Are there stability or instability nodes of minimum or maximum entanglement and/or entropy?

Maximum entanglement is minimum entropy, so I'm not sure what you mean by nodes, but a white dwarf is a great example of a huge and complicated system that is approaching maximum entanglement and minimum entropy. It is kind of like a giant molecule, and molecules also reach zero entropy as they reach their ground state. It's stable in the sense that a ground-state system will not leave its ground state without interacting with something else, but it's unstable in the sense that any interaction will always increase the entropy and give the system some chance of leaving its ground state.

George

2018-Jun-26, 09:27 PM

Thanks for all your explanation. I wouldn't have thought entanglement and white dwarfs could bring greater clarification to entropy, not that I'm fully clarified here, but I do plan to read this thread more slowly. :)

Maximum entanglement is minimum entropy, so I'm not sure what you mean by nodes but a white dwarf is a great example of a huge and complicated system that is approaching maximum entanglement and minimum entropy. It is kind of like a giant molecule, and molecules also reach zero entropy as they reach their ground state. It's stable in the sense that a ground-state system will not leave its ground state without interacting with something else, but it's unstable in the sense that any interaction will always increase the entropy and give the system some chance of leaving its ground state.

It is ironic to me that a star gets old, puffs profusely, exposes its dead and extremely hot core only to reveal that it has very low entropy. I feel like I need to place the 2nd law on its surface and stomp on it a while to squeeze it in there somehow. :) Shouldn't the first guess go to a high entropy dead star? I'm getting older and I feel more and more entangled but how can I get my entropy down a little lower?

Ken G

2018-Jun-27, 03:11 AM

It is ironic to me that a star gets old, puffs profusely, exposes its dead and extremely hot core only to reveal that it has very low entropy.

The key thing is that to reach that state, it has to lose lots and lots of heat. That's the general way to make entropy drop-- lose heat.

I feel like I need to place the 2nd law on its surface and stomp on it a while to squeeze it in there somehow.

It's straightforward-- entropy change is heat transferred divided by temperature, so the thing losing the heat (the star) loses entropy, the thing gaining the heat (the cosmic background radiation) gains entropy, but since the heat comes from the high T and goes to the low T, that always ends up obeying the second law.

I'm getting older and I feel more and more entangled, but how can I get my entropy down a little lower?

Don't eat and go somewhere cold with no coat on!

George

2018-Jun-27, 01:49 PM

The key thing is that to reach that state, it has lose lots and lots of heat. That's the general way to make entropy drop-- lose heat.

It's straightforward-- entropy change is heat transferred divided by temperature, so the thing losing the heat (the star) loses entropy, the thing gaining the heat (the cosmic background radiation) gains entropy, but since the heat comes from the high T and goes to the low T, that always ends up obeying the second law.

Yes, it does make sense that a great deal of Q is blown off and the ratio of (delta Q)/T is very likely reduced as well. It's a little hard to simply take a 150,000 K object the size of Earth and think of it as having low entropy, but then, it was once 15 million kelvin, after all, I suppose. Your point about it dumping its heat into the ultimate heat sink (space) is very helpful, because it is how to see entropy reduction for a refrigerator.

Don't eat and go somewhere cold with no coat on!

Well, you've caught my dichotomy -- the only cold place is in front of my open refrigerator. ;)

Ken G

2018-Jun-28, 03:11 AM

Well, you've caught my dichotomy -- the only cold place is in front of my open refrigerator. ;)

Then I guess you better make sure the refrigerator is empty so you won't eat anything. Beware though-- the way to lose entropy the fastest is to die!

George

2018-Jun-28, 05:07 PM

Then I guess you better make sure the refrigerator is empty so you won't eat anything. Beware though-- the way to lose entropy the fastest is to die!

Ah, great point! I will now accelerate my entropy rate increase, and just in time for a hot lunch!

Powered by vBulletin® Version 4.2.3 Copyright © 2019 vBulletin Solutions, Inc. All rights reserved.