
Originally Posted by tashirosgt
Consider a communication channel that transmits a data stream of bits. Each "message" of N bits is generated according to a stochastic process. The random variable in question takes the possible messages as its values, and suppose we know the probability of each of those values. Then we can compute the entropy of that random variable, which allows us to say something about the information content of a message.
Take N = 6. It is not correct to say that a particular message, like (1,0,1,1,0,1), has more or less information than another message, like (1,1,1,1,1,1). In fact, suppose that the six bits are chosen according to the layman's idea of "randomly", i.e. suppose each bit is chosen independently of the others and that 0 and 1 are each chosen with probability 0.5. Knowing that the first five bits of a message are 1 does not make it any more or less probable that the last bit is 1.
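To make the N = 6 case concrete, here is a small Python sketch (mine, not part of the quoted post): with six independent fair bits, every particular message has probability 2^-6, so every message has the same self-information of 6 bits, and the entropy of the message variable is also 6 bits.

Code:
# Six independent fair bits: every message has probability 2**-6.
from math import log2
from itertools import product

messages = list(product([0, 1], repeat=6))        # all 64 possible messages
probs = {m: 0.5 ** 6 for m in messages}           # each equally likely

entropy = -sum(p * log2(p) for p in probs.values())
print(entropy)                                    # 6.0 bits for the message variable

# The two messages discussed above carry exactly the same self-information:
print(-log2(probs[(1, 0, 1, 1, 0, 1)]))           # 6.0 bits
print(-log2(probs[(1, 1, 1, 1, 1, 1)]))           # 6.0 bits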
Likewise it is not correct to say that the sequence (1,1,1,1,1,1) is more compressible than the sequence (1,0,1,1,0,1) unless you specify a method of compression. For example, a person might use a code where an "X" is transmitted to represent (1,0,1,1,0,1), and every other message is represented by transmitting a "Y" for each 0 and a "Z" for each 1. (The use of acronyms in common speech is a good example of how such conventions arise in a non-mathematical setting.)
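Here is a quick sketch of that toy code (the symbols X, Y, Z follow the post; the rest is my own). Under this code the "favoured" message compresses to a single symbol while (1,1,1,1,1,1) takes six, which is the opposite of the intuition that the all-ones string is the more compressible one:

Code:
from itertools import product

FAVOURED = (1, 0, 1, 1, 0, 1)

def encode(message):
    # "X" stands for the whole favoured message; any other message is sent
    # bit by bit, "Y" for 0 and "Z" for 1.
    if message == FAVOURED:
        return "X"
    return "".join("Z" if bit else "Y" for bit in message)

print(encode((1, 0, 1, 1, 0, 1)))   # 'X'      -> 1 symbol
print(encode((1, 1, 1, 1, 1, 1)))   # 'ZZZZZZ' -> 6 symbols

# Expected code length when all 64 messages are equally likely:
messages = list(product([0, 1], repeat=6))
print(sum(len(encode(m)) for m in messages) / len(messages))   # about 5.92 symbols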
To say that a completely black image has zero information conveys an idea, but it is not precise. Perhaps what is meant is that if you are dealing with a population of images where each pixel tends to be the same color as the other pixels, then the amount of information per pixel in the data stream is less than in a situation where the pixels are independent random variables. Or perhaps what is meant is that if we are dealing with a population of images, each of which is all black, then transmitting the image data conveys zero information.
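As a rough illustration (my own, under the simplifying assumption of binary pixels), the per-pixel entropy works out as described:

Code:
from math import log2

def entropy(probs):
    return -sum(p * log2(p) for p in probs if p > 0)

# Population 1: each pixel is independently black or white with probability 0.5.
print(entropy([0.5, 0.5]))   # 1.0 bit per pixel

# Population 2: every image is all black, so each pixel is black with probability 1.
print(entropy([1.0]))        # 0.0 bits -- transmitting the image conveys no information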
Thermodynamic entropy has a different definition from Shannon entropy. However, is there a way to analyze thermodynamic entropy as a Shannon entropy? I thought statistical physics had found a way to do this.
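As far as I know, the standard bridge is the Gibbs entropy of statistical mechanics, which has exactly the Shannon form over the microstate probabilities p_i:

S = -k_B \sum_i p_i \ln p_i

i.e. Shannon entropy computed in nats and scaled by Boltzmann's constant; when the W microstates are equally likely this reduces to Boltzmann's S = k_B \ln W.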