Statistical Entropy: How Can Entropy Be Zero in View of the Second Law?

In summary, the concept of entropy is often misunderstood when it comes to knowledge and certainty. Knowing the state of a system does not decrease its entropy, because entropy is determined by the number of microstates available to the system, not by what an observer knows. Even with perfect knowledge of a system, its entropy remains the same. The example of compressing a gas in a container illustrates the distinction: the entropy decreases because the volume decreases, whereas knowing the exact location of each molecule does not change the number of possible configurations. Macroscopic properties such as temperature, pressure, and volume cannot single out which microstate the system occupies.
  • #1
touqra
Suppose I have a face-down card bearing one of the 26 letters of the English alphabet, but I don't know which one. Hence, the entropy is k ln 26.
But if I were to turn the card over, I would know exactly which letter it is. Hence, the entropy would now be zero.
How can this be, in view of the second law?
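A minimal numerical sketch of the arithmetic behind this question, assuming the 26 letters are treated as 26 equally likely microstates (the values are purely illustrative):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

W = 26                          # equally likely "microstates" (letters) while the card is face down
S_before = k_B * math.log(W)    # S = k ln W
S_after  = k_B * math.log(1)    # one state known with certainty: k ln 1 = 0

print(f"S before revealing: {S_before:.3e} J/K")  # ~4.5e-23 J/K
print(f"S after revealing:  {S_after:.3e} J/K")   # 0
```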
 
  • #2
What in the components of a physical system corresponds to the "opening" and "closing" of a card that brings uncertainty or certainty to the observer?
The entropy you are talking about depends on the definition, so you cannot frame it that way. If you are talking about probability, then that is 1/26.
 
  • #3
Isn't entropy used in the context of a system? I would say the entropy is still the same, because there are still 26 possible states in the system. Turning a card face-up doesn't change that.
 
  • #4
What does having a card face-down on a table have to do with entropy?
 
  • #5
The entropy defined in statistical mechanics is k ln W, where W is the number of possible microstates for a given energy E. That's the definition of entropy I am using.
I still think my question is valid.
Let's not use cards. Use particles instead.
Suppose I have some particles in a box with total energy E. Initially, the system has an entropy of k ln W. Now, if I were to perform an experiment to determine which microstate the system is in, then, since I would know the microstate with certainty, the entropy would become zero.
 
  • #6
Entropy has nothing to do with your knowledge of the system. W is the number of microstates available to the system under the given constraints. Your knowledge doesn't change that number; it's totally unrelated.
 
  • #7
touqra said:
The entropy defined in statistical mechanics is k ln W, where W is the number of possible microstates for a given energy E. That's the definition of entropy I am using.
I still think my question is valid.
You are using the wrong interpretation of a microstate in the card analogy.

Let's not use cards. Use particles instead.
Suppose I have some particles in a box with total energy E. Initially, the system has an entropy of k ln W. Now, if I were to perform an experiment to determine which microstate the system is in, then, since I would know the microstate with certainty, the entropy would become zero.
No, it does not. Your knowing which microstate the system is in (which would require measuring all the positions and conjugate momenta simultaneously) does not reduce the phase space of states the system can sample at the given temperature. And it is this number that is relevant to the entropy.
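A minimal sketch of why measurement does not change W, assuming a toy model of N two-level particles with fixed total energy (the model and numbers are illustrative, not from the thread):

```python
from math import comb, log

k_B = 1.380649e-23  # Boltzmann constant, J/K

# Toy model (assumed for illustration): N two-level particles, each with
# energy 0 or eps. Fixing the total energy E = n*eps fixes how many
# particles are excited, but not which ones.
N, n = 100, 30                 # 30 of 100 particles excited
W = comb(N, n)                 # microstates consistent with this energy
S = k_B * log(W)               # S = k ln W

print(f"W = {W:.3e} microstates, S = {S:.3e} J/K")
# A measurement revealing *which* 30 particles are excited singles out one
# microstate, but all comb(N, n) microstates remain accessible to the
# system, so W, and hence S, is unchanged.
```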
 
  • #8
Gokul43201 said:
You are using the wrong interpretation of a microstate in the card analogy.

I used the wrong interpretation of microstates? How?
 
  • #9
You are assuming that entropy is the probability of knowing the state of an object, or more properly of knowing a system's properties and how correct that knowledge is. It isn't. Knowing the state of a system has nothing to do with its entropy. On that mistaken view, a low probability of the system being near the calculated state would imply high entropy, and conversely, knowing the state almost perfectly would make the entropy very small. But even if we know exactly how a system exists, or all of its properties, its entropy does not decrease; it remains as it is. The physical meaning is that the theory cannot predict perfectly which microstate a system occupies from macroscopic properties like T, P, and V, and the number of ways the system can exist does not decrease just because you know which one is realized.
Example: fill a container with some gas. By compressing it, decreasing the container's volume and increasing the pressure, you decrease the entropy. There is less volume but the number of molecules has remained the same, so the number of possible arrangements is smaller. But if you know the exact map of where every gas molecule is in the container, that does not change the number of configurations in which the gas can fill the container.
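A minimal sketch of this compression example, assuming one mole of ideal gas compressed isothermally and reversibly to half its volume (values are illustrative):

```python
from math import log

R = 8.314                  # gas constant, J/(mol K)
n_mol = 1.0                # amount of gas, mol (illustrative)
V1, V2 = 2.0e-3, 1.0e-3    # initial and final volume, m^3 (halved)

dS_gas = n_mol * R * log(V2 / V1)   # dS = nR ln(V2/V1), negative on compression
print(f"Entropy change of the gas: {dS_gas:.2f} J/K")  # about -5.76 J/K

# The decrease comes from the smaller volume (fewer spatial configurations),
# not from knowing where the molecules are; the surroundings gain at least
# as much entropy, so the second law is not violated.
```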
 

Related to Statistical Entropy: How Can Entropy Be Zero in View of the Second Law?

1. What is statistical entropy?

Statistical entropy is a measure of the disorder or randomness in a system. It is commonly used in physics and information theory to quantify the amount of uncertainty or lack of information in a system.

2. How is entropy related to the Second Law of Thermodynamics?

The Second Law of Thermodynamics states that the total entropy of a closed system will always increase over time, or at best remain constant. This is because energy tends to disperse and become more evenly distributed, resulting in an increase in disorder or entropy.

3. How can there be zero entropy if the Second Law states that it will always increase?

While the Second Law does state that the total entropy of a closed system cannot decrease over time, it does not forbid a system from having zero entropy. A system whose macrostate is compatible with exactly one microstate has W = 1, so S = k ln 1 = 0; the standard example is a perfect crystal at absolute zero, which is the content of the Third Law of Thermodynamics.

4. Can entropy ever decrease in a system?

The total entropy of a closed system cannot decrease: the Second Law of Thermodynamics dictates that it will either increase or remain constant. Entropy can decrease in a local region or subsystem, but this is always accompanied by an at least equal increase in entropy elsewhere in the system.

5. How is statistical entropy different from thermodynamic entropy?

Thermodynamic entropy is a macroscopic quantity defined through heat transfer in a system, while statistical entropy is a microscopic measure that counts the number of possible microstates of a system. The two descriptions agree, and both can be used to quantify the overall disorder or randomness in a system.
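A minimal sketch of the connection, assuming the Gibbs form S = -k Σ p ln p, which reduces to the Boltzmann form k ln W when all W microstates are equally likely:

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def gibbs_entropy(probs):
    """Statistical (Gibbs) entropy: S = -k * sum(p * ln p)."""
    return -k_B * sum(p * math.log(p) for p in probs if p > 0)

W = 26
uniform = [1.0 / W] * W
print(gibbs_entropy(uniform))   # -k * W * (1/W) * ln(1/W) = k ln W
print(k_B * math.log(W))        # Boltzmann form; same value
```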
