Clarification: S = k ln W and S = k ln Ω

  • Thread starter: morrobay
  • Tags: Ln Omega
In summary, the equations S = k ln W and S = k ln Ω express the same Boltzmann entropy formula with different choices of symbol: both W and Ω denote the multiplicity of the macrostate, not a probability. The formula applies directly to equilibrium (uniform-probability) macrostates; for a non-uniform distribution the more general Gibbs formula S = -k ∑ p_i ln p_i is used, which reduces to k ln Ω when p_i = 1/Ω. Defining entropy for systems far from equilibrium is difficult in general and possible only in special cases such as local thermodynamic equilibrium.
  • #1
morrobay
Gold Member
With S = k ln W, where W is the probability that the system is in the state it is in relative to all other possible states:
W = V^N, with V = volume and N = number of particles, so ln W = N ln V.
And this expression is for a non-equilibrium state.
For an equilibrium state, S = k ln Ω. Is the only difference between W and Ω that Ω corresponds to maximum entropy?
If so, it appears Ω would have the same value as W.
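A short worked consequence of that counting (a sketch, assuming only that W ∝ V^N for N non-interacting particles in a volume V, so that only entropy differences are meaningful):

S = k ln W = k ln V^N = N k ln V, so ΔS = S_2 - S_1 = N k ln(V_2/V_1),

which reproduces the familiar entropy change of an ideal gas expanding isothermally from V_1 to V_2.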
 
  • #2
As far as I know, there is no difference between W and Ω in that equation; it is simply a different choice of symbol.

W and Ω both represent the multiplicity of the corresponding macrostate. Neither is a probability. (It couldn't be, because a probability is ≤ 1, so S would come out negative or zero, whereas this entropy is never negative.)

Calculating entropy for a non-equilibrium system is not a trivial thing.
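A minimal numerical illustration of "multiplicity of a macrostate" (not from the thread; it assumes a toy system of N independent two-state spins, with the macrostate specified by the number of up spins):

# Toy illustration: multiplicity of a macrostate for N two-state spins,
# and the Boltzmann entropy S = k ln(Omega) of that macrostate.
from math import comb, log

k_B = 1.380649e-23   # Boltzmann constant, J/K

N = 100              # number of spins
n_up = 50            # macrostate: how many spins point up

omega = comb(N, n_up)     # multiplicity: number of microstates with n_up spins up
S = k_B * log(omega)      # Boltzmann entropy of that macrostate

print(f"Omega = {omega:.3e}")
print(f"S     = {S:.3e} J/K")   # >= 0, since Omega >= 1

For N = 100 and n_up = 50, Ω ≈ 1.01 × 10^29, so S = k ln Ω is comfortably positive, in contrast to the ≤ 0 value that the logarithm of a probability would give.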
 
  • #3
But the definition is still valid, even for a non-equilibrium state?
 
  • #4
Chandra Prayaga said:
But the definition is still valid, even for a non-equilibrium state?
From https://en.wikipedia.org/wiki/Non-equilibrium_thermodynamics
Wikipedia said:
Another fundamental and very important difference is the difficulty or impossibility, in general, in defining entropy at an instant of time in macroscopic terms for systems not in thermodynamic equilibrium; it can be done, to useful approximation, only in carefully chosen special cases, namely those that are throughout in local thermodynamic equilibrium.[1][2]

See also https://physics.stackexchange.com/questions/134377/definition-of-entropy-in-nonequilibrium-states
 
  • #7
My confusion here is that Ω is itself used in deriving S = k ln Ω:
1) S = k ln W, where W = V^N counts the microstates.
This leads to the Gibbs statistical entropy formula for a non-uniform probability distribution (non-equilibrium systems):
2) S = -k ∑ p_i ln p_i
Then, in the special case of a uniform distribution, p_i = 1/Ω:
3) S = -k ∑ (1/Ω) ln(1/Ω) = k ln Ω
for maximum-entropy (equilibrium) systems.
It would help to reconcile Ω and W if p_i = 1/W for the uniform distribution. (A numerical check of steps 2) and 3) is sketched below.)
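A quick numerical check of steps 2) and 3) (a sketch in units where k = 1; the non-uniform probabilities are made-up values for illustration):

# Check: the Gibbs entropy S/k = -sum_i p_i ln p_i equals ln(Omega)
# for a uniform distribution p_i = 1/Omega, and is smaller otherwise.
from math import log

def gibbs_entropy(p):
    """S/k = -sum p_i ln p_i, skipping zero-probability states."""
    return -sum(pi * log(pi) for pi in p if pi > 0)

Omega = 8
uniform = [1.0 / Omega] * Omega
print(gibbs_entropy(uniform), log(Omega))   # both ~ 2.079 = ln 8

nonuniform = [0.5, 0.2, 0.1, 0.1, 0.05, 0.03, 0.01, 0.01]   # sums to 1
print(gibbs_entropy(nonuniform))            # ~ 1.48, strictly less than ln 8

The uniform case reproduces ln Ω exactly, and any non-uniform distribution over the same Ω microstates gives a smaller value; this is the sense in which the uniform (equilibrium) distribution is the maximum-entropy one.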
 
  • #8
Does S = k ln Ω only hold in equilibrium?
 
  • #9
From link in post #5
[Attached screenshot: Screenshot_2022-03-24-19-38-36-65.jpg]
 

Related to Clarification: S = k ln W and S = k ln Ω

What is the meaning of the equations S = k ln W and S = k ln Ω?

The equations S = k ln W and S = k ln Ω are two notations for the Boltzmann entropy formula, which is used in statistical mechanics to relate the entropy S of a system to the number of microstates (W or Ω) of its macrostate, with k the Boltzmann constant.

What is entropy and why is it important in science?

Entropy is a measure of the disorder or randomness of a system. It is important in science because it helps us understand the behavior of physical, chemical, and biological systems. It also plays a crucial role in thermodynamics, information theory, and statistical mechanics.

How is the Boltzmann entropy formula derived?

The Boltzmann entropy formula is not derived from the second law of thermodynamics; it is a statistical definition of entropy, proposed by Ludwig Boltzmann in the late 19th century, that identifies the entropy of a macrostate with the logarithm of the number of microstates consistent with it. The second law, which states that the total entropy of an isolated system never decreases, can then be understood statistically: an isolated system overwhelmingly evolves toward macrostates of larger multiplicity.

What are the units of the constant k and of W (or Ω) in the Boltzmann entropy formula?

The Boltzmann constant k has units of energy divided by temperature (such as J/K or eV/K); in SI units its value is exactly 1.380649 × 10^-23 J/K. The number of microstates (W or Ω) is a pure count, so it is dimensionless and has no units.
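For quick numerical work, the exact SI value is also available programmatically; a minimal sketch, assuming SciPy is installed:

# The 2019 SI redefinition fixes the Boltzmann constant exactly.
from scipy.constants import Boltzmann   # 1.380649e-23 J/K
print(Boltzmann)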

How is the Boltzmann entropy formula used in practical applications?

The Boltzmann entropy formula is used in a wide range of practical applications, including thermodynamics, statistical mechanics, and information theory, where the Shannon entropy has the same mathematical form, as well as in maximum-entropy methods in machine learning. It is used to compute the entropy of a system from a count of its microstates and helps predict the behavior of complex systems.
