Boltzmann's version of the 2nd Law: the log relationship

In summary, Boltzmann defined entropy as proportional to the logarithm of the number of microstates W of a system, S = k log W, where k is a constant. The logarithm is what makes entropy additive: combining two systems multiplies their microstate counts, and taking the logarithm turns that product into a sum of entropies. The thread also asks how Planck, starting from S = k log W, arrived at the relation E = hf.
  • #1
celal777
Hello List,

I understand Boltzmann defined entropy as log W, where W is the number of microstates in the system, times a constant "k"; hence S = k log W.

Can someone explain to me where the logarithmic relationship comes from please ?

Many thanks in advance,

Celal Berker
London, England
 
  • #2
You want the entropy to be a quantity that adds up when you're looking at more than one system: e.g. if you have two containers filled with gas, you want the relationship S(container A & container B) = S(container A) + S(container B).

But if system A has [itex]W_a[/itex] states and system B has [itex]W_b[/itex] states, the combined system has [itex]W_a W_b[/itex] states: a product, not a sum. So you have to take the logarithm of this number to get the relation above; a quick numerical check of this is sketched below.
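
A minimal Python sketch of the same point, with made-up microstate counts for the two containers; it just verifies numerically that k ln(W_a W_b) = k ln(W_a) + k ln(W_b):

[code]
import math

k = 1.380649e-23  # Boltzmann constant in J/K

def entropy(W):
    """Boltzmann entropy S = k ln W for a system with W microstates."""
    return k * math.log(W)

# Toy microstate counts, chosen purely for illustration
W_a = 10   # container A
W_b = 250  # container B

S_a = entropy(W_a)
S_b = entropy(W_b)

# Every state of A can pair with every state of B,
# so the combined system has W_a * W_b microstates ...
S_ab = entropy(W_a * W_b)

# ... and the logarithm turns that product into a sum of entropies.
print(S_ab, S_a + S_b)                # numerically equal
assert math.isclose(S_ab, S_a + S_b)
[/code]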
 
  • #3
Many thanks, Bruno (is that your name?).

So by what thought or reasoning process did Planck go from the relation S = k log W to E = hf?

--Celal
 

Related to Boltzmann's version of the 2nd Law: the log relationship

1. What is Boltzmann's version of the Second Law of Thermodynamics?

Boltzmann's version of the Second Law states that the entropy of an isolated system tends to increase over time: left to itself, the system evolves toward macrostates with more microstates, i.e. toward greater disorder or randomness.

2. How does Boltzmann's version of the Second Law relate to the concept of entropy?

Boltzmann's version of the Second Law can be mathematically expressed as S = k ln(W), where S is the entropy of a system, k is the Boltzmann constant, and W is the number of microstates (possible arrangements of particles) in the system. This relationship shows that as the number of microstates increases, so does the entropy.
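
As a concrete, hypothetical illustration: for a toy system of N independent two-state particles, each particle can be in one of two states, so W = 2^N and

[tex]S = k \ln W = k \ln\!\left(2^{N}\right) = N k \ln 2 .[/tex]

Each additional particle doubles the number of microstates and adds the fixed amount k ln 2 ≈ 9.57 × 10^-24 J/K to the entropy, so more microstates means more entropy.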

3. Is there a negative sign in Boltzmann's entropy formula?

There is no negative sign in S = k ln(W). A system always has at least one microstate, so W ≥ 1 and ln(W) ≥ 0, and since the Boltzmann constant k is positive, the entropy is never negative. A minus sign does appear in the closely related Gibbs/Shannon form S = -k Σ p_i ln(p_i): each probability p_i is at most 1, so each ln(p_i) ≤ 0, and the minus sign again makes S non-negative. The statement that entropy tends to increase comes from the Second Law itself, not from any sign in the formula.

4. How does Boltzmann's version of the Second Law differ from the traditional version?

The traditional version of the Second Law states that the total entropy of an isolated system never decreases over time. Boltzmann's version gives this a microscopic, statistical basis: it provides a mathematical relationship between entropy and the number of microstates of a system, and it explains entropy in terms of the disorder and randomness of those microstates.

5. Can Boltzmann's version of the Second Law be applied to all systems?

Boltzmann's version of the Second Law applies specifically to isolated systems, where there is no exchange of matter or energy with the surroundings. It is a fundamental result of statistical thermodynamics and is applicable to a wide range of systems, from collections of microscopic particles to large-scale systems like the universe.
