Understanding Entropy: Calculations and Examples for Beginners

  • Thread starter infinitylord
  • Tags
    Entropy
In summary: Entropy is a measure of the amount of disorder in a system. The total entropy of an isolated system increases over time, reaching its minimum only at absolute zero, and it is related to the temperature of the system. In statistical thermodynamics, entropy quantifies the number of microscopic arrangements (microstates) consistent with a system's macroscopic state: a state with higher entropy corresponds to more microstates and is therefore more probable.
  • #1
infinitylord
Hi guys! I'm new to the forum and have several questions about entropy. I'm a bit of a physics newbie by most standards, but I understand a lot of it and love physics. I understand the basics of entropy (correct me if I'm wrong): it's basically a measure of disorder, and it always increases except at absolute zero. It creates an arrow of time, in the sense that we can't undo what has happened and restore the order, just as we can't reverse time to stop a volcanic eruption. I get the principles, but how is entropy calculated, and what does the result mean? There are the more basic thermodynamic formulas, but I was referring to
S = -k·Σi[pi·ln(pi)]. I know what most of it means:
k = the Boltzmann constant,
Σ = a sum over all microstates i,
inside the brackets = the probability pi that the system is in a given microstate, multiplied by the natural logarithm of that same probability.
But how is this applied? I'd like an example. I tried one myself, but I was most likely horribly wrong about what to do.
So I'd like an example with the math written out and an explanation of what it really means. How do you know the probability of the system being in a given microstate? Once you have the number, what unit is it in... J/K? And what does it mean if a system has higher entropy?
Sorry for the bombardment of questions... I'm just trying to wrap my head around this.
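The formula in the post can be worked through numerically. Here is a minimal sketch in Python, assuming a toy system whose microstate probabilities are simply given (for a real system at fixed temperature they would come from the Boltzmann distribution):

```python
import math

# Boltzmann constant, J/K (CODATA value)
K_B = 1.380649e-23

def gibbs_entropy(probs):
    """Gibbs entropy S = -k_B * sum(p * ln p) over microstate probabilities."""
    if abs(sum(probs) - 1.0) > 1e-9:
        raise ValueError("probabilities must sum to 1")
    # Terms with p = 0 contribute nothing (p * ln p -> 0 as p -> 0)
    return -K_B * sum(p * math.log(p) for p in probs if p > 0)

# Two equally likely microstates: S = k_B * ln 2 ~ 9.57e-24 J/K
s_uniform = gibbs_entropy([0.5, 0.5])

# A certain outcome (one microstate with p = 1): S = 0
s_certain = gibbs_entropy([1.0])

print(s_uniform, "J/K")
print(s_certain, "J/K")
```

This illustrates the two questions in the post: the result carries units of J/K (from the Boltzmann constant), and higher entropy means the probability is spread over more microstates; when the state is certain, the entropy is zero.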
 
  • #3
Thanks a ton... this really helped!
 
  • #4
Sorry about reviving, but I've thought of entropy as, well, sort of information.
 
  • #5
Yeah... I've got a better handle on it now. One definition I read was "the amount of energy needed to complete a system." That's where it all just clicked for me, I guess.
 

Related to Understanding Entropy: Calculations and Examples for Beginners

1. What is entropy?

Entropy is a measure of the disorder or randomness in a system. It is also closely related to the amount of a system's energy that is unavailable to do work.

2. How is entropy calculated?

Entropy can be calculated using the Boltzmann formula S = k ln W, where S represents entropy, k is the Boltzmann constant (about 1.38 × 10⁻²³ J/K), and W is the number of microstates, i.e., the possible microscopic arrangements consistent with the system's macroscopic state. The resulting entropy has units of joules per kelvin (J/K).
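As a quick numeric sketch of this formula (the choice of 100 two-state particles is an illustrative assumption, not from the thread):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def boltzmann_entropy(W):
    """S = k_B * ln(W) for W equally likely microstates."""
    return K_B * math.log(W)

# N independent two-state particles (e.g. spins) have W = 2**N microstates,
# so S = N * k_B * ln 2 -- entropy grows linearly with system size.
N = 100
S = boltzmann_entropy(2 ** N)
print(S, "J/K")
```

For equally likely microstates this agrees with the S = -k Σ p ln p form, since each p = 1/W.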

3. Can you provide an example of entropy in action?

A classic example of entropy is an ice cube melting. The ice cube has a low entropy because its molecules are in a highly ordered state. As it melts, the water molecules become more disordered and the entropy increases.
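The melting example can be put in numbers with the Clausius relation ΔS = Q/T, which applies to this reversible, constant-temperature phase change. A sketch using standard textbook values (the 10 g mass is an arbitrary choice for illustration):

```python
# Entropy change when 10 g of ice melts at 0 degrees C, via delta_S = Q / T

L_FUSION = 334.0   # latent heat of fusion of water, J/g (textbook value)
mass_g = 10.0      # assumed sample size
T = 273.15         # melting point, K

Q = mass_g * L_FUSION   # heat absorbed by the ice: 3340 J
delta_S = Q / T         # entropy increase of the water, J/K
print(delta_S, "J/K")   # about 12.2 J/K
```

The positive result reflects the increase in disorder as the ordered crystal becomes liquid.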

4. How does entropy relate to the laws of thermodynamics?

Entropy is closely related to the second law of thermodynamics, which states that the total entropy of an isolated system can never decrease over time. This means that such systems tend to evolve toward states of higher entropy.

5. How can understanding entropy be useful?

Understanding entropy is crucial in many fields, including physics, chemistry, and engineering. It helps us understand and predict the behavior of various systems, such as chemical reactions, phase transitions, and heat transfer. It also plays a significant role in the development of new technologies and processes.
