Does a probability distribution correctly describe entropy?

In summary: Mathematically speaking, "new" potential for work would correspond to a reduction in entropy, and in practice this would usually correspond to the appearance of gas molecules at a location where they did not exist before.
  • #1
Charlie313
The colloquial statistical-mechanics explanation of entropy as if it were caused by probability is dissatisfying to me, in part because it allows highly organized (i.e. with a real potential for work) arrangements to appear as 'random fluctuations', though with very low probability. But as far as I know (not a physicist!) we don't even see tiny, less improbable but still significant fluctuations toward 'new' potential for work, much less the big, super-improbable ones. Is there a constraint on the fluctuations of 'random' systems, like gas molecules in a box, that would not appear if we simply added up the probabilities at equilibrium?

Another way of asking the same question: are there experimentally supported equations for how the probability distribution for a volume of gas, or another entropically constrained system, changes as the system begins to fluctuate away from maximum (i.e. equilibrium) entropy and toward some significant potential for work? My dissatisfaction with the statistical 'explanation' is partly because the arrangements of molecules in a box of gas are self-interacting, so any shift in a counter-entropic direction, toward 'free' work, should (at least to my layman's thinking) change the probability distributions in nonlinear ways that might reduce 'very highly improbable' to zero probability. Mathematical answers are welcome ('are there equations?'), but I am a visual and intuitive thinker rather than a mathematical one, so translations into non-mathematical or intuitive concepts would be greatly appreciated. Thanks!
 
  • #2
Charlie313 said:
But as far as I know (not a physicist!) we don't even see tiny, less improbable but still significant fluctuations toward 'new' potential for work
We see fluctuations as large as expected given the limited number of observations we can make.

There is a difference between seeing a 1-in-a-billion event (easy if you look every nanosecond) and a 1 in 10^10^10 event, but there is no practical difference between 1 in 10^10^10 and 1 in 10^10^20 - we won't see either.
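To put rough numbers on this point (a minimal Python sketch of my own, not from the thread; chance_of_seeing is a hypothetical helper), the chance of ever witnessing a fluctuation depends on its per-observation probability and on how many observations you can realistically make:

```python
import math

def chance_of_seeing(p_per_obs, n_obs):
    # P(at least one occurrence in n_obs independent tries) = 1 - (1 - p)^n.
    # log1p/expm1 keep the arithmetic stable when p_per_obs is tiny.
    return -math.expm1(n_obs * math.log1p(-p_per_obs))

n = int(3.2e16)  # roughly one observation per nanosecond for a year
print(chance_of_seeing(1e-9, n))   # ~1.0: a one-in-a-billion event is routine
print(chance_of_seeing(1e-30, n))  # ~3.2e-14: effectively never seen
# A probability like 1 in 10^10^10 underflows floating point entirely,
# which is the point: no physical number of observations comes close.
```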
 
  • #3
Thanks mfb for the response, and that's a good point. (Playing the license plate game with mfb, I got "mondo-freaking brilliant", or something to that effect. :D)

Still wondering if someone can point me toward a discussion of how the probability distributions change for gas in a box as it hypothetically fluctuates away from equilibrium and toward the 'very small but real' arrangement that noticeably reduces entropy and increases potential -- or even in a non-equilibrium arrangement, like hot on one side, evolving toward equilibrium. How 'smooth' is the curve of decreasing or increasing entropy, and how does it change as we move the system toward or away from equilibrium? Of course it will have molecule-level fluctuations, since molecules are the fundamental unit in which the system's randomness and probability structure is defined. And there will be little coincidences where, say, little groups or waves of hot molecules move toward the hot side, temporarily 'reducing entropy' on a very local scale, but not for the whole system.

What I am trying to figure out (in my somewhat impaired way) is what effect a larger-scale movement toward one of those very rare (1/10^10^10 or rarer) fluctuations toward lower entropy -- one that would allow 'new' work to be gotten out of the system -- would have on the probability distribution itself. I think the M-B (Maxwell-Boltzmann) distribution only applies at equilibrium; what formalism, if any, describes the changing distribution as the system as a whole moves toward or (very improbably) away from equilibrium? (Trying to think of search phrases that might catch that.) Thanks again!
 
  • #4
To get all atoms on the same side of the room, you don't need a deviation from MB. In the limit of an ideal gas, all the molecules are collision-free (or at least with collision timescales longer than relevant, with collisions only at the walls), so they will be in each half of the room with 50% probability, independent of their velocity. 10 atoms give you a 1/512 probability to have all on the same side, 20 atoms lead to a 1 in 500,000 probability, 100 atoms to about 1 in 0.5*10^30, and so on. With 20 atoms that is something you can wait for, with 100 atoms it is not, and with 10^30 atoms it just doesn't happen, although there is a non-zero chance.
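A quick numerical check of those figures (my own hypothetical Python sketch, assuming the independent 50/50 placement described above):

```python
from fractions import Fraction

def p_all_same_side(n_atoms):
    # Each atom is independently in the left or right half with probability
    # 1/2; "all on one side" can happen two ways (all left or all right).
    return Fraction(2, 2 ** n_atoms)

for n in (10, 20, 100):
    odds = 1 / p_all_same_side(n)
    print(f"{n:>3} atoms: 1 in {float(odds):.3g}")
# 10 atoms: 1 in 512
# 20 atoms: 1 in 5.24e+05
# 100 atoms: 1 in 6.34e+29 (i.e. roughly the 0.5*10^30 quoted above)
```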

Even a different temperature doesn't need a deviation from MB. You just need the faster atoms on one side by chance.

A deviation from MB is possible as well, but different from the two scenarios above.
 
  • #5
Charlie313 said:
Still wondering if someone can point me toward discussion of how the prob distributions change for gas in a box as it hypothetically fluctuates away from equilibrium and toward the 'very small but real' arrangement that noticeably reduces entropy and increases potential

If you are thinking of a gas as a collection of particles, each of which has a definite position and velocity at a given time, there is no probability involved and it has no defined entropy. It's like thinking of a "fair coin" that has already been tossed and definitely landed heads. That's why statistical mechanics is forced to use the tortuous language of "ensembles" of systems.

If we want to talk about "fluctuations", we must specify exactly what is fluctuating. Trying to speak of "probability" as a general abstraction is not mathematically coherent. One must specify what events are in the probability space (i.e. "probability" must be the probability of some set of events.) So how would you formulate your question so it has a clear meaning? What probability are you asking about?
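As a toy illustration of that requirement (my own sketch, not part of the post): once you fix a sample space and pick out a subset of it as the event, and only then, the probability is well defined.

```python
from itertools import product

# Sample space: every assignment of 4 labelled atoms to the left (L) or
# right (R) half of a box -- 16 equally weighted microstates.
omega = list(product("LR", repeat=4))

# An "event" is a subset of the sample space; here: all atoms on one side.
all_one_side = [w for w in omega if len(set(w)) == 1]

# Only now does "probability" mean something: count members of the event.
print(len(all_one_side) / len(omega))  # 2/16 = 0.125
```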

Charlie313 said:
--or even in a non-equilibrium arrangement, like hot on one side, evolving toward equilibrium; how 'smooth' is the curve of decreasing or increasing entropy and how does it change as we move the system toward or away from equilibrium?

Thermodynamic entropy is not defined for (an ensemble of) gases that are not in equilibrium. So the first problem would be to invent a definition for entropy in non-equilibrium situations.
 

Related to Does a probability distribution correctly describe entropy?

What is a probability distribution?

A probability distribution is a mathematical function that provides the probabilities of occurrence of different possible outcomes in an experiment or event. It can be represented in the form of a graph or a table.

What is entropy?

Entropy is a measure of the randomness or disorder in a system. In more precise terms, it quantifies the uncertainty about, or the information needed to specify, the exact state of the system. It is commonly used in thermodynamics, information theory, and statistics.

How is entropy related to probability distribution?

Entropy and probability distributions are closely related. The concept of entropy was first introduced in thermodynamics to describe the randomness or disorder of a system; in statistical mechanics it is computed from the probability distribution over microstates, and in information theory it measures the uncertainty in a probability distribution.
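As a small illustration (a Python sketch of my own, not part of the original answer), the Shannon entropy of a discrete distribution is computed directly from its probabilities and is largest when the distribution is uniform:

```python
import math

def shannon_entropy(probs):
    # H = -sum(p * log2(p)) in bits; outcomes with p = 0 contribute nothing.
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0 bits: maximal for 4 outcomes
print(shannon_entropy([0.97, 0.01, 0.01, 0.01]))  # ~0.24 bits: nearly certain
```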

Can a probability distribution accurately describe entropy?

Yes. In statistical mechanics and information theory, entropy is defined directly in terms of a probability distribution, so once the distribution is specified, the entropy follows exactly. The caveat is that entropy is a property of the chosen distribution: different distributions over the same set of outcomes generally yield different entropies.

Are there any limitations to using probability distribution to describe entropy?

Yes, there are limitations. A probability distribution can only describe the randomness or disorder of a system to the extent that the distribution itself is known: if it is estimated from incomplete or inaccurate data, the calculated entropy may not represent the true entropy of the system. It is therefore important to choose, or estimate, an appropriate distribution for the system in question.
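To make the incomplete-data limitation concrete (a hypothetical sketch of my own): the naive 'plug-in' entropy estimate computed from a small sample of a fair die systematically underestimates the true entropy, and only approaches it as the data grow.

```python
import math
import random

def plug_in_entropy(samples):
    # Shannon entropy (bits) of the empirical distribution of the samples.
    counts = {}
    for s in samples:
        counts[s] = counts.get(s, 0) + 1
    n = len(samples)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

random.seed(0)
true_h = math.log2(6)  # fair six-sided die: ~2.585 bits
for n in (10, 100, 10_000):
    estimate = plug_in_entropy([random.randrange(6) for _ in range(n)])
    print(f"n={n:>6}: estimated {estimate:.3f} bits (true {true_h:.3f})")
```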
