Second Law of Thermodynamics - Postulate or Provable?

In summary: the classical second law is a postulate grounded in experiment. As a statement about entropy it can be derived from the first law together with the Kelvin and Clausius statements; in statistical mechanics it can instead be derived as an overwhelmingly probable statement rather than a strict theorem.
  • #1
boderam

I know that the 2nd law of thermodynamics is considered an experimental fact, but I have yet to find any sort of structure when I see a discussion of the law via proving it as a theorem. I know there are mathematical formulas that express the law, but are any of them derivable in the way that the conservation of momentum is derived from Newton's laws, i.e. from first principles?
 
  • #2


The second law as a statement about entropy is derived from the first law of thermodynamics, and the Kelvin and Clausius statements of the second law.
 
  • #3


atyy said:
The second law as a statement about entropy is derived from the first law of thermodynamics, and the Kelvin and Clausius statements of the second law.

So it is provable, assuming the Kelvin/Clausius statements are derivable from first principles as well... do you have a reference for this? Every book and website I have come across so far doesn't have a real mathematical sort of proof for it, just a lot of handwaving, which I find very frustrating.
 
  • #4


boderam said:
So it is provable, assuming the Kelvin/Clausius statements are derivable from first principles as well... do you have a reference for this? Every book and website I have come across so far doesn't have a real mathematical sort of proof for it, just a lot of handwaving, which I find very frustrating.

The Kelvin and Clausius statements are "common sense" postulates that seem to summarise lots of experiments - you cannot transfer heat from cold to hot without doing work. From those statements one can get to the mathematical statement that entropy stays the same or increases. Try Mehran Kardar's notes http://ocw.mit.edu/OcwWeb/Physics/8-333Fall-2007/LectureNotes/index.htm .
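For reference, the standard route from those postulates to the entropy statement runs roughly as follows (a sketch of the usual textbook argument, as in Kardar's notes). From the Clausius statement one proves Carnot's theorem, which yields the Clausius inequality for an arbitrary cycle:

[tex]\oint \frac{\delta Q}{T} \leq 0,[/tex]

with equality for reversible cycles. Equality for reversible cycles makes [itex]\int \delta Q_{rev}/T[/itex] path-independent, so it defines a state function [itex]S[/itex] via [itex]dS = \delta Q_{rev}/T[/itex]. Running an arbitrary process from A to B and closing the cycle with a reversible return path then gives

[tex]S(B) - S(A) \geq \int_A^B \frac{\delta Q}{T},[/tex]

and for a thermally isolated system ([itex]\delta Q = 0[/itex]) this reduces to [itex]\Delta S \geq 0[/itex].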
 
  • #5


The second law of thermodynamics can be derived from the fact that heat flows from objects with higher temperature to objects with lower temperature. That's much more of a "first principle" than either Newton's laws or the conservation of momentum.
 
  • #6


I find the proposals here old-fashioned. Nowadays general entropy is applied to just about any abstract system, even a deck of cards, and then it is not clear how to define "heat", "reversible processes", and "temperature" at all! Also I wouldn't call it a postulate: most people expect that the second law, at macroscopic scales at least, most likely won't be violated, but for small systems it may well be. So the derivation that I find most natural is saying that,
"assuming all states are equally probable, a system will most likely go from a small set of states to a larger set of states". This is intuitive and can yield the equation for entropy:
https://www.physicsforums.com/showthread.php?t=353528&page=2
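Gerenuk's "small set of states to larger set" can be made concrete with a counting exercise (a minimal sketch, not from the linked thread): for N equally probable two-state "particles", the macrostates near half-and-half contain almost all of the microstates, so a randomly wandering system is almost certain to end up there.

```python
from math import comb

N = 100  # 100 two-state particles (equivalently, 100 coin tosses)
total = 2 ** N  # number of equally probable microstates

# Fraction of microstates whose macrostate (number of "up" particles)
# lies in the middle band 40..60.
middle = sum(comb(N, k) for k in range(40, 61))
print(middle / total)  # ~0.96: the middle fifth of macrostates holds ~96% of microstates
```

Already at N = 100 the concentration is dramatic, and it sharpens rapidly as N grows, which is why entropy decrease is never seen for macroscopic particle numbers.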
 
  • #7


The second law is not 'general'.
It has to be considered in 'closed systems'. This is a useful approximation, as no real closed systems exist, afaik.
There is a range of applicability.
Choose a system, consider it as approximately closed, define temperature/entropy, and then yes.

Gravitation provides 'free energy' (does it? I think not!) that changes everything.

From what we know, the evolution of the universe at large runs contrary to the 2nd law:
from a well-mixed beginning, gravity builds more and more complex structures as time goes by: superclusters, clusters, galaxies, stars, planets, life, humans, brains.
Some radiation is lost, but structures are building up and the 2nd law does not apply.

So, to prove the 2nd law we first have to clarify terms, limits of application, etc...
 
  • #8


All laws of thermodynamics are considered laws because there has never been a situation where they have been wrong.

By the strictest definition (a mathematical proof) I don't believe there is one (although I'm not sure, as my maths isn't as strong as it should be). For purposes of use, it's taken as correct by NEVER having been proven wrong.
 
  • #9


I definitely agree that the laws of thermodynamics will almost surely never fail, because thermodynamics is about systems with a huge number of particles.
Entropy in general, however, can be applied to non-thermodynamic systems which have far fewer particles. There it can fail more often.
 
  • #10


The OP was talking about the classical laws of thermodynamics, specifically mentioning the 2nd law.

I don't think he really wanted to consider statistical thermodynamics.
 
  • #11


xxChrisxx said:
The OP was talking about the classical laws of thermodynamics, specifically mentioning the 2nd law.

I don't think he really wanted to consider statistical thermodynamics.

I posted this in the classical section because I was under the impression the 2nd law of thermodynamics IS classical. If there is a non-classical version of it I would be interested in knowing about it though, especially a "mathematical" proof of it. I watched some of Susskind's statistical mechanics lectures on YouTube and made it to the Boltzmann distribution and the mathematical definition of entropy, but the lecture on the 2nd law was very unclear to me. I am looking for a very clear/detailed mathematical proof, if there is one.
 
  • #12


boderam said:
I posted this in the classical section because I was under the impression the 2nd law of thermodynamics IS classical. If there is a non-classical version of it I would be interested in knowing about it though, especially a "mathematical" proof of it. I watched some of Susskind's statistical mechanics lectures on YouTube and made it to the Boltzmann distribution and the mathematical definition of entropy, but the lecture on the 2nd law was very unclear to me. I am looking for a very clear/detailed mathematical proof, if there is one.

It is. Gerenuk was giving a statistical approach to entropy, which isn't quite the same thing.

There is no mathematical proof of the laws of thermodynamics.

You either have empirically derived formulas that have never been proven wrong, or a statistical method.
 
  • #13


xxChrisxx said:
It is. Gerenuk was giving a statistical approach to entropy, which isn't quite the same thing.

There is no mathematical proof of the laws of thermodynamics.

You either have empirically derived formulas that have never been proven wrong, or a statistical method.

So how are the two related, then? Mathematically, the statements of the 2nd law look very different in statistical mechanics and in classical thermodynamics. Is there a way to "convert" one to the other by relating abstract states of the system to temperature? I'm getting a bit confused because they seem to be two different subjects, yet they both claim a 2nd law about entropy. And if there is a statistical method for the 2nd law, does this mean there is a derivation from first principles? This is what I am looking for. Thank you for the discussion so far.
 
  • #14


xxChrisxx said:
The OP was talking about the classical laws of thermodynamics, specifically mentioning the 2nd law.
He was looking for a proof, and the only proof I find reasonable is the statmech one. Of course one could derive the 2nd law from statements about heat flow, but that is really only a shift of definition. If one wishes to derive the law while ignoring the microscopic particles of thermodynamics, then I believe this isn't possible from logic alone. The dissatisfying answer is then "no, you cannot prove the second law from logic".

boderam said:
I posted this in the classical section because I was under the impression the 2nd law of thermodynamics IS classical. If there is a non-classical version of it I would be interested in knowing about it though
I suppose classical means "non-quantum-mechanical". The second law isn't even really physics; it is a general probabilistic statement about general systems. So it is classical.

boderam said:
especially a "mathematical" proof of it. I watched some of Susskind's statistical mechanics lectures on YouTube and made it to the Boltzmann distribution and the mathematical definition of entropy, but the lecture on the 2nd law was very unclear to me.
Ask questions if you wish. However, I believe most lectures just define entropy and don't prove why it increases.

boderam said:
I am looking for a very clear/detailed mathematical proof, if there is one.
Have you looked at the earlier post I referenced?
To understand the increase of entropy I suggest the following computer experiment:
Your data is an array of numbers from 1 to N, for example "7,1,5,9,8,1,2,3,5,1,8,9,2" or so. For a quantity called entropy you calculate [itex]S=-\sum p_i\ln p_i[/itex], where [itex]p_i[/itex] is the fraction of the numbers equal to the number i. The "physical time evolution" corresponds to changing the number at one random position to another randomly chosen number. Now you can observe what happens to the quantity S, and you will find that it almost always increases. The law becomes more precise the more numbers, and the larger the range of numbers, you use.
If you understand this computer program, then you understand why entropy increases. Stated in mathematically probabilistic terms, this is a proof from first principles.
One more point: you shouldn't start with a randomly chosen sequence of numbers, but with a self-made structured one. Otherwise you most likely start in a state that already has maximum entropy.

boderam said:
So how are the two related, then? Mathematically, the statements of the 2nd law look very different in statistical mechanics and in classical thermodynamics.
All of the laws of thermodynamics can be derived from statmech, but not vice versa.
In the same forum link that I posted above you can find how to derive thermodynamics from the statmech definition.
(It might be useful to have heard a lecture, though.)
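As one illustration of how thermodynamic quantities drop out of the statmech definition (a sketch of the standard argument, not a full derivation): take [itex]S = k\ln\Omega(E)[/itex] and let two systems exchange energy at fixed total [itex]E = E_1 + E_2[/itex]. The total number of microstates

[tex]\Omega_{tot}(E_1) = \Omega_1(E_1)\,\Omega_2(E - E_1)[/tex]

is maximized when

[tex]\frac{\partial \ln\Omega_1}{\partial E_1} = \frac{\partial \ln\Omega_2}{\partial E_2},[/tex]

so equilibrium singles out a quantity that is common to both systems. Identifying it with [itex]1/kT[/itex] recovers the familiar thermodynamic relation [itex]1/T = \partial S/\partial E[/itex].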
 
  • #15


boderam said:
So how are the two related, then? Mathematically, the statements of the 2nd law look very different in statistical mechanics and in classical thermodynamics. Is there a way to "convert" one to the other by relating abstract states of the system to temperature? I'm getting a bit confused because they seem to be two different subjects, yet they both claim a 2nd law about entropy. And if there is a statistical method for the 2nd law, does this mean there is a derivation from first principles? This is what I am looking for. Thank you for the discussion so far.

I'll say before I carry on, I am not too hot at maths, and utterly dreadful at statistics.

*snip - probably wrong*

Statistical methods are a wider method of calculating (or estimating, as it's statistics) entropy in a system. They are different because they look at different things.

The classical method treats the system as a whole (a macroscopic view). So if we had a box of gas, we could tell you thermodynamic information about it, such as entropy, etc.

The statistical method is a microscopic approach that can be used to find a system value: this method uses probability to find a distribution and an average entropy for every single molecule. These are then summed to get a system value.

The classical method has no way to predict the value for a single molecule, but statistical thermodynamics does. As far as I can remember, anyway. I'm kind of relying on someone better to correct any mistakes, which I am sure they will jump at :P

EDIT: You beat me to it :P
 
  • #16


Boderam, I'm no expert myself, but I've researched this question thoroughly. There is no formal proof of the law of increasing entropy, at least as it applies to an ideal gas. Boltzmann himself, after it was brought to his attention by one of his students, stated that it, and all its derivatives, are based on the assumption of molecular chaos. Maxwell made the assumption implicitly while deriving his distribution of molecular speeds in an ideal gas.

It was a reasonable assumption, and seems to hold true for all the gases we've ever dealt with.
 
  • #17
boderam said:
I know that the 2nd law of thermodynamics is considered an experimental fact, but I have yet to find any sort of structure when I see a discussion of the law via proving it as a theorem. I know there are mathematical formulas that express the law, but are any of them derivable in the way that the conservation of momentum is derived from Newton's laws, i.e. from first principles?

Setting aside the fact that the conservation of momentum is not really "derived" from Newton's law F = dp/dt, the Second Law of Thermodynamics is based purely on observation: heat never flows spontaneously from a cold body to a hot one. Microscopically, the entropy (like all physical properties) can and does fluctuate (the fluctuation-dissipation theorem).

To be clear- macroscopically, the Second law has never been observed to be false, while microscopically it has been observed to be violated (for short times):

http://www.aip.org/enews/physnews/2002/598.html
http://prola.aps.org/abstract/PRL/v89/i5/e050601
 
  • #18


Andy Resnick said:
To be clear- macroscopically, the Second law has never been observed to be false, while microscopically it has been observed to be violated (for short times):
That is an interesting reference :smile:
Is there a theory that calculates how much the entropy will fluctuate once it reaches its maximum?
I think usual parameter fluctuation theory assumes that entropy is at a constant maximum.
 
  • #19


Gerenuk said:
To understand the increase of entropy I suggest the following computer experiment:
Your data is an array of numbers from 1 to N. For example "7,1,5,9,8,1,2,3,5,1,8,9,2" or so.
For demonstration I wrote this Python program:
Code:
from random import randrange
from math import log
import os

P = [100] + [0] * 49  # all 100 particles start in the first of 50 states
tot = sum(P)

with open("entropy.dat", "w") as f:
    for a in range(1000):
        for b in range(10):
            # pick a random occupied state j ...
            while True:
                j = randrange(len(P))
                if P[j] > 0:
                    break
            # ... and a different randomly chosen destination state i
            while True:
                i = randrange(len(P))
                if i != j:
                    break
            # move one particle from j to i
            P[i] += 1
            P[j] -= 1
        # normalized entropy of the occupation fractions (empty states contribute 0)
        S = -sum(p / tot * log(p / tot) for p in P if p > 0) / log(len(P))
        print(a, S, file=f)

os.system("gnuplot -persist -e 'plot [0:*] [0:1] \"entropy.dat\" notitle'")  # requires gnuplot
and the output is attached.

So basically entropy has nothing to do with physics or dynamical processes. It is a purely probabilistic statement, equivalent to saying "for coin tosses the most probable ratio of heads to tails is 1:1".
 

Attachments

  • entropy.jpg
  • #20


Gerenuk said:
That is an interesting reference :smile:
Is there a theory that calculates how much the entropy will fluctuate once it reaches its maximum?
I think usual parameter fluctuation theory assumes that entropy is at a constant maximum.

I'm not sure... it's somewhat outside my expertise.
 
  • #21
Andy Resnick said:
Setting aside the fact that the conservation of momentum is not really "derived" from Newton's law F = dp/dt, the Second Law of Thermodynamics is based purely on observation: heat never flows spontaneously from a cold body to a hot one. Microscopically, the entropy (like all physical properties) can and does fluctuate (the fluctuation-dissipation theorem).

To be clear- macroscopically, the Second law has never been observed to be false, while microscopically it has been observed to be violated (for short times):

http://www.aip.org/enews/physnews/2002/598.html
http://prola.aps.org/abstract/PRL/v89/i5/e050601

One needs to be careful when dealing with the Second Law. It is common to blindly extend that law from thermodynamic engines to the rest of the universe. The problem is that thermodynamics assumes a number of things. One is random chaos (as in a heated gas), and the other is that the statistics of the Second Law depend only on that chaos.

This makes the Second Law valid ONLY for a limited subset of situations. To make the (false) observation that systems never proceed from disorder to order ignores some very important extensions. Those extensions are that INFORMATION also represents the quantity entropy! So the weeds (order) growing under my brick sidewalk, springing from the dirt (disorder) there, represent a reversal of the mechanistic 19th-century view of thermodynamics. In fact ALL LIFE represents that, but life is the one thing physicists do not wish to confront.

Thus, the information content of DNA in life is a prime factor in considering the nature of life and the entropy of living systems. However, more and more physicists are taking time to understand the nature of information in entropy. Indeed, as Maxwell correctly surmised with his "demon", use of the "information" of the velocity vectors of molecules can create a reversal of the accepted direction of entropy values. For more understanding of the relationship of entropy to information I refer you to the classic works of C. E. Shannon.
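Shannon's measure mentioned here is easy to compute for any discrete distribution (a minimal sketch; it matches the statistical-mechanics entropy up to Boltzmann's constant and the choice of logarithm base):

```python
from math import log2

def shannon_entropy(probs):
    """Entropy in bits of a discrete distribution; zero-probability terms contribute 0."""
    return -sum(p * log2(p) for p in probs if p > 0)

print(shannon_entropy([0.25] * 4))         # 2.0 -- uniform over 4 outcomes is 2 bits
print(shannon_entropy([0.5, 0.25, 0.25]))  # 1.5 -- a less uniform distribution has less entropy
```

The uniform distribution maximizes this quantity, which is the information-theoretic counterpart of the "equally probable states" assumption in the statmech derivations discussed above.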
 
  • #22


bjacoby said:
One needs to be careful when dealing with the Second Law.
This makes the Second Law valid ONLY for a limited subset of situations. To make the (false) observation that systems never proceed from disorder to order ignores some very important extensions. Those extensions are that INFORMATION also represents the quantity entropy! So the weeds (order) growing under my brick sidewalk, springing from the dirt (disorder) there, represent a reversal of the mechanistic 19th-century view of thermodynamics. In fact ALL LIFE represents that, but life is the one thing physicists do not wish to confront.

This is not coherent, sorry. You are comparing apples and oranges.

The second law is a statement about thermodynamics. It deals with heat, and entropy, and energy, and so on.

It is not a statement about information in general. Speaking of the second law as if it deals with "order" in the sense you are using (functional or meaningful organization of things) is a bit like using the theory of relativity to help understand your relatives and family tree.

There are all kinds of reasons that the example of growing weeds is mistaken.

Weeds grow from seeds, right? But the material substance of the weed is taken from the nutrients and the material it takes from the environment and incorporates into its weedy body. The laws of thermodynamics make perfect sense for looking at the growth of weeds, but they have nothing (and I mean NOTHING) to do with the comparison of a seed and the parent weed. Thermodynamics is all about the physical organization of the matter that makes up the weed, NOT the functional development processes involved in altering the forms of the weed from seed to sprout to adult weed.

To do a thermodynamic analysis of a weed would be to consider the thermodynamic changes in atoms and molecules taken from the atmosphere and the soil, and chemically altered into the structure of the plant. There is no difference -- NONE -- in the way thermodynamics works for this or any other chemical reaction. And the physical/chemical/thermodynamic processes for growth of weeds are perfectly able to be studied and understood by physicists and chemists. It all gets quite intricate and complicated of course, but there's nothing different about thermodynamics for life and for non-life. The second law continues to apply, in the same way, with no modifications or special cases involved.

The argument that life, or evolution, violates any thermodynamic laws is always based on this confusion. Thermodynamics says nothing about the evolution of one species from another. It rather deals with the energetic processes of each new generation. You may have evolved "from a protoplasmic primordial atomic globule" (ref: Mikado), and grown from a tiny fertilized egg. But the physical stuff you are made of is taken from what you have consumed along the way... and thermodynamics is about how a physical system changes. It is not about the comparison of ancestor/parent or other such relations between one system and another one.

Cheers -- sylas
 
  • #23


bjacoby said:
<snip>

This makes the Second law valid ONLY for a limited subset of situations. <snip>.

On the contrary, the second law applies to more systems than any other physical law (AFAIK).
 
  • #24


bjacoby, The Second Law of Thermodynamics is often wildly misunderstood. The simple description of it that 'order goes to disorder' leaves out the important point that this is true in a closed system. If you leave a disordered bunch of soil in isolation then it will not in general turn into a weed. However, soil on the Earth is constantly bombarded with energy from the Sun, which at the level of an overall thermodynamic analysis of the soil system is more than enough to provide the energy to order the material into a weed.

What you are suggesting is that it is impossible to make a wine glass from glass fragments. Clearly in a closed system you can go from ordered (glass) to disordered (shattered glass), but not the other way around. However, add some heat to melt the glass and work it back into shape and you have made the system more ordered once again.

Likewise, the 2nd law can be phrased "heat goes to cold" but in the same way, the addition of energy can make things go the other way, as in the case of your fridge.

So yes, information theory and entropy are intertwined, but you're wrong to suggest the law is only valid in a subset of cases. It is a universal law, both in theory and from observation (as far as we know, although it gets a bit tricky when you consider some aspects of black holes and the Big Bang, but these are not fully understood in any case).

By the way, I think the legions of Biological Physicists and Medical Physicists would probably take issue with your assertion that they don't care about life!
 
  • #25


bjacoby said:
One needs to be careful when dealing with the Second Law. It is common to blindly extend that law from thermodynamic engines to the rest of the universe. The problem is that thermodynamics assumes a number of things. One is random chaos (as in a heated gas), and the other is that the statistics of the Second Law depend only on that chaos.

This makes the Second Law valid ONLY for a limited subset of situations. To make the (false) observation that systems never proceed from disorder to order ignores some very important extensions. Those extensions are that INFORMATION also represents the quantity entropy! So the weeds (order) growing under my brick sidewalk, springing from the dirt (disorder) there, represent a reversal of the mechanistic 19th-century view of thermodynamics. In fact ALL LIFE represents that, but life is the one thing physicists do not wish to confront.

Thus, the information content of DNA in life is a prime factor in considering the nature of life and the entropy of living systems. However, more and more physicists are taking time to understand the nature of information in entropy. Indeed, as Maxwell correctly surmised with his "demon", use of the "information" of the velocity vectors of molecules can create a reversal of the accepted direction of entropy values. For more understanding of the relationship of entropy to information I refer you to the classic works of C. E. Shannon.

I do not see how your example here invalidates the 2nd law. Are you arguing that the evolution of life is an example of such a violation? It had better not be, because this is a very tired argument.

Furthermore, the claim that "life" isn't something physicists want to confront is also false. If you look at two recent papers addressing evolution and entropy, you will see examples that falsify your claim [1,2].

These are detailed analyses that people ignore, or can't do, when they make hand-waving arguments about the nature of the 2nd law and "life".

Zz.

[1] D. Styer, Am. J. Phys. v.76, 1031 (2008).
[2] E.F. Bunn, Am. J. Phys. v.77, p.922 (2009) (http://arxiv.org/abs/0903.4603)
 
  • #26


People apply entropy successfully to just about anything that has "states" (particle positions, spin orientations, a deck of cards, ...).

Does anyone know a derivation for entropy that is general enough to cover all these uses? Because the second law as it stands cannot be a real physical law, since what exactly is meant by "state" is subject to interpretation, and "fundamental" physical laws should never be so vague as to allow different interpretations of how to apply them. So there must be some kind of dynamical or microscopical derivation (note that a purely QM derivation would not be able to cover the more general applications of entropy).
So what is a good derivation?

To my knowledge, the derivations assume that all states are equally likely and show no correlations in their dynamics. But for most problems it isn't possible to find such states. In a way the second law is restricted to random, mindless physical problems.

So there wouldn't even be a problem if life or whatever breaks this law.
 
  • #27


That's a good point: what exactly is meant by 'thermodynamic state'? What are the state variables? How many are needed?

I don't know if there is yet a definitive answer; the most recent stuff I have seen is "rational thermodynamics" by Truesdell etc. I didn't quite understand all of it.
 

Related to Second Law of Thermodynamics - Postulate or Provable?

1. What is the Second Law of Thermodynamics?

The Second Law of Thermodynamics states that the total entropy of an isolated system will always increase over time. In simpler terms, it means that the natural direction of processes is towards disorder and less usable energy.

2. Is the Second Law of Thermodynamics a postulate or a provable law?

In classical thermodynamics the Second Law is a postulate: it rests on the Kelvin and Clausius statements, which summarize experiment and have never been contradicted. In statistical mechanics an entropy-increase statement can be derived, but only as an overwhelmingly probable result for systems with many particles, not as a strict mathematical theorem.

3. What is the relationship between the Second Law of Thermodynamics and the concept of entropy?

Entropy is a measure of the disorder or randomness of a system. The Second Law of Thermodynamics states that the total entropy of an isolated system will always increase, which means that the system will become more disordered over time.

4. Does the Second Law of Thermodynamics apply to all systems?

The Second Law applies to any macroscopic system, regardless of complexity. For systems of only a few particles, however, it is a statistical statement, and short-lived fluctuations that run against it become observable.

5. Can the Second Law of Thermodynamics be violated?

Macroscopically, no violation of the Second Law has ever been observed; apparent exceptions invariably turn out to involve energy input from an external source. For very small systems over short times, however, entropy fluctuations that briefly run against the law have been measured, consistent with the fluctuation theorem.
