A naive question about entropy.

  • Thread starter saching
  • Tags: Entropy
In summary: the statement that the entropy of a closed system always increases applies to the universe as a whole, treated as one big closed system. One picture is that the expanding universe constantly creates more ways in which things can be arranged, and hence more entropy. Relativity poses no obstacle: because entropy obeys a local balance law, any information needed to describe the entropy content of the universe can propagate from one point to another at or below the speed of light, so no instantaneous communication between all points in space is required.
  • #1
saching
I've constantly heard professors answer questions about entropy by saying that even if the entropy of a system is decreasing, the entropy of the universe is increasing. How can we discuss the entropy of the universe if information can't be communicated between all points in space at the same time? Or rather, how does relativity mix with entropy, if at all?

edit: I just realized that this might have more to do with relativity than with thermodynamics. Moderators, feel free to move this post.
 
  • #2
You can make arguments based on the homogeneous nature of the universe: if a law holds at one point, it must hold at every other point. In other words, the behaviour of a thermodynamic system shouldn't depend on where in the universe the system happens to be sitting.

Claude.
 
  • #3
I always understood entropy as the number of ways something can be arranged. Since the universe is expanding, there are constantly more ways in which things can be arranged, thus entropy must be increasing.
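For reference, the standard statistical formula behind this counting picture is Boltzmann's entropy (a textbook relation, added here for context, not something stated in the thread):

$$S = k_B \ln \Omega,$$

where \Omega is the number of microstates compatible with the macrostate and k_B is Boltzmann's constant. On this picture, an expanding universe enlarges the accessible phase space, so \Omega, and with it S, can grow.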
 
  • #4
saching said:
I've constantly heard professors answer questions about entropy by saying that even if the entropy of a system is decreasing, the entropy of the universe is increasing. How can we discuss the entropy of the universe if information can't be communicated between all points in space at the same time? Or rather, how does relativity mix with entropy, if at all?

edit: I just realized that this might have more to do with relativity than with thermodynamics. Moderators, feel free to move this post.
I have generally heard it expressed that the entropy of any closed system is always increasing. I guess the extension to the universe comes from treating the universe as a whole as one big closed system.
 
  • #5
saching said:
I've constantly heard professors answer questions about entropy by saying that even if the entropy of a system is decreasing, the entropy of the universe is increasing.

You should consider this statement as more elementary than it is made to sound. What is actually taken as a law is that entropy in a sufficiently closed system is always increasing. So if you have a subsystem in which entropy is decreasing, that means it is not closed enough: it is interacting with some part of its neighbourhood (for instance, exchanging heat or matter with it). You then have to include that neighbourhood in the system. If you keep enlarging the system this way, then, for the process at hand, you will sooner or later reach a system big enough that no interaction relevant to that process still makes it "open".

A simple example: consider water in a bucket on a cold day. It freezes. The entropy of the water has decreased! But you realize that your bucket is not "closed": the water lost heat to the soil and the air. So you now include a chunk of soil and air. Concerning the cooling bucket, that will do: the soil will warm a bit, the air will warm a bit, and their increase in entropy will be larger than the decrease in entropy of the water.
Of course, that soil and air are exposed to other processes (the Earth's atmosphere, the Sun, ...), so it might be that overall their entropy is also decreasing. But that no longer has anything to do with the bucket. Concerning the process of freezing water, it is sufficient to include just the soil and the air of the immediate neighbourhood; in full generality, you would have to include the Earth's atmosphere, the Earth, the Sun, the solar system, the galaxy, ...
Only, from a certain point onwards, you realize that this no longer has anything to do with the freezing bucket. That is what is meant by a sufficiently closed system.
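To put rough numbers on the bucket (the temperatures are illustrative assumptions, not from the thread): water freezes at T_f ≈ 273 K, releasing latent heat Q into surroundings at, say, T_s ≈ 263 K. Then

$$\Delta S_{\text{water}} = -\frac{Q}{T_f}, \qquad \Delta S_{\text{surroundings}} = +\frac{Q}{T_s}, \qquad \Delta S_{\text{total}} = Q\left(\frac{1}{T_s} - \frac{1}{T_f}\right) > 0,$$

since T_s < T_f: the soil and air gain more entropy than the water loses.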
 
  • #6
To answer your question: entropy is (para)conserved locally, meaning it can be produced but never destroyed, so no superluminal transmission of information is needed. Entropy is produced whenever a gradient (in temperature, chemical potential, velocity, ...) drives a flux of energy, matter, or momentum, and those fluxes always travel at or below the speed of light.
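In continuum language, this is the local entropy balance of standard non-equilibrium thermodynamics: the entropy density changes only through a flux across the boundary plus a non-negative production term,

$$\frac{\partial s}{\partial t} + \nabla \cdot \mathbf{J}_s = \sigma, \qquad \sigma \ge 0,$$

where s is the entropy per unit volume, \mathbf{J}_s the entropy flux, and \sigma the local production rate. Because both the flux and the production are local, no faster-than-light bookkeeping is ever needed to state the second law.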
 

Related to A naive question about entropy.

What is entropy?

Entropy is a measure of the disorder or randomness in a system. It is closely related to the amount of energy in a system that is no longer available to do useful work.
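One standard way to make "no longer available to do work" precise (a textbook relation, not part of the original answer) is via the Helmholtz free energy: for a system held at constant temperature T, the maximum extractable work is

$$W_{\max} = -\Delta F, \qquad F = U - TS,$$

so the TS term measures the portion of the internal energy U that cannot be converted into work.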

How is entropy related to thermodynamics?

Entropy is a fundamental concept in thermodynamics, where it measures the disorder of a system. It is central to the second law of thermodynamics, which states that the total entropy of a closed system never decreases over time.

Can entropy be reversed?

The second law of thermodynamics states that the total entropy of a closed system can never decrease over time. However, local decreases in entropy are possible if the overall entropy of the system increases. This means that while a small part of the system may become more ordered, the rest of the system becomes more disordered.

What is the relationship between entropy and information?

Entropy and information are closely related concepts. In information theory, entropy is a measure of the uncertainty or randomness of a message. The higher the entropy, the less predictable the message is. This is because a message with high entropy contains more "surprising" information.
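Concretely, the entropy of a source emitting symbols with probabilities p_i is given by Shannon's formula:

$$H = -\sum_i p_i \log_2 p_i \quad \text{bits per symbol},$$

which is maximised when the distribution is uniform, i.e. when the source is least predictable, matching the statement above.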

Does entropy only apply to physical systems?

While entropy is most commonly associated with physical systems, it can also be applied to biological, economic, and social systems. In these cases, entropy measures disorder or randomness in the statistical sense, rather than in the physical sense of heat and energy.
