How is entropy defined for cosmological phenomena?

In summary, defining entropy for phenomena on a cosmological scale is an unsolved problem. Thermodynamic entropy is defined for equilibrium conditions, and it is not clear that this carries over to cosmological phenomena. Gravity adds an extra layer of complexity: the entropy of self-gravitating systems can only be estimated in special cases. In classical statistical mechanics, an entropy can be assigned to any state, with equilibrium corresponding to maximum entropy, and there are attempts to define entropy for systems not in equilibrium, but the question remains debated and unsettled. Finally, an understanding of quantum gravity appears to be necessary for calculating entropy in gravitational situations.
  • #1
Stephen Tashi
How is entropy defined (if it is) for phenomena taking place on a cosmological scale?

Entropy in thermodynamics is defined for equilibrium conditions. Do we assume cosmological phenomena are approximated by equilibrium conditions?
 
  • #2
Chalnoth
Stephen Tashi said:
How is entropy defined (if it is) for phenomena taking place on a cosmological scale?
In general, this is an unsolved problem. The difficulty isn't the scale so much as gravity: we don't know how to estimate the entropy of self-gravitating systems except in very special circumstances (e.g. black holes, a universe with only a cosmological constant). Presumably discovering the precise nature of quantum gravity would allow us to make these calculations.
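For the black-hole case there is at least a concrete formula: the Bekenstein-Hawking entropy S = k_B c^3 A / (4 G ħ), proportional to the horizon area A. A minimal numerical sketch (the solar-mass input and the Python phrasing are purely illustrative, not from the thread):

```python
from scipy.constants import G, c, hbar, k, pi

# Bekenstein-Hawking entropy of a Schwarzschild black hole:
#   S = k_B * c^3 * A / (4 * G * hbar),
# with Schwarzschild radius r_s = 2 G M / c^2 and horizon area A = 4 pi r_s^2.
M = 1.989e30                      # one solar mass in kg (illustrative input)
r_s = 2 * G * M / c**2            # Schwarzschild radius
A = 4 * pi * r_s**2               # horizon area
S = k * c**3 * A / (4 * G * hbar)
print(f"S ~ {S:.2e} J/K  (~ {S / k:.2e} k_B)")
```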

Stephen Tashi said:
Entropy in thermodynamics is defined for equilibrium conditions. Do we assume cosmological phenomena are approximated by equilibrium conditions?
Not necessarily. Entropy can apply to any state in classical mechanics. A system is at equilibrium only when entropy is maximized, but there are many calculations you can do for out-of-equilibrium systems as well.
 
  • #3
Stephen Tashi
Chalnoth said:
In general, this is an unsolved problem.
I understand that calculating a quantity that has been defined may be an unsolved problem. There may also be problems in defining the quantity in the first place. My question concerns only how entropy is defined.

Not necessarily. Entropy can apply to any state in classical mechanics. A system is at equilibrium only when entropy is maximized, but there are many calculations you can do for out-of-equilibrium systems as well.

How is Entropy defined for a system that is not in equilibrium?

Must a "system" denote a probabilistic model in order for Entropy to be defined? For example, a deterministic problem in elementary mechanics describes a system that is one state at one instant of time. So there is no non-trivial probability distribution for it being in several possible states.
 
  • #4
Chalnoth
Stephen Tashi said:
How is Entropy defined for a system that is not in equilibrium?
In the way I wrote above. The counting of different microscopic states for the same large-scale features (e.g. temperature, pressure, density) makes no reference to equilibrium. It works regardless.
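To make that counting concrete, here is a toy sketch (the Einstein-solid model is my illustrative choice, not something discussed above): the macrostate fixes N oscillators and q energy quanta, Ω counts the microscopic arrangements consistent with that, and S = k_B ln Ω.

```python
from math import comb, log
from scipy.constants import k as k_B

def boltzmann_entropy(N, q):
    """Entropy of a toy Einstein solid: q indistinguishable energy quanta
    shared among N oscillators.  The macrostate (N, q) fixes the total
    energy; Omega counts the microstates compatible with it."""
    omega = comb(q + N - 1, N - 1)   # stars-and-bars count of arrangements
    return k_B * log(omega)          # S = k_B ln(Omega)

# The count makes no reference to equilibrium; it only uses (N, q).
print(boltzmann_entropy(N=100, q=300))
```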

Stephen Tashi said:
Must a "system" denote a probabilistic model in order for Entropy to be defined? For example, a deterministic problem in elementary mechanics describes a system that is one state at one instant of time. So there is no non-trivial probability distribution for it being in several possible states.
In this case, the description is for a quantum system, which is inherently probabilistic after a fashion.

In a classical system, there are other ways to get at the same result for systems where quantum mechanics isn't relevant. But those classical calculations don't help us to calculate entropy for gravitational situations: we need to use the quantum calculation, and need to understand quantum gravity to be able to do that calculation.
 
  • #5
Stephen Tashi
Chalnoth said:
In the way I wrote above. The counting of different microscopic states for the same large-scale features (e.g. temperature, pressure, density) makes no reference to equilibrium. It works regardless.

If we have a probability distribution then, of course, it's clear that we can define a Shannon entropy on the distribution. What isn't clear is whether there are "large scale" features that are sufficient to determine a particular probability distribution over the microstates for a system not in equilibrium - even if we specify that the distribution only applies "at time t".
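Just to fix notation, the entropy meant here is the Gibbs/Shannon form S = -k_B Σ_i p_i ln p_i over whatever distribution p on microstates one has. A minimal sketch (the four-state distribution is made up purely for illustration):

```python
import numpy as np
from scipy.constants import k as k_B

def gibbs_entropy(p):
    """Gibbs/Shannon entropy S = -k_B * sum_i p_i ln p_i for a probability
    distribution over microstates; states with p_i = 0 contribute nothing."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -k_B * np.sum(p * np.log(p))

# Made-up distribution over four microstates, for illustration only.
print(gibbs_entropy([0.5, 0.25, 0.125, 0.125]))
```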

For example, how is "the pressure" of a gas defined for a gas that is not in equilibrium? A quantity like "the average density" characterizes a gas in equilibrium since the density is the same everywhere in the gas. But how would "the average density" of a gas not in equilibrium distinguish among all the different ways a gas can exist in a nonequilibrium state?

When I search the web for "entropy in nonequilibrium states", I do find attempts to define entropy in nonequilibrium states (e.g. https://arxiv.org/abs/1305.3912 ), but the definition of entropy in nonequilibrium states appears to be an as yet unsettled question.

In this case, the description is for a quantum system, which is inherently probabilistic after a fashion.

That clarifies how probability gets into the picture.

In a classical system, there are other ways to get at the same result for systems where quantum mechanics isn't relevant.
This is the point I don't understand. Everything I've read indicates there is no standard definition for entropy in nonequilibrium states. As I said, I agree that conceptually it is easy to think of defining entropy if we have a probability distribution on states. The problem is whether there exists any macroscopic information about nonequilibrium conditions that is sufficient to determine a probability distribution.

Perhaps that's what you mean by saying that entropy is difficult to compute?

But those classical calculations don't help us to calculate entropy for gravitational situations: we need to use the quantum calculation, and need to understand quantum gravity to be able to do that calculation.

I'll attempt to understand that point, after I've understood what to do in the non-gravitational non-equilibrium cases!
 
  • #6
Chalnoth
Hmm, I think you're right about the difficulty of defining entropy for non-equilibrium conditions. Sorry. I misremembered my statistical mechanics.

I believe the issue is that when a system is in equilibrium, particularly thermal equilibrium, the mathematics describing the system becomes much simpler: for a gas, you can simply divide the system up into sections, each with some volume, energy, and number of particles. These three parameters are sufficient to define the entropy of the system exactly, and what's more they are also easy to measure for systems that are not in equilibrium. What I'd missed is that although you can still plug these parameters into the same formula for a gas far from equilibrium, the result is no longer valid, because the assumptions behind the formula have been broken.
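The post doesn't name the formula, but presumably the calculation described is something like the Sackur-Tetrode expression for a monatomic ideal gas, which gives S from (V, U, N) alone. A sketch under that assumption (the helium-like numbers are illustrative):

```python
import numpy as np
from scipy.constants import k as k_B, hbar, pi

def sackur_tetrode_entropy(V, U, N, m):
    """Entropy of a monatomic ideal gas from volume V (m^3), internal energy
    U (J), particle number N, and particle mass m (kg):
      S = N k_B [ ln( (V/N) * (4 pi m U / (3 N h^2))^(3/2) ) + 5/2 ].
    Only valid for a gas at (or very near) equilibrium -- the point above."""
    h = 2 * pi * hbar
    arg = (V / N) * (4 * pi * m * U / (3 * N * h**2))**1.5
    return N * k_B * (np.log(arg) + 2.5)

# Illustrative numbers: ~1 mole of helium-like atoms in 22.4 litres at ~300 K.
N = 6.022e23
U = 1.5 * N * k_B * 300.0        # U = (3/2) N k_B T for a monatomic ideal gas
print(sackur_tetrode_entropy(V=0.0224, U=U, N=N, m=6.64e-27))
```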

That said, everything I wrote about counting the number of microscopic configurations of the system that leave the large-scale properties unchanged is correct, and is fundamental to how entropy is understood today. The difficulty with non-equilibrium systems is, I believe, that defining a set of large-scale parameters that fully describes the non-equilibrium system isn't nearly so easy as just recording the volume, energy, and number of particles. There are multiple attempts at doing this, and they typically deal with near-equilibrium states.
 
  • #7
spacejunkie
The issue comes down to "self-containment", which allows one to define a kind of equilibrium. There are two ways this has been done. The first defines a comoving volume in a homogeneous space such that there is zero net flux across the boundary. The second uses the cosmological event horizon as a boundary.

One attempt to quantify total entropy is in a paper by Egan and Lineweaver
"A Larger Estimate of the Entropy of the Universe"
ApJ 710: 1825-1834 Feb 2010
https://arxiv.org/abs/0909.3983
 
  • #8
Stephen Tashi
spacejunkie said:
The issue comes down to "self-containment", which allows one to define a kind of equilibrium. There are two ways this has been done. The first defines a comoving volume in a homogeneous space such that there is zero net flux across the boundary. The second uses the cosmological event horizon as a boundary.

Yes, the paper you linked gives those two approaches as methods for applying the second law of thermodynamics, but it doesn't say how to define entropy. The "comoving volume" paragraph says:
The total entropy in a sufficiently large comoving volume of the universe does not decrease with cosmic time.
However, it doesn't say how to define an entropy associated with such a volume.

My question is at a very basic level. Before we worry about how to calculate entropy, how do we define the quantity that we are calculating?

When I think of something like a galaxy, I don't think of it as being like a gas in equilibrium, where the matter is uniformly distributed in space and individual particles have very random directions. Of course, a gas in a container sitting on a lab table is not profoundly influenced by gravity. Perhaps a cold gas "out in space" would settle into some sort of spiral or spherical shape. Does that sort of "stability" define a kind of equilibrium? How exactly is the entropy of such a formation of matter defined?

The paper says:

The system is effectively isolated because large-scale homogeneity and isotropy imply no net flows of entropy into or out of the comoving volume.

So is "large-scale homogeneity and isotropy" a rigorously defined property? Is this saying that on a large scale, the distribution of matter in the comoving volume is approximately homogeneous even though the volume contains clumps of matter arranged in solar systems and galaxies?
 

Related to How is entropy defined for cosmological phenomena?

1. What is entropy in the context of cosmology?

In cosmology, entropy is a measure of the number of microscopic configurations consistent with a system's macroscopic state, often described loosely as disorder or randomness. It is a fundamental concept in thermodynamics and is used to describe the evolution and behavior of the universe.

2. How is entropy defined mathematically for cosmological phenomena?

In thermodynamics, an entropy change is defined as the heat transferred reversibly divided by the temperature at which the transfer occurs, dS = dQ_rev / T. Statistically, entropy is expressed using the equation S = k ln Ω, where S is the entropy, k is the Boltzmann constant, and Ω is the number of microstates associated with a particular macrostate.

3. What role does entropy play in the formation and evolution of the universe?

Entropy plays a crucial role in the formation and evolution of the universe. As the universe expands, the total amount of entropy increases, leading to an increase in disorder and a decrease in the availability of usable energy. This drives the universe towards a state of maximum entropy, also known as the heat death of the universe.

4. How does the second law of thermodynamics relate to entropy in cosmology?

The second law of thermodynamics states that the entropy of a closed system will always increase over time, or remain constant in ideal cases. This is consistent with the idea of the universe moving towards a state of maximum entropy, as described in cosmology. The second law also helps explain the directionality of time and the irreversible nature of certain processes in the universe.

5. Can entropy be decreased in cosmological systems?

No, the second law of thermodynamics states that entropy will always increase or remain constant. This means that, on a large scale, entropy cannot be decreased in cosmological systems. However, localized decreases in entropy can occur through the input of energy or through the formation of ordered structures, but these processes ultimately contribute to the overall increase of entropy in the universe.
