Entropy is a scientific concept, as well as a measurable physical property, that is most commonly associated with a state of disorder, randomness, or uncertainty. The term and the concept are used in diverse fields, from classical thermodynamics, where it was first recognized, to the microscopic description of nature in statistical physics, and to the principles of information theory. It has found far-ranging applications in chemistry and physics, in biological systems and their relation to life, in cosmology, economics, sociology, weather science, climate change, and information systems including the transmission of information in telecommunication.

The thermodynamic concept was referred to by Scottish scientist and engineer Macquorn Rankine in 1850 with the names thermodynamic function and heat-potential. In 1865, German physicist Rudolf Clausius, one of the leading founders of the field of thermodynamics, defined it as the quotient of an infinitesimal amount of heat to the instantaneous temperature. He initially described it as transformation-content, in German Verwandlungsinhalt, and later coined the term entropy from a Greek word for transformation. Referring to microscopic constitution and structure, in 1862, Clausius interpreted the concept as meaning disgregation.

A consequence of entropy is that certain processes are irreversible or impossible, beyond the requirement of not violating the conservation of energy, the latter being expressed in the first law of thermodynamics. Entropy is central to the second law of thermodynamics, which states that the entropy of isolated systems left to spontaneous evolution cannot decrease with time, as they always arrive at a state of thermodynamic equilibrium, where the entropy is highest.
Austrian physicist Ludwig Boltzmann explained entropy as the measure of the number of possible microscopic arrangements or states of individual atoms and molecules of a system that comply with the macroscopic condition of the system. He thereby introduced the concept of statistical disorder and probability distributions into a new field of thermodynamics, called statistical mechanics, and found the link between the microscopic interactions, which fluctuate about an average configuration, and the macroscopically observable behavior, in the form of a simple logarithmic law, with a proportionality constant, the Boltzmann constant, that has become one of the defining universal constants for the modern International System of Units (SI).
In 1948, Bell Labs scientist Claude Shannon applied similar statistical concepts of measuring microscopic uncertainty and multiplicity to the problem of random losses of information in telecommunication signals. Upon John von Neumann's suggestion, Shannon named this entity of missing information entropy, in an analogous manner to its use in statistical mechanics, and gave birth to the field of information theory. This description has been proposed as a universal definition of the concept of entropy.
Hello. I recently discovered Gerard 't Hooft's (what a complicated name to type, isn't it?) equation for the entropy of a simple black hole (what is meant by "simple" I have no idea). It is:
Where "S" is the entropy of a simple black hole
A is the area of the...
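The equation itself didn't come through in the post. For a "simple" (Schwarzschild) black hole this is presumably the Bekenstein–Hawking entropy, which 't Hooft's work on the holographic principle builds on:

$$S = \frac{k_B c^3 A}{4 G \hbar} = \frac{k_B A}{4 \ell_P^2}$$

where ##A## is the area of the event horizon and ##\ell_P = \sqrt{G\hbar/c^3}## is the Planck length.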
The colloquial statistical mechanics explanation of entropy as if it is caused by probability is dissatisfying to me, in part because it allows highly organized (i.e. with a real potential for work) arrangements to appear as 'random fluctuations', though with very low probability. But as far as...
If all the matter in the universe is eventually headed toward disintegration into its most basic form. Not sure what that is, but for this thought experiment, let's say it's single protons.
What would happen if all those protons formed a single mass? Would that be a singularity exhibiting...
So say I smash a glass plate on a chess board much larger than the plate. Simplistically, say entropy is the number of ways of rearranging the glass pieces across the squares of the board. Over time, it's likely that entropy increases since the glass would spread out, meaning each configuration...
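The chess-board picture above can be made quantitative with a small counting sketch. Everything numeric here is an assumption of mine (100 indistinguishable shards, board sizes chosen arbitrarily); the point is only that more accessible squares mean more configurations, and hence higher S = k_B ln W:

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def entropy(n_squares, k_shards):
    """Entropy S = k_B ln W for k indistinguishable shards over n squares.

    W is the stars-and-bars count C(n + k - 1, k) of ways to
    distribute the shards among the squares.
    """
    W = math.comb(n_squares + k_shards - 1, k_shards)
    return k_B * math.log(W)

# Shards confined to an 8x8 plate-sized patch vs spread over a 64x64 board:
s_confined = entropy(8 * 8, 100)
s_spread = entropy(64 * 64, 100)
print(s_spread > s_confined)  # more accessible squares -> higher entropy
```

Distinguishable shards (W = n^k) would give the same qualitative conclusion: spreading out opens up vastly more configurations.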
So when my dad first explained the fundamental idea of thermodynamics to me, that entropy never decreases, he pointed out the odd fact that according to basically all other laws of physics, any motion or reaction could be run backwards and be just as valid as it is run forwards. It would break...
Hello! I have this GRE question:
In process 1, a monoatomic ideal gas is heated from temperature T to temperature 2T reversibly and at constant temperature. In process 2, a monoatomic ideal gas freely expands from V to 2V. Which is the correct relationship between the change in entropy ##\Delta...
I was watching the Feynman online lectures, and he talked about the arrow of time, and entropy, etc.
I have some questions.
1-Can we say non-conservative forces are time-irreversible, but conservative forces are time-reversible?
2-So from 1, if a non-conservative force acts on a system that...
Homework Statement
A flue gas is cooled from 1100 C to 150 C and the heat is used to generate saturated steam at 100 C in a boiler. The flue gas has a heat capacity given by CP/R = 3.83 + 0.000551 T, where T is in K. Water enters the boiler at 100 C and is vaporized at this temperature. Its...
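A sketch of the flue-gas side of this problem, using only the given Cp correlation (the ideal-gas treatment and the per-mole basis are my assumptions; the steam side is not treated here). Since dS = Cp dT / T, the integral splits into a log term and a linear term:

```python
import math

R = 8.314  # J/(mol K)

def delta_s_flue(T1, T2):
    """Molar entropy change of the flue gas, from integrating
    dS = Cp dT / T with Cp/R = 3.83 + 0.000551*T (T in kelvin)."""
    return R * (3.83 * math.log(T2 / T1) + 0.000551 * (T2 - T1))

T1 = 1100 + 273.15  # K, inlet
T2 = 150 + 273.15   # K, outlet
dS = delta_s_flue(T1, T2)
print(round(dS, 1))  # negative: the gas loses entropy as it cools
```

The second law is satisfied because the vaporizing water at 100 °C gains more entropy than the gas loses.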
Hunt & Ott 2015, Defining Chaos
NB: For a more introductory version, phys.org ran a piece on this article two summers ago
This paper was published as a review of the concept of chaos in the journal Chaos for the 25th anniversary of that journal. The abstract is extended with a clearer...
Is there any entropic gain when the surface of a liquid is minimised? For example, molecules "enjoy" maximum entropy when they are in the interior. Is this valid?
Dear All Gravitinos,
I write this post here to discuss a new conjecture on resolutions of the Schwarzschild singularity and the physical interpretation of the microstates of black holes (arXiv: 1606.06178, published in Nucl. Phys. B 2017...
I am only aware that the formula has to do with entropy/thermodynamics. I could really use the help on how it applies in physics and what the formula is really about.
Hello,
I am currently trying to get my head around the concept of entropy. One way to understand it is that it can be related to the number of available energy levels in a system.
From what I read, the availability of energy levels in a system:
1) increases with an increase in the system...
I was reading some articles related to entropy and I came to know that,
The term “Entropy” shows up both in thermodynamics and information theory.
Now my question is :
What’s the relationship between entropy in the information-theory sense and the thermodynamics sense?
I need some clear and...
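One concrete way to see the relationship asked about above: the two entropies are the same functional of a probability distribution, differing only by the constant out front (and the logarithm base). A minimal sketch, with an arbitrary example distribution of my own choosing:

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def shannon_entropy(probs, base=2):
    """Shannon entropy H = -sum p log p, in bits by default."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

def gibbs_entropy(probs):
    """Gibbs entropy S = -k_B sum p ln p over microstate probabilities."""
    return -k_B * sum(p * math.log(p) for p in probs if p > 0)

# Same distribution, two readings of "entropy":
p = [0.5, 0.25, 0.25]
H_bits = shannon_entropy(p)        # 1.5 bits of missing information
S = gibbs_entropy(p)               # thermodynamic-style entropy, J/K
print(H_bits)                      # 1.5
print(S / (k_B * math.log(2)))     # same 1.5: S = k_B ln2 * H_bits
```

If the p's are the probabilities of a system's microstates, the thermodynamic entropy is just k_B ln 2 times the Shannon entropy in bits; the conceptual debates are about when that identification is physically meaningful.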
Homework Statement
As you know it's finals time and I desperately need help with one physics task (it's not my main subject, but I still have to pass it -.-). Here it is: According to the data below, calculate the temperature-dependent entropy and the entropy itself for the reaction: CH4 + 2 O2 = CO2 + 2 H2O CH4...
Hi folks, I have a question, I will first write down some old truths and then ask what is unclear to me.
Now we know that Boltzmann's constant relates the average kinetic energy of one molecule of an ideal gas to its temperature T.
The German scientist Clausius defined the entropy change of some...
How is entropy defined (if it is) for phenomena taking place on a cosmological scale?
Entropy in thermodynamics is defined for equilibrium conditions. Do we assume cosmological phenomena are approximated by equilibrium conditions?
Hi.
If an ideal gas of ##N## particles is allowed to expand isothermally to double its initial volume, the entropy increase is
$$\Delta S=N\cdot k_B \cdot \log\left(\frac{V_f}{V_i}\right)=N\cdot k_B \cdot \log\left(\frac{2V}{V}\right)=N\cdot k_B \cdot \log\left(2\right)\enspace .$$
This can...
Is increasing the entropy of a low-entropy system easier than trying to increase the entropy of a high-entropy system?
Or is it vice versa?
Let's say it requires x amount of energy to increase the entropy of a low-entropy system; now, will increasing an already high-entropy system require 2x amount of energy or...
Gravity tends to make ordered structures of free particles. Does this mean that gravity is decreasing the entropy of these particles, or is there some compensating mechanism that lets the entropy increase (for example, the emergence of gravitational waves, though I doubt that's enough to compensate)?
Consider three identical boxes of volume V. the first two boxes will contain particles of two different species 'N' and 'n'.
The first box contains 'N' identical non interacting particles in a volume V. The second box contains 'n' non interacting particles. The third box is the result of mixing...
Those treatments of entropy in continuum mechanics that I've viewed on the web introduce entropy abruptly, as if it is a fundamental property of matter. For example, the current Wikipedia article on continuum mechanics ( https://en.wikipedia.org/wiki/Continuum_mechanics ) says:
Are other...
Hi I've been wondering about Boltzmann's equation
S = k ln W
Where W is the number of different distinguishable microscopic states of a system.
What I don't get is this: if it's the position and velocity of a particle that describe a microstate, doesn't that mean that W would be infinite...
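One standard resolution, sketched numerically: coarse-grain phase space into finite cells (quantum mechanics ultimately supplies the natural cell size, h per degree of freedom). The absolute entropy then depends on the arbitrary cell size, but entropy *differences* do not. A toy one-particle, position-only version, with cell sizes chosen arbitrarily:

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def positional_entropy(L, cell):
    """Coarse-grained positional entropy of one particle in a 1-D box:
    W = L / cell position cells, so S = k_B ln W."""
    return k_B * math.log(L / cell)

L = 1.0  # m, box length
for cell in (1e-3, 1e-6, 1e-9):  # arbitrary coarse-graining scales
    dS = positional_entropy(2 * L, cell) - positional_entropy(L, cell)
    print(dS / k_B)  # always ln 2, independent of the cell size
```

Doubling the box always adds k_B ln 2 per particle, whatever the cell size; only the additive constant in S is convention-dependent.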
Homework Statement
By applying the first law to a quasi static process, show that the entropy can be expressed as
S = (16σ/3c) V T^3
Homework Equations
U = 4(σ/c) V T^4
PV = 1/3 U
The Attempt at a Solution
I know I should be using
dS = dQ/T but it's unclear to me how to use this unless I...
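One route, sketched under the assumption that the system is taken through a reversible constant-volume heating from ##T = 0##: the first law for a quasi-static process gives ##dU = T\,dS - P\,dV##, so at constant ##V##

$$dS = \frac{1}{T}\left(\frac{\partial U}{\partial T}\right)_V dT = \frac{1}{T}\,\frac{16\sigma}{c}VT^3\,dT = \frac{16\sigma}{c}VT^2\,dT\enspace,$$

and integrating from ##0## to ##T## (taking ##S \to 0## as ##T \to 0##) yields ##S = \frac{16\sigma}{3c}VT^3##.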
Homework Statement
Four distinguishable particles move freely in a room divided into octants (there are no actual partitions). Let the basic states be given by specifying the octant in which each particle is located.
1. How many basic states are there?
2. The door to this room is opened...
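For part 1, each of the 4 distinguishable particles independently occupies one of 8 octants, so W = 8^4. A small counting sketch (that opening the door doubles the number of accessible cells to 16 is my assumption about the truncated part 2):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def num_states(cells, particles):
    """Distinguishable particles, each independently in one of `cells` cells."""
    return cells ** particles

W_closed = num_states(8, 4)   # 8 octants, 4 particles
print(W_closed)               # 4096 basic states

# Hypothetically, if opening the door doubles the accessible volume
# (16 cells of the original octant size):
W_open = num_states(16, 4)
dS = k_B * (math.log(W_open) - math.log(W_closed))
print(dS / k_B / math.log(2))  # each particle gains one "bit": 4 ln 2 total
```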
Bell's theorem debunks theories concerning local hidden variables.
Many people interpret that as the complete absence of local hidden variables.
Hidden variable theories were espoused by some physicists who argued that the state of a physical system, as formulated by quantum mechanics, does not...
I have trouble understanding why we classify an inviscid adiabatic incompressible flow along a streamline as isentropic
I understand this from a Thermodynamic definition/explanation
$$dS = dQ/T$$
Adiabatic, Inviscid
$$dQ =0= dS$$
So no heat added or lost, no change in entropy; I'm fine with that...
Hi community,
I'm trying to get my head round all these concepts.
So entropy is given by S = k ln W, where W is the number of microstates of a system. Now let's imagine there is a box containing a number of gas atoms; let's say the gas atoms have a current position and velocity, and say you can...
I'd like to create a simple model that demonstrates the basic quantities of thermodynamics of an ideal gas. I begin with two rooms, several molecules in them. Every datum of every individual molecule is given (position, mass, speed, etc.), so I can easily calculate the total energy, pressure...
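A minimal sketch of the kind of model described above: sample molecular velocities, then recover the temperature from the total kinetic energy via <KE> = (3/2) N k_B T. The particle count, molecular mass, and velocity spread are placeholder assumptions of mine, and there is no geometry or pressure calculation here:

```python
import math
import random

k_B = 1.380649e-23  # Boltzmann constant, J/K
m = 6.63e-26        # kg, roughly one argon atom (assumed)

# Toy "room": N molecules with Gaussian-sampled velocity components.
N = 10000
random.seed(0)  # reproducible sample
vel = [[random.gauss(0.0, 300.0) for _ in range(3)] for _ in range(N)]

# Total kinetic energy, then T from <KE> = (3/2) N k_B T.
ke_total = sum(0.5 * m * (vx*vx + vy*vy + vz*vz) for vx, vy, vz in vel)
T = 2 * ke_total / (3 * N * k_B)
print(round(T))  # temperature implied by the sampled speeds, in K
```

Pressure would follow from momentum transfer at the walls (or PV = N k_B T once T is known); adding a partition between the two rooms is then just bookkeeping on positions.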
Hi.
I read this thread with great interest and have a similar question:
In a deterministic universe, does entropy exist for Laplace's demon? Since he knows the universe down to its microstate, does the term "macrostate" even make sense to him?
And say there is a "half-demon" that only knows the...
Let's imagine a deterministic universe, one where quantum mechanics simply doesn't apply. OK.
This was the universe of classical physics. Atoms exist, and they behave deterministically. Fine. Now, how can entropy increase in this universe, although it has the same laws of physics? In a...
I'm trying to read this paper. Right now my problem is with equations 3.16 and 3.17.
I understand that in equation 3.16 we're putting some boundary conditions on the fields, but I have two problems with these boundary conditions:
1) The fields depend on both ## t_E ## and ## x##, i.e. ##...
Is there any way to measure time without reference entropy? i.e suppose the universe has maximum entropy , is there any way to define a sense of time "after" that?
Homework Statement
Hello, I am given an isothermal transition for nitrogen, N2, where temperature is constant at 700 K, p1 = 1 bar, and p2 = 100 bar.
For this problem I am not allowed to use any equations of state, such as Benedict-Webb-Rubin or Beattie-Bridgeman. Rather, I am given only Cp data...
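As a baseline before any real-gas corrections: if the nitrogen were treated as an ideal gas, the isothermal entropy change would depend only on the pressure ratio, ΔS = -R ln(p2/p1). A sketch of that baseline (treating N2 as near-ideal at 700 K is my assumption; the given Cp data would enter only through departure-function corrections):

```python
import math

R = 8.314  # J/(mol K)

# Ideal-gas baseline for the isothermal compression at 700 K.
p1, p2 = 1.0, 100.0  # bar
dS_ideal = -R * math.log(p2 / p1)
print(round(dS_ideal, 2))  # J/(mol K); negative, since compression orders the gas
```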
Hello, folks! So, I've come across this question on my physics homework, and I'm not entirely sure how to finish all the parts. I've included the parts I've gotten correct and what I've done to get those answers.
Use the exact values you enter in previous answer(s) to make later calculation(s)...
From the paper, https://arxiv.org/abs/astro-ph/0305015, on page 3,
How did the author arrive at equation (10)? Using the radiation density and (8) defined in the previous paragraph,
## ρ_r = \frac{3}{4}Ts ~## (radiation density) I think the author got it wrong (## ρ_r = \frac{4}{3}Ts ~##)...
Hello,
Few years back I was reading about calculating multiplicity using hypercube (n-cube). Multiplicity was normalized using this method. I wanted to read it again but I just cannot find it now. I tried every combination of keywords. I remember it was a Wikipedia link. Any help would be...
A closed, well-insulated container is filled with 454 g of water at 94.4 °C. To the hot water, 200 g of water ice at exactly 0 °C is added. The mixture reaches an equilibrium temperature of 41.1 °C. Assume the molar heat capacity is constant and all the processes are at constant pressure. The...
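A sketch of the total entropy change for this mixing, treating it as three reversible sub-paths: the hot water cooling, the ice melting at 0 °C, and the meltwater warming. The specific heat and heat of fusion are standard figures I'm assuming, not values given in the problem:

```python
import math

c_w = 4.184   # J/(g K), specific heat of liquid water (assumed constant)
L_f = 334.0   # J/g, heat of fusion of ice (assumed)

m_hot, T_hot = 454.0, 94.4 + 273.15   # hot water, K
m_ice, T_melt = 200.0, 273.15         # ice at 0 C, K
T_f = 41.1 + 273.15                   # final temperature, K

dS_hot = m_hot * c_w * math.log(T_f / T_hot)      # hot water cools (negative)
dS_fusion = m_ice * L_f / T_melt                  # ice melts at constant T
dS_warm = m_ice * c_w * math.log(T_f / T_melt)    # meltwater warms
dS_total = dS_hot + dS_fusion + dS_warm
print(round(dS_total, 1))  # J/K; positive, as the second law requires
```

The hot water's entropy loss is more than offset by the gains from melting and warming the ice, so the spontaneous mixing has ΔS_total > 0.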
Hello everyone!
I am an undergraduate student from Greece in my first semester of Mechanical Engineering, but I am fascinated with physics. I've been studying some physics books from my university's library and reached the chapter of entropy. I understand the 2nd law of thermodynamics but what...
When we talk about entropy, we say it comes from our inability to completely describe the state of a system. Also, we say it is a property of the system (like enthalpy). That's confusing me a lot. If entropy is a property, how can it come from our inability to describe the system? Or it's just a...
Homework Statement
The problem requires me to find the entropy of a diffusion process as a function of time (I guess in terms of the diffusion coefficient)
Homework Equations
Perhaps Heat / Diffusion kernel
S = -k ∫ p ln p
The Attempt at a Solution
I assume it was a delta initial condition then...
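If the delta initial condition is right, the solution of the 1-D diffusion equation is a spreading Gaussian with variance 2Dt, and its differential entropy (in units of k) grows logarithmically in time. A sketch under that assumption, with an arbitrary diffusivity:

```python
import math

def gaussian_entropy(D, t):
    """Differential entropy (in units of k) of the 1-D diffusion kernel
    p(x,t) = exp(-x^2 / (4 D t)) / sqrt(4 pi D t).

    A Gaussian of variance s^2 has entropy (1/2) ln(2 pi e s^2);
    here s^2 = 2 D t, giving (1/2) ln(4 pi e D t).
    """
    return 0.5 * math.log(4 * math.pi * math.e * D * t)

D = 1e-9  # m^2/s, a typical liquid-phase diffusivity (assumed)
for t in (1.0, 10.0, 100.0):
    print(gaussian_entropy(D, t))  # grows by (1/2) ln 10 per decade of t
```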
I learned that
$$ dS = \frac Q T$$
In free expansion of an ideal gas, it is obvious that Q = 0. However, the entropy increases. I guess the reason is that the process is not quasistatic. If I am right, why is this process not quasistatic? If I am not, what's wrong with the formula...
Hello, I have to find the probability density which maximizes the entropy subject to the following constraints:
$$\bar{x} = \int x\rho(x)\,dx$$
$$\int \rho(x)\,dx = 1$$
The entropy is
$$S = -\int \rho(x) \ln(\rho(x))\,dx$$
and the Lagrangian is
$$L = -\int \rho(x) \ln(\rho(x))\,dx - \lambda_1 \left( \int \rho(x)\,dx - 1 \right) -...$$
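A sketch of where that Lagrangian leads, assuming the truncated final term is ##\lambda_2 \left(\int x\rho(x)\,dx - \bar{x}\right)## and the support is ##x \in [0, \infty)## so both constraints can be satisfied. Setting the functional derivative to zero,

$$\frac{\delta L}{\delta \rho} = -\ln\rho(x) - 1 - \lambda_1 - \lambda_2 x = 0 \quad\Rightarrow\quad \rho(x) = e^{-1-\lambda_1}\,e^{-\lambda_2 x}\enspace,$$

and fixing ##\lambda_1, \lambda_2## from the two constraints gives the exponential density ##\rho(x) = \frac{1}{\bar{x}}\, e^{-x/\bar{x}}##.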
Hello,
In my textbook I read this example:
A gas in an isolated system expands after pulling out a separating plate, so its volume increases and there is no work or heat exchange.
the entropy change of the ideal gas is
$$\Delta S = n R \ln \frac {V_2} {V_1}$$
and the second law of thermodynamics...
When calculating heat transfer, how would one know when to use Q=T*m*(delta s) versus Q=m*(delta h). I'm confused when we should calculate using entropy versus enthalpy. Anything helps. Thank you!
Homework Statement
52 distinguishable particles have been in a box long enough to reach equilibrium. The box is divided into two equal-volume cells. Let's say that there are 10^3 sub-states (s1 through s1000) available to each particle on each side, regardless of how many other particles are...
I am trained in aeronautical engineering, and spent a number of years supporting sounding rockets in the exploration of the upper atmosphere. Retired 16 years ago. I am interested in general relativity and quantum physics. Looking to explore an understanding of entropy... black holes... etc.