Entropy is a scientific concept, as well as a measurable physical property, that is most commonly associated with a state of disorder, randomness, or uncertainty. The term and the concept are used in diverse fields, from classical thermodynamics, where it was first recognized, to the microscopic description of nature in statistical physics, and to the principles of information theory. It has found far-ranging applications in chemistry and physics, in biological systems and their relation to life, in cosmology, economics, sociology, weather science, climate change, and information systems, including the transmission of information in telecommunication.

The thermodynamic concept was referred to by Scottish scientist and engineer Macquorn Rankine in 1850 with the names thermodynamic function and heat-potential. In 1865, German physicist Rudolf Clausius, one of the leading founders of the field of thermodynamics, defined it as the quotient of an infinitesimal amount of heat to the instantaneous temperature. He initially described it as transformation-content, in German Verwandlungsinhalt, and later coined the term entropy from a Greek word for transformation. Referring to microscopic constitution and structure, in 1862, Clausius interpreted the concept as meaning disgregation.

A consequence of entropy is that certain processes are irreversible or impossible, beyond the requirement of not violating the conservation of energy, the latter being expressed in the first law of thermodynamics. Entropy is central to the second law of thermodynamics, which states that the entropy of isolated systems left to spontaneous evolution cannot decrease with time, as they always arrive at a state of thermodynamic equilibrium, where the entropy is highest.
Austrian physicist Ludwig Boltzmann explained entropy as the measure of the number of possible microscopic arrangements or states of individual atoms and molecules of a system that comply with the macroscopic condition of the system. He thereby introduced the concepts of statistical disorder and probability distributions into a new field of thermodynamics, called statistical mechanics, and found the link between the microscopic interactions, which fluctuate about an average configuration, and the macroscopically observable behavior, in the form of a simple logarithmic law, with a proportionality constant, the Boltzmann constant, that has become one of the defining universal constants for the modern International System of Units (SI).
In 1948, Bell Labs scientist Claude Shannon applied similar statistical concepts of microscopic uncertainty and multiplicity to the problem of random losses of information in telecommunication signals. Upon John von Neumann's suggestion, Shannon named this quantity of missing information entropy, in an analogous manner to its use in statistical mechanics, giving birth to the field of information theory. This description has since been proposed as a universal definition of the concept of entropy.
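Shannon's measure can be made concrete in a few lines of code; here is a minimal sketch (the distributions are illustrative) of the entropy H = -Σ p_i log2 p_i in bits:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p))."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries exactly 1 bit of uncertainty; a biased coin carries less,
# and a certain outcome carries none.
print(shannon_entropy([0.5, 0.5]))  # 1.0
print(shannon_entropy([0.9, 0.1]))  # ~0.469
print(shannon_entropy([1.0]))       # 0.0
```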
I'm trying to relate an analogy from Brian Greene about entropy microstates/macrostates to the real world. In the analogy, you have 100 coins that you flip. The microstate is which particular coins landed heads up. The macrostate is the total number of coins that are heads up. So a low entropy...
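The coin analogy can be made quantitative: the multiplicity of the macrostate "k heads" is the binomial coefficient C(100, k). A short sketch (using the natural log, so the entropy comes out in units of k_B):

```python
from math import comb, log

# Number of microstates (which particular coins are heads) for each
# macrostate (total number of heads) of 100 flipped coins.
N = 100
omega_all_heads = comb(N, N)     # exactly one arrangement
omega_half = comb(N, N // 2)     # ~1.01e29 arrangements

# Boltzmann-style entropy in units of k_B: S = ln(Omega)
print(log(omega_all_heads))  # 0.0   -> low-entropy macrostate
print(log(omega_half))       # ~66.8 -> high-entropy macrostate
```

This is why a half-heads macrostate is overwhelmingly more likely than all-heads: it is realized by vastly more microstates.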
Hi, I am currently reading Introduction to Statistical Physics by Huang. In the section on entropy, it reads
But what if I choose ##R-P## as a closed cycle? Then in a similar process, I should have ##\int_{R} \frac {dQ} {T} \leq \int_{P} \frac {dQ} {T}## and ##S \left ( B \right ) - S \left (...
Attempt at a Solution:
Heat Absorbed By The System
By the first law of thermodynamics,
dU = dQ + dW
The system is of fixed volume and therefore mechanically isolated.
dW = 0
Therefore
dQ = dU
The change of energy of the system equals the change of energy of the gas plus the change of energy...
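As a minimal numeric illustration of dQ = dU at fixed volume (the gas amount and temperature rise are assumed values, taking an ideal monatomic gas):

```python
# Fixed volume => dW = 0, so all heat goes into internal energy: dQ = dU.
R = 8.314          # J/(mol K)
n = 2.0            # mol (assumed)
Cv = 1.5 * R       # molar heat capacity at constant volume, monatomic ideal gas
dT = 10.0          # K temperature rise (assumed)

dU = n * Cv * dT   # change in internal energy
dQ = dU            # heat absorbed, since dW = 0
print(dQ)          # ~249.4 J
```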
The von Neumann entropy for an observable can be written ##s=-\sum\lambda\log\lambda##, where the ##\lambda##'s are its eigenvalues. So suppose you have two different pvm observables, say ##A## and ##B##, that both represent the same resolution of the identity, but simply have different...
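For reference, a small sketch of the eigenvalue formula (plain Python; the example spectra are chosen for illustration):

```python
from math import log

def von_neumann_entropy(eigenvalues):
    """s = -sum(lam * ln(lam)) over the eigenvalues of a density matrix.
    Terms with lam = 0 contribute nothing (0*log 0 -> 0 by convention)."""
    return -sum(lam * log(lam) for lam in eigenvalues if lam > 0)

# Maximally mixed qubit, eigenvalues (1/2, 1/2): entropy = ln 2
print(von_neumann_entropy([0.5, 0.5]))  # ~0.693
# Pure state, eigenvalues (1, 0): entropy = 0
print(von_neumann_entropy([1.0, 0.0]))  # 0.0
```

Since the entropy depends only on the spectrum, two observables sharing the same resolution of the identity but different eigenvalues generally give different values of s.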
I want to know what are the QFT topics that I need to understand in order to proceed in reading papers on entanglement entropy such as,
Entanglement Entropy and Quantum Field Theory
Entanglement entropy in free quantum field theory
Entanglement entropy: holography and renormalization group
An...
The entropy change for a reversible adiabatic process is zero, as the entropy remains constant. Is this a reversible process?
assuming T1>T2:
hot (h) water has mass M, temp T1
cold (c) water has mass nM, temp T2
let the final temperature be Tf
if δQ=0 as the process is adiabatic, |Qh|=|Qc| so Qh=-Qc...
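A numeric sketch of this setup (the masses, temperatures, and specific heat are illustrative assumptions): the energy balance |Qh| = |Qc| gives T_f, and integrating dQ/T = mc dT/T for each mass gives the entropy changes:

```python
from math import log

def mix_entropy(M, n, T1, T2, c=4186.0):
    """Adiabatic mixing of mass M at T1 with mass n*M at T2 (temps in K).
    Returns (final temperature, total entropy change in J/K)."""
    Tf = (M * T1 + n * M * T2) / (M + n * M)  # energy balance, |Qh| = |Qc|
    dS_hot = M * c * log(Tf / T1)             # negative: hot water cools
    dS_cold = n * M * c * log(Tf / T2)        # positive: cold water warms
    return Tf, dS_hot + dS_cold

Tf, dS = mix_entropy(M=1.0, n=1.0, T1=350.0, T2=300.0)
print(Tf)  # 325.0
print(dS)  # > 0: the mixing is irreversible
```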
ΔU_A + ΔU_B = 0 (Is this because it is an isolated system? Am I right?)
ΔU_A = CA * (T_final - T_A )
ΔU_B=CB * (T_final-T_B)
And because of a very slow (quasi-static) process: dS = C dT/T, which integrates to ΔS = C ln(T_f/T_i)
T_final= (CA T_A + CB T_B)/(CA + CB)
ΔS_total = CA*ln(T_f/T_A) + CB*ln(T_f/T_B)
My QUESTION is :
When we say No heat exchange...
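The T_final and ΔS expressions above can be spot-checked numerically (the heat capacities and temperatures here are illustrative assumptions):

```python
from math import log

def final_temp_and_dS(CA, CB, TA, TB):
    """Two bodies with total heat capacities CA, CB (J/K) placed in thermal
    contact inside an isolated box, so dU_A + dU_B = 0. Temps in kelvin."""
    Tf = (CA * TA + CB * TB) / (CA + CB)
    dS = CA * log(Tf / TA) + CB * log(Tf / TB)
    return Tf, dS

Tf, dS = final_temp_and_dS(CA=100.0, CB=300.0, TA=400.0, TB=300.0)
print(Tf)  # 325.0
print(dS)  # positive: zero only when TA == TB
```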
In a shrinking universe heat will increase, but also volume available to place particles will decrease. What happens to entropy when the volume gets very small and the temperature is very high?
If ##N## is constant (per the partial derivatives definitions/ the subscripts after the derivatives) then ##G## is constant
##H - TS = constant##
Taking the derivative of both sides with respect to ##T## while holding ##N,P## constant we get the following with the use of the product rule...
I was studying statistical mechanics when I came to know about the Boltzmann's entropy relation, ##S = k_B\ln Ω##.
The book mentions ##Ω## as the 'thermodynamic probability'. But, even after reading, I can't understand what it means. I know that in a set of ##Ω_0## different accessible states...
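Whatever interpretation one attaches to ##Ω##, the relation itself is easy to compute with; a small sketch (multiplicities chosen for illustration) showing that multiplicative multiplicities give additive entropies:

```python
from math import log

k_B = 1.380649e-23  # J/K, the exact SI value

def boltzmann_entropy(omega):
    """S = k_B * ln(Omega), with Omega the number of accessible microstates."""
    return k_B * log(omega)

# For independent subsystems the multiplicities multiply,
# so the entropies add:
S1 = boltzmann_entropy(1e20)
S2 = boltzmann_entropy(1e30)
S_joint = boltzmann_entropy(1e20 * 1e30)
print(abs(S_joint - (S1 + S2)) < 1e-30)  # True
```

This additivity is why the logarithm appears in the formula at all.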
In a (reversible) Carnot cycle the entropy increase of the system during isothermal expansion at temperature TH is the same as its decrease during isothermal compression at TC. We can conclude that the entropy change of the system is zero after a complete Carnot cycle.
The mentioned textbook now...
In a heat engine we define a heat source from which heat is transferred to the system; we say that the heat source has a temperature ##T_h##. When we define a Carnot heat engine, the first process we have is an isothermal expansion, and we say heat has to come into the system through this process, and here...
Planck states that all perfect crystalline systems have the same entropy in the limit as T approaches zero, so we can set the entropy equal to zero. Can we demonstrate that, or is it only a presumption?
How do we calculate entropy from the positions and velocities of gas molecules?
Let's say we have 2 different gases. The entropy should be bigger after mixing them than before, when they are separated. But how do we calculate the exact entropies knowing only the positions and velocities of the gas molecules?
A short background: My question focuses solely on the part of the refrigeration cycle to do with the compressor, where the cycle begins. The first state is before the refrigerant enters the compressor, and the second state is after the refrigerant leaves the compressor. My goal is to obtain...
Hello everyone,
I have to write a paper about entropy and how it relates to the laws of thermodynamics, energy, and work. I have taken a deductive approach starting from the zeroth law and going to the second law of thermodynamics, as follows.
Entropy is the disorder of a system (Class Video, 2019)...
I'm kinda confused on the concept of entropy of everyday, low entropy states like macroscopic objects. It is said that the entropy is a measure of disorder, or distinguishability between macroscopic states.
Can two objects which are macroscopically distinguishable/look different have the same...
According to the Everett interpretation, or many-worlds interpretation, of quantum mechanics, with each decision an observer makes, the world splits into two parallel universes. Let's say an observer at some point in spacetime tests the Schrödinger's cat experiment; in one branch of the universe the cat...
for a) ##\Delta S=\mp \int_{T_i}^{T_0}\frac{C(T)}{T}\,dT## and ##\Delta S_{th}=\int\frac{dQ}{T_0}##, so ##S_{univ}=\Delta S_{th}+\Delta S##.
What is ##dQ## equal to? I don't know how to answer question b).
Thank you for your help.
If you take a system with fixed entropy S0 and let it evolve, it reaches equilibrium. Let Ueq be the energy of the system at equilibrium.
Now take the same system with fixed energy U=Ueq (S is not fixed anymore), how do you know that the equilibrium reached is the same as before, that means with...
I know that the entropy of a system is the same in different inertial frames. Is this still the case for non inertial frames? For example, is the entropy of a body as seen from a rotating reference frame the same as the entropy seen from a fixed frame?
A hypothetical question. Heat Q is transferred from water to a metallic solid. Both have the same heat capacities and the same initial temperature. Now, since molecules in a liquid are more randomly oriented than in a solid, will the entropy decrease of the liquid be more than the entropy increase of...
Why is it that when we supply energy to a biological body, the body can keep its entropy from increasing? We know that, by definition, temperature equals the partial derivative of internal energy with respect to entropy. So when temperature is constant, if internal energy increases (by supplying energy to the body), then...
Dear admins and moderators,
I am sure this subject has come up many times before, and could well be a stupid question. If so, could you direct me to the relevant thread(s)?
Setting aside its itinerant electron (the hydrogen atom), the proton is THE building block of the Universe.
"...Despite...
Hi.
Processes involving a friction force whose direction somehow depends on the direction of the velocity, such as ##\vec{F}=-\mu\cdot\vec{v}##, aren't symmetric with respect to time reversal. If you play it backwards, this force would be accelerating.
On the other hand, friction dissipates...
What is the entropy change of the system in the Gibbs Free Energy Equation?
The general expression for entropy change is ΔS = q_rev/T
The only exchange between the system and the surroundings is the heat ΔH, transferred reversibly, with no PV work and no matter transfer; therefore
q(syst) = ΔH(syst)
therefore surely...
The text says:
"A steel bullet of 25 kg with a temperature of 400 °C is dropped to the bottom of an oil bath of 100 kg at a temperature of 100 °C. The system is isolated. Calculate
a) The change of entropy of the bullet,
b) the change of entropy of the oil,
c) the total change of...
What about if we allow for a temperature and volume change in a solid or a liquid?
Would the entropy change still only depend on the temperature change or also on the volume change.
For a solid I would think that the volume change doesn't matter since it doesn't change the "amount of disorder"...
Problem Statement: 1 kg of water at 273 K is brought into contact with a heat reservoir at 373 K. When the water has reached 373 K, what is the entropy change of the water, of the heat reservoir, and of the universe?
Relevant Equations: dS=Cp*(dT/T)-nR*(dP/P)
dS=Cv*(dT/T)+nR*(dV/V)
I am...
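A numeric sketch of this problem (taking c_p ≈ 4186 J/(kg·K), a value the problem statement does not fix):

```python
from math import log

m, c = 1.0, 4186.0      # kg of water, J/(kg K); c is an assumed value
Ti, Tr = 273.0, 373.0   # initial water temperature, reservoir temperature (K)

Q = m * c * (Tr - Ti)              # heat absorbed by the water
dS_water = m * c * log(Tr / Ti)    # integral of dQ/T along a reversible path
dS_res = -Q / Tr                   # the reservoir stays at 373 K throughout
dS_univ = dS_water + dS_res

print(dS_water)  # ~ +1306.6 J/K
print(dS_res)    # ~ -1122.3 J/K
print(dS_univ)   # ~ +184.3 J/K, positive: the process is irreversible
```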
In a recent study (https://phys.org/news/2018-08-flaw-emergent-gravity.html) an important flaw has been discovered in Emergent/Entropic Gravity: it has been found that holographic screens cannot behave according to thermodynamics...
But then, doesn't this also invalidate...
The multiplicity of states for a particle in a box is proportional to the product of the volume of the box and the surface area of momentum space.
$$ \Omega \propto V_{volume}\,A_{momentum}$$
The surface area in momentum space is given by the equation:
$$p^{2}_{x}+ {p}^{2}_{y}+{p}^{2}_{z} =...
Although I've read many papers that propose a relation between action and entropy, I've been told that there is no generally accepted relation in physics.
But how/why are these concepts unrelated?
What about Nobel laureate Frank Wilczek? He proposes that entropy and action are closely related...
So I get into these discussions on other ... less scientific ... fora, and then run into trouble and have to come here for correct answers.
I state these as assumptions but they are really questions. Please correct.
Entropy is usually applied in a thermodynamics context, but it can be applied...
If we reversed the second law of thermodynamics, and entropy decreased, and also managed to reverse the motion for all matter in the universe, so that in 24 hours, everything would return to the state it was at then, would time be said to be flowing backwards? Or would time still be flowing...
Hi,
consider an adiabatic irreversible process carrying a thermodynamic system from initial state A to final state B: this process is accompanied by a positive change in system entropy (call it ##S_g##). Then consider a reversible process between the same initial and final system state. Such...
I tried following:
$$ dS_{\text{total}} = \left|\frac{dQ}{T_c}\right| - \left|\frac{dQ}{T_h}\right| $$
where ##T_h## is the temperature of the hot water and ##T_c## is the temperature of the cold water. The specific heat of water wasn't provided in the assignment, so I used the value c = 4190 J/(kg·K).
$$ dS_{\text{total}} =...
Help!
Hi. In the second law of thermodynamics, we have the entropy "S".
Well, I need help with this:
We have dS ≈ dQ
Then we have dS = λ*dQ
where λ = λ(T, ...)
I have to demonstrate that:
λ = 1/T, where T = temperature.
Thanks for the advice and help!
I saw another post about dS = dQ/T, but the subject of the question was different — not related to the entropy of the universe.
This is what i understand from this formula:
As the temperature goes down, the entropy goes up. Is this not the opposite (contradictory) to what entropy (disorder) is about...
My understanding is that to define the entropy of a system what you have to do is as follows:
Define the boundaries of your system.
Define a set of "microstates" of the system.
Define a partition of microstates of the system where each element of the partition is measurable and known as a...
Hi, I am also having a problem with the assumption entropy makes about the movement of molecules in 3D space. Does it assume that gas molecules have an equal chance of going in any direction? If so, then how is that possible outside a free-fall lab, as the gravity bias would always make all the...
For ##\{p_i\}_{i=1}^{n}## being the probability distribution, I want to show that a Huffman reduction cannot be decreasing, and I reached a point where I need to show that
q + H(p_1, ..., p_{n-s}, q) ≥ H(p_1, ..., p_n), where
q = p_{n-s+1} + ... + p_n and s is chosen such that 2 ≤ s ≤ m (m ≥ n) and s ≡ n (mod m-1)
where...
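For the binary case s = 2 the inequality follows from the grouping identity of Shannon entropy, since the merged pair contributes at most 1 bit; here is a numeric spot-check (the example distribution is an assumption):

```python
from math import log2

def H(probs):
    """Shannon entropy in bits."""
    return -sum(p * log2(p) for p in probs if p > 0)

# Grouping identity: H(p_1..p_n) = H(p_1..p_{n-s}, q) + q * H(tail / q),
# where q is the total probability of the merged tail.
p = [0.4, 0.3, 0.2, 0.1]
s = 2
q = sum(p[-s:])
merged = p[:-s] + [q]
tail = [pi / q for pi in p[-s:]]

lhs = q + H(merged)
rhs = H(p)
print(lhs >= rhs)  # True, since H(tail) <= log2(s) = 1 bit when s = 2
```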
This is my first post and I need to preface my question by saying I have no physics background, so I'm genuinely asking for help in understanding.
A thought occurred to me about the continuing expansion and acceleration of the universe and I'm asking for your help in understanding where my...
Hey guys, so I am reading this book, and on pages 89-90 the author says:
"Increasing temperature corresponds to a decreasing slope on the entropy vs. energy graph." Then a sample graph is provided, and both in that graph and in the numerical analysis given on page 87 the slope is observed to be an...