Entropy is a scientific concept, as well as a measurable physical property, that is most commonly associated with a state of disorder, randomness, or uncertainty. The term and the concept are used in diverse fields, from classical thermodynamics, where it was first recognized, to the microscopic description of nature in statistical physics, and to the principles of information theory. It has found far-ranging applications in chemistry and physics, in biological systems and their relation to life, in cosmology, economics, sociology, weather science, climate change, and information systems, including the transmission of information in telecommunication.

The thermodynamic concept was referred to by Scottish scientist and engineer Macquorn Rankine in 1850 with the names thermodynamic function and heat-potential. In 1865, German physicist Rudolf Clausius, one of the leading founders of the field of thermodynamics, defined it as the quotient of an infinitesimal amount of heat to the instantaneous temperature. He initially described it as transformation-content, in German Verwandlungsinhalt, and later coined the term entropy from a Greek word for transformation. Referring to microscopic constitution and structure, in 1862, Clausius interpreted the concept as meaning disgregation.

A consequence of entropy is that certain processes are irreversible or impossible, beyond the requirement of not violating the conservation of energy, the latter being expressed in the first law of thermodynamics. Entropy is central to the second law of thermodynamics, which states that the entropy of an isolated system left to spontaneous evolution cannot decrease with time, as it always arrives at a state of thermodynamic equilibrium, where the entropy is highest.
Austrian physicist Ludwig Boltzmann explained entropy as a measure of the number of possible microscopic arrangements, or states, of the individual atoms and molecules of a system that comply with the macroscopic condition of the system. He thereby introduced the concepts of statistical disorder and probability distributions into a new field of thermodynamics, called statistical mechanics, and found the link between the microscopic interactions, which fluctuate about an average configuration, and the macroscopically observable behavior, in the form of a simple logarithmic law with a proportionality constant, the Boltzmann constant, which has become one of the defining universal constants of the modern International System of Units (SI).
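The logarithmic law in question is Boltzmann's S = k_B ln W, where W counts the microstates compatible with a macrostate. A toy illustration in Python (the 100-particle two-state system and the "50 excited" macrostate are invented for illustration, not taken from the text above):

```python
import math

kB = 1.380649e-23  # Boltzmann constant, J/K

# Toy model: 100 two-state particles; macrostate = "exactly 50 are excited".
W = math.comb(100, 50)   # number of microstates realizing that macrostate
S = kB * math.log(W)     # Boltzmann's S = kB ln W
print(S)                 # on the order of 9e-22 J/K
```

The macrostate with half the particles excited has the largest W, which is the counting sense in which equilibrium corresponds to maximum entropy.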
In 1948, Bell Labs scientist Claude Shannon applied similar statistical concepts, measuring microscopic uncertainty and multiplicity, to the problem of random losses of information in telecommunication signals. At John von Neumann's suggestion, Shannon named this quantity of missing information entropy, in analogy with its use in statistical mechanics, and thereby gave birth to the field of information theory. This description has since been proposed as a universal definition of the concept of entropy.
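Shannon's measure has the form H = −Σ pᵢ log₂ pᵢ. A minimal sketch in Python, with made-up example distributions:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)), skipping zero terms."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries 1 bit of uncertainty; a biased coin carries less.
print(shannon_entropy([0.5, 0.5]))    # 1.0
print(shannon_entropy([0.9, 0.1]))    # less than 1 bit
print(shannon_entropy([0.25] * 4))    # 2.0: four equally likely outcomes
```

The uniform distribution maximizes H for a given number of outcomes, mirroring the statistical-mechanics picture above.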
I don't really understand why the entropy of the universe must always increase. I know that only reversible processes keep entropy constant, but why do real processes always increase the entropy of the universe?
Sorry for my bad English; I'm not a native speaker.
Homework Statement
A gas sample containing 3.00 moles of helium gas undergoes a state change from 30 °C and 25.0 L to 45 °C and 15.0 L. What is the entropy change for the gas (assumed ideal)? For He, Cp = 20.8 J/(K·mol).
Homework Equations
ΔS = n·Cv·ln(Tf/Ti) + n·R·ln(Vf/Vi) = ...
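Plugging in the numbers from the problem statement (with Cv = Cp − R for an ideal gas, and temperatures converted to kelvin) gives roughly −10.9 J/K. A quick numerical check, not an endorsed solution:

```python
import math

n = 3.00                  # mol
R = 8.314                 # J/(K mol)
Cp = 20.8                 # J/(K mol), given
Cv = Cp - R               # ideal gas relation Cp - Cv = R
Ti, Tf = 303.15, 318.15   # K (30 C and 45 C)
Vi, Vf = 25.0, 15.0       # L (only the ratio matters)

dS = n * Cv * math.log(Tf / Ti) + n * R * math.log(Vf / Vi)
print(round(dS, 2))       # about -10.93 J/K: the compression term dominates
```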
Hi everyone, I have a few questions I'd like to ask regarding what I have read/heard about these two definitions of entropy. I also believe that I have some misconceptions about entropy and as such I'll write out what I know while asking the questions in the hope someone can correct me. Thanks...
If a reservoir is in thermal contact with a system, why is its entropy change simply Q/T? Shouldn't this equation only be valid for a reversible process? Why is it reversible?
Homework Statement
What is the contribution of the conduction electrons to the molar entropy of a metal with a given electronic coefficient of specific heat? I can't figure out how to approach this; which relation or theory might lead to it?
And how is this answer relevant to the point of molar...
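For what it's worth, if the electronic specific heat has the usual low-temperature form C_el = γT, then integrating C/T from 0 to T gives an electronic entropy S_el = γT as well. A quick numerical sketch; the γ value below is an assumed, roughly copper-like Sommerfeld coefficient, not from the post:

```python
gamma = 0.695e-3   # J/(K^2 mol), assumed Sommerfeld coefficient (Cu-like)
T = 300.0          # K

# S_el = integral_0^T (C_el / T') dT' = integral_0^T gamma dT' = gamma * T
S_el = gamma * T
print(S_el)        # ~0.21 J/(K mol), tiny compared with lattice entropy
```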
There are two aspects of uncertainty:
(a) how far the distribution is from the situation where all possibilities are equally probable;
(b) how spread out the values are.
In discussions about (Shannon) entropy and information, the first aspect is emphasized, whereas in discussions about the standard...
I never really understood the concept of entropy through classical thermodynamics. Here are a few questions.
1. The change in entropy dS in an isolated system is always ≥ 0, but how does that imply the system tends to a state of maximum entropy? How do we know that a maximum exists?
2. Why is...
Homework Statement
How does a reversible process in the universe imply that the entropy doesn't increase? I understand that the change of entropy over a closed reversible cycle is zero for the system, but I don't get why a non-closed reversible process...
Sir Roger Penrose in his book Cycles of Time on page 19 states the result of a calculation of probability of mixing red and blue balls as an illustration of entropy as state counting and the Second Law. He assumes an equal number of each. There is a cube of 10^8 balls on an edge subdivided into...
Many people say there are similarities and points of contact between quantum entanglement and superposition on the one hand and entropy on the other. I'd like to learn more about this.
Hello,
The state ##|W\rangle = \frac{1}{\sqrt{3}}(|001\rangle + |010\rangle + |100\rangle)## is entangled.
What would the Schmidt decomposition be for ##|W\rangle##?
I am also interested in writing the reduced density matrix, but I need the basis...
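A small sketch of how the reduced density matrix of the first qubit can be computed directly from the amplitudes (pure Python, no libraries; the partition into qubit 1 versus qubits 2 and 3 is one choice of bipartition):

```python
from math import isclose, sqrt

a = 1 / sqrt(3)
# Amplitudes of |W> = (|001> + |010> + |100>)/sqrt(3)
psi = {(0, 0, 1): a, (0, 1, 0): a, (1, 0, 0): a}

# Reduced density matrix of qubit 1: trace out qubits 2 and 3.
rho = [[0.0, 0.0], [0.0, 0.0]]
for i in range(2):
    for j in range(2):
        rho[i][j] = sum(psi.get((i, b, c), 0.0) * psi.get((j, b, c), 0.0)
                        for b in range(2) for c in range(2))
print(rho)  # diag(2/3, 1/3)
```

The eigenvalues 2/3 and 1/3 give Schmidt coefficients √(2/3) and √(1/3) for this bipartition, i.e. ##|W\rangle = \sqrt{2/3}\,|0\rangle \otimes \frac{|01\rangle+|10\rangle}{\sqrt{2}} + \sqrt{1/3}\,|1\rangle \otimes |00\rangle##.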
Homework Statement
If a rigid adiabatic container has a fan inside that provides 15000 J of work to an ideal gas inside the container, would the change in entropy be the same as if 15000 J of heat were provided to the same rigid container (with the insulation removed)?
Relevant equations...
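One way to see why the two cases agree: in a rigid container both routes raise the internal energy by the same 15000 J, so they reach the same final state, and entropy is a state function. A sketch with assumed numbers (the gas amount, type, and initial temperature below are invented for illustration):

```python
import math

n, R = 2.0, 8.314                # mol (assumed), J/(K mol)
Cv = 1.5 * R                     # assumed monatomic ideal gas
Ti = 300.0                       # K, assumed initial temperature
dU = 15000.0                     # J: fan work OR heat, same energy input

Tf = Ti + dU / (n * Cv)          # rigid container: no boundary work, dU = n Cv dT
dS = n * Cv * math.log(Tf / Ti)  # same final state either way, so same dS
print(round(dS, 1))
```

The difference shows up in the surroundings: heat transfer from a reservoir changes the reservoir's entropy, while fan work does not.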
My background is that I'm an applied mathematician and engineer, self-taught in GR and QFT. It's an old idea, in some dozen or so SciFi books. But I'm looking for a mathematical framework for handling it. The second law of thermodynamics, that entropy always increases in a closed system, can be...
Imagine there is a radiation concentrator (Winston cone) surrounded by very many layers of foil for radiation insulation, except at the smaller opening. Every part of the setup is initially in thermal equilibrium with the surroundings. The amount of thermal radiation flowing through the...
Hi,
Could you please help me to clarify a few points to understand entropy intuitively?
Entropy is defined as:
Please have a look at the attachment, "entropy111".
Source of attachment: http://faculty.chem.queensu.ca/people/faculty/mombourquette/chem221/4_secondthirdlaws/SecondLaw.asp
The...
Homework Statement
When the air outside is very cold and dry, your climate control system must humidify the cabaret air so that the singers don't lose their voices. The climate control lets pure water evaporate into the dry air and raises the moisture content of that air. As this evaporation...
Homework Statement
During the fall, the outside air's temperature is comfortable but its humidity is too high for direct use inside the cabaret. The air feels clammy and damp. So your climate control system chills the outdoor air to extract some of its moisture and then reheats that air back up...
From a heuristic standpoint it makes sense that when a system goes from being periodic to chaotic, the occupied volume of the phase space increases (while not violating liouville theorem). Since the volume of phase space is proportional if not equal to the entropy, shouldn’t entropy always...
When selecting rare entropy sources for a TRNG, and one can see similarities through an applied hidden Markov model, will it still be good entropy?
(The structure is the same, even though the type of source input is different.)
If we have two sequences s1 and s2, both of N coin tosses, is the entropy of getting two sequences that are exactly the same then lower than sequences of which can be said that they differ by x incident tosses? Is the entropy of getting sequences s1 and s2 that differ by N/2 tosses the highest...
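One way to make the question concrete: fix s1 and count the sequences s2 that differ from it in exactly x of the N positions. That count is the binomial coefficient C(N, x), which peaks at x = N/2, so "differing in about half the tosses" is indeed the most probable (highest-multiplicity) outcome. A small sketch (N = 10 is an arbitrary choice):

```python
import math

N = 10  # arbitrary sequence length
# Fix s1; the number of sequences s2 differing from it in exactly x tosses:
counts = [math.comb(N, x) for x in range(N + 1)]

print(counts)                     # [1, 10, 45, ..., 45, 10, 1]
print(counts.index(max(counts)))  # the count peaks at x = N//2 = 5
```

Exact agreement (x = 0) corresponds to a single s2, the lowest-multiplicity case.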
Sometimes I go back and think about this stuff, and I always find something I don't understand very well.
Consider an irreversible isothermal expansion of an ideal gas from state ##A## to state ##B## and suppose I know the amount of heat given to the system to perform the expansion - I'll...
It is said that entropy causes an arrow of time. But what about the irreversibility of a measurement, like that of electron spin? When a certain spin is measured, the previous value is lost. So does that also require an arrow of time?
Hello;
If a system receives a thermal energy Q, can it keep its entropy constant (that is, equal to its value before it received the energy) without wasting the energy received?
I just read a book by nuclear physicist Carlo Rovelli on the subject of "Time" and he says that 'entropy' is the only non-reversible process in the basic equations of physics, and he believes time and entropy are related (if I understand him correctly). So this started me thinking on entropy...
<edit: moved to homework. No template in the post.>
An ice maker inputs liquid water at 25 degrees C and outputs ice at -5 degrees C. Assume there is 1 kg of water and the volume does not change.
Cp liquid 4.18 kJ/kg-K
Cp solid 2.11 kJ/kg-K
∆H fusion 334 kJ/kg
I need to...
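For what it's worth, the entropy change of the water itself can be summed over three steps (cool the liquid, freeze at 0 °C, cool the ice). A sketch using the data given, not a full solution of whatever the problem asks:

```python
import math

m = 1.0                       # kg of water
cp_liq, cp_ice = 4.18, 2.11   # kJ/(kg K)
dH_fus = 334.0                # kJ/kg
T1, Tm, T2 = 298.15, 273.15, 268.15  # K: 25 C, 0 C, -5 C

dS = (m * cp_liq * math.log(Tm / T1)      # cool liquid 25 C -> 0 C
      - m * dH_fus / Tm                   # freeze at 0 C (heat leaves the water)
      + m * cp_ice * math.log(T2 / Tm))   # cool ice 0 C -> -5 C
print(round(dS, 3))  # about -1.628 kJ/K for the water alone
```

The water's entropy drops; the second law is saved by the larger entropy increase of whatever absorbs the rejected heat.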
Homework Statement
Derive an expression for the change of temperature of a solid material that is compressed adiabatically and reversibly, in terms of physical quantities.
(The second part of this problem is: The pressure on a block of iron is increased by 1000 atm adiabatically and...
Homework Statement
In a monatomic crystalline solid each atom can occupy either a regular lattice site or an interstitial site. The energy of an atom at an interstitial site exceeds the energy of an atom at a lattice site by an amount ε. Assume that the number of interstitial sites equals the...
In a free expansion, I know that we cannot use the equation dS = dQ/T ... (1). Instead we use dS > dQ/T ... (2).
The question is: why can we use ΔS = n·cᵥ·ln(T_f/T_i) + n·R·ln(V_f/V_i), which is derived from equation (1), to calculate the entropy change? Shouldn't it be an inequality too?
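One way to see it: entropy is a state function, so ΔS may be evaluated along any reversible path between the same two states; the inequality (2) applies to ∫dQ/T along the actual irreversible path, not to ΔS itself. A quick sketch with assumed numbers (1 mol, volume doubling at constant temperature):

```python
import math

R = 8.314   # J/(K mol)
n = 1.0     # mol, assumed

# Free expansion into vacuum: Q = 0 along the actual path, so int(dQ/T) = 0.
# But dS is evaluated along a reversible isothermal path between the same
# end states (T unchanged for an ideal gas, V doubled, assumed Vf/Vi = 2):
dS = n * R * math.log(2)
print(round(dS, 2))   # ~5.76 J/K  >  int(dQ/T) = 0, consistent with (2)
```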
Is entropy consistent from all reference frames? For an observer at the surface of a black hole, a finite amount of time would pass, but the observer would observe an unbounded amount of time passing for the outside universe, hence from his/her reference frame, information, entropy of the...
Hello,
I am trying to figure out where my reasoning falls apart in this thought experiment:
To determine if a process "A" is reversible (or at the very least internally reversible), I try to picture a reversible process "B" that involves only heat transfer and links the same two endpoints that...
Howdy,
Say you've got two highly reflective mirrors forming a cavity. Some broadband light goes in, but only narrowband light comes out. Entropy is definitely decreased as far as the photons are concerned. Where does it go?
This has been bugging me. I have a partial solution I was hoping you...
In the Brazil nut effect (granular convection), the large grains move upward and the smaller ones go downward. This sorting is supposed to reduce the multiplicity of the system. But according to the second law of thermodynamics, the entropy and multiplicity of the system should increase.
I am looking...
Carlo Rovelli described in "The Order of Time" that
"Living beings are made up of similarly intertwined processes. Photosynthesis deposits low entropy from the sun into plants. Animals feed on low entropy by eating. (If all we needed was energy rather than entropy, we would head for the heat of...
Homework Statement
[/B]
Strap in, this one's kind of long. (This problem is from 'Six Ideas That Shaped Physics, Unit T' by Thomas A Moore, 2nd edition. Problem T6R2.)
Imagine that aliens deliver into your hands two identical objects made of substances whose multiplicities increase linearly...
A quantum system goes from an uncertain to a certain state upon measurement. This indicates a decrease of entropy; is there a corresponding increase of entropy elsewhere (in the environment or observer)? Is there any work done on the system in the act of measurement?
It is sometimes said that a system is "unlikely" to return to the "pattern" it came from. For instance, if we have a vat with blue gas molecules and white gas molecules separated by a slit, and we remove the slit, the blue and white molecules will mingle, unlikely to return to their separated...
There is something that is unclear to me, and because entropy bounds and their violations were discussed in the other thread, I thought it is a good opportunity to learn something. The problem is essentially a matter of impression. The statements go roughly in the following way: for a system...
Why is there a tendency to use the concept of entropy to explain everything from relativity to quantum mechanics? Why do people think this concept is so satisfactory?
Jacob Bekenstein asserts that the entropy of a black hole is proportional to its area rather than its volume. Wow.
After watching Leonard Susskind's video 'The World as a Hologram', it seems to me that he's implying that we are all black hole stuff. Perhaps we (our galaxies and their black...
I have read a bit of the book Cycles of Time (Penrose), and I wondered whether an increase in entropy in one part of the Universe leads to a decrease in entropy in other parts, and whether maybe the universe's expansion is an attempt by the Universe to keep entropy at the same level.
And eventually you...
Entropy is always increasing, say the thermodynamicists, and the increase will ultimately cause a "heat death" of the universe.
But gravity seems to contradict this. Gravity, by clumping matter together, always engenders a decrease in entropy. Indeed, some cosmologists propose an eventual...
For ex. if two particles close to each other require n bits of info to describe them, why does it take n bits to describe them when they are far apart? Shouldn't the information content be the same for the macrosystem?
The resolution for Maxwell's demon paradox is that the demon has limited memory and the demon will eventually run out of information storage space and must begin to erase the information it has previously gathered. Erasing information is a thermodynamically irreversible process that increases...
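This is Landauer's principle, and it can be made quantitative: erasing one bit dissipates at least kT ln 2 of heat into the environment. A quick number at an assumed room temperature:

```python
import math

kB = 1.380649e-23   # Boltzmann constant, J/K
T = 300.0           # K, roughly room temperature (assumed)

E_min = kB * T * math.log(2)   # minimum dissipation per erased bit
print(E_min)                   # ~2.87e-21 J per bit
```

Tiny per bit, but it is exactly what prevents the demon from beating the second law in the long run.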
Hi again, Physics Forums! Last time I was here, I was an undergrad student. Now I have almost finished a PhD in quantum information and have become a high school teacher.
I've never properly learned thermodynamics. I'm now trying to get it to the same level at which I understand the other topics in classical...
Is there an expression similar to the Sackur-Tetrode equation that describes the statistical entropy of fermions or bosons, maybe for the electron gas in a metal or the photon gas in a cavity?
Homework Statement
I'm attempting to calculate the translational entropy for N2 and I get a value of 207.8 J/(K·mol). The tabulated value is given as 150.4, and I am stumped as to the discrepancy.
T = 298.15 K, P = 0.99 atm, V = 24.8 L
R = 8.314 J/(K·mol)
Homework Equations
Strans =...
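As a cross-check, the Sackur-Tetrode equation with the stated T and P does reproduce the tabulated value, which suggests the 207.8 comes from a slip somewhere in the substitution (possibly units, or using molar volume where volume per molecule belongs); a sketch, not a diagnosis of the original work:

```python
import math

# Physical constants (SI)
h  = 6.62607015e-34   # Planck constant, J s
kB = 1.380649e-23     # Boltzmann constant, J/K
NA = 6.02214076e23    # Avogadro constant, 1/mol
R  = kB * NA          # gas constant, J/(K mol)

T = 298.15            # K, as given
P = 0.99 * 101325.0   # Pa, as given
m = 28.0134e-3 / NA   # mass of one N2 molecule, kg

lam = h / math.sqrt(2 * math.pi * m * kB * T)  # thermal de Broglie wavelength
v   = kB * T / P                               # volume per molecule (ideal gas)
S   = R * (math.log(v / lam**3) + 2.5)         # Sackur-Tetrode, molar

print(round(S, 1))    # ~150.4 J/(K mol), matching the tabulated value
```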
I have read that the early universe had a very low entropy. I don't understand why. A giant ball of plasma at billions of degrees K with particles moving in all directions. It seems like the definition of total disorder. Why is the entropy considered low?
I find myself quite confused about some aspects of the concept of entropy. I will try to explain my confusion using a sequence of examples.
All of the references I cite are from Wikipedia.
Ex 1. Does the following isolated system have a calculable value for its entropy?
Assume a spherical...