Entropy is a scientific concept, as well as a measurable physical property, that is most commonly associated with a state of disorder, randomness, or uncertainty. The term and the concept are used in diverse fields, from classical thermodynamics, where it was first recognized, to the microscopic description of nature in statistical physics, and to the principles of information theory. It has found far-ranging applications in chemistry and physics, in biological systems and their relation to life, in cosmology, economics, sociology, weather science, climate change, and information systems, including the transmission of information in telecommunication.

The thermodynamic concept was referred to by the Scottish scientist and engineer Macquorn Rankine in 1850 under the names thermodynamic function and heat-potential. In 1865, the German physicist Rudolf Clausius, one of the leading founders of the field of thermodynamics, defined it as the quotient of an infinitesimal amount of heat to the instantaneous temperature. He initially described it as transformation-content, in German Verwandlungsinhalt, and later coined the term entropy from a Greek word for transformation. Referring to microscopic constitution and structure, in 1862 Clausius had interpreted the concept as meaning disgregation.

A consequence of entropy is that certain processes are irreversible or impossible, beyond the requirement of not violating the conservation of energy, the latter being expressed in the first law of thermodynamics. Entropy is central to the second law of thermodynamics, which states that the entropy of an isolated system left to spontaneous evolution cannot decrease with time, as the system always arrives at a state of thermodynamic equilibrium, where the entropy is highest.
Austrian physicist Ludwig Boltzmann explained entropy as the measure of the number of possible microscopic arrangements or states of the individual atoms and molecules of a system that comply with the macroscopic condition of the system. He thereby introduced the concept of statistical disorder and probability distributions into a new field of thermodynamics, called statistical mechanics, and found the link between the microscopic interactions, which fluctuate about an average configuration, and the macroscopically observable behavior, in the form of a simple logarithmic law, with a proportionality constant, the Boltzmann constant, that has become one of the defining universal constants for the modern International System of Units (SI).
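Boltzmann's logarithmic law, ##S = k_B \ln W##, is simple enough to evaluate directly. A minimal sketch (the multiplicities below are illustrative, not tied to any particular system):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant in J/K, exact in the 2019 SI redefinition

def boltzmann_entropy(multiplicity):
    """S = k_B * ln(W): entropy of a macrostate with W equally likely microstates."""
    return K_B * math.log(multiplicity)

# A macrostate realizable in only one way carries zero entropy
print(boltzmann_entropy(1))   # 0.0
# Doubling the multiplicity adds k_B * ln 2
print(boltzmann_entropy(2))   # ≈ 9.57e-24 J/K
```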
In 1948, Bell Labs scientist Claude Shannon applied similar statistical concepts of microscopic uncertainty and multiplicity to the problem of random losses of information in telecommunication signals. At John von Neumann's suggestion, Shannon named this measure of missing information entropy, by analogy with its use in statistical mechanics, and in doing so gave birth to the field of information theory. This description has been proposed as a universal definition of the concept of entropy.
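Shannon's measure is easy to state in code. A minimal sketch over an explicit probability distribution (the distributions below are illustrative):

```python
import math

def shannon_entropy(probs):
    """H = -sum p * log2(p), in bits; zero-probability outcomes contribute nothing."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries exactly one bit of uncertainty
print(shannon_entropy([0.5, 0.5]))   # 1.0
# A certain outcome carries none
print(shannon_entropy([1.0]))        # 0.0
```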
Moderator's Note: THIS HOMEWORK WAS POSTED IN ANOTHER FORUM, SO THERE IS NO TEMPLATE.
Calculate the change in entropy suffered by 2 moles of an ideal gas on being heated from a volume of 100 L at 50 °C to a volume of 150 L at 150 °C. For the gas, Cv = 7.88 cal/(mol·°C).
I'm really confused about entropy and...
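For reference, the standard route for this kind of problem is to treat entropy as a state function and integrate ##dS = nC_v\,dT/T + nR\,dV/V## between the two states. A sketch of the arithmetic for the numbers quoted above, using ##R \approx 1.987## cal/(mol·K):

```python
import math

R_CAL = 1.987            # gas constant in cal/(mol*K)
n, cv = 2.0, 7.88        # moles and cal/(mol*K), as given
T1, T2 = 323.15, 423.15  # 50 °C and 150 °C converted to kelvin
V1, V2 = 100.0, 150.0    # litres

# dS = n*Cv*dT/T + n*R*dV/V, integrated between the two states
delta_S = n * cv * math.log(T2 / T1) + n * R_CAL * math.log(V2 / V1)
print(round(delta_S, 2))   # ≈ 5.86 cal/K
```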
Homework Statement
Compute the entropy change of 5.00 g of water at 100°C as it changes to steam at 100°C under standard pressure.
Homework Equations
The Attempt at a Solution
5/100+100 is what I did, and my teacher says it's wrong because of the formula
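For a phase change at constant temperature, the relevant formula is ##\Delta S = Q/T = mL_v/T## with ##T## in kelvin. A sketch using a typical textbook value for the latent heat of vaporization of water (assumed; it is not quoted in the excerpt):

```python
L_V = 2.256e6   # latent heat of vaporization of water, J/kg (assumed textbook value)
m = 5.00e-3     # 5.00 g in kg
T = 373.15      # boiling point at standard pressure, in kelvin

# Phase change at constant temperature: dS = Q / T with Q = m * L_v
delta_S = m * L_V / T
print(round(delta_S, 1))   # ≈ 30.2 J/K
```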
How is entropy related to flow velocity? Does entropy change when flow velocity changes?
I have a question to solve that gives pressure, temperature, and flow velocity at two points, and the flow direction has to be determined. Please help me with the question.
Homework Statement
An ideal diatomic gas is initially at temperature ##T## and volume ##V##. The gas is taken through three reversible processes in the following cycle: adiabatic expansion to the volume ##2V##, constant volume process to the temperature ##T##, isothermal compression to the...
Given enough time, Boltzmann freak structures will appear, assembled from drifting matter, in a maximum-entropy universe if it is static, i.e. not expanding so as to eventually sweep all matter beyond the far horizon.
In a lecture...
I stumbled upon this article (among many other similar ones) http://arxiv.org/abs/0909.3983, which seems to correct the outdated assumption that the CMB dominates the entropy density of the universe. They find that supermassive black holes are actually the primary contribution by many orders of...
I'm trying to measure how much non-redundant (actual) information my file contains. Some call this the amount of entropy.
Of course there is the standard ##-\sum_x p(x)\log p(x)##, but I think that Shannon was only considering it from the point of view of transmitting through a channel. Hence the formula...
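A per-byte estimate of this kind can be computed directly; note that it treats bytes as independent, so it is only a crude bound on the file's true redundancy. A minimal sketch:

```python
import math
from collections import Counter

def byte_entropy(data: bytes) -> float:
    """Shannon entropy of the byte-value distribution, in bits per byte (0..8)."""
    counts = Counter(data)
    total = len(data)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# A single repeated symbol carries no information per byte
print(byte_entropy(b"aaaa"))             # 0.0
# All 256 byte values equally likely: the maximum of 8 bits per byte
print(byte_entropy(bytes(range(256))))   # 8.0
```

To estimate entropy including correlations between bytes, a common practical proxy is the size of the file after strong compression.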
Entropy of any system is state dependent. I just read about thermodynamics and I got this question. I first set my system which contained my classroom, the hallway connecting my physics teacher's cabin and the classroom. Then let the entropy of the system be S when I was going to the classroom...
I get that a Lie group like E8 is smoothly differentiable. But as I understand it, it has dimension and structure. Does it support the idea of a phase space, and an entropy measure for that phase space, or at least for regions of it?
Please help me with my confusion.
My logic:
1. CP violation causes direct T-violation
2. T-violation breaks Kirchhoff's law of thermal radiation (imagine a rock in a thermal bath of T-violating particles, where absorption and emission rates are different). When I say "radiation", I don't mean...
[Moderator's note: Recategorized thread to "Basic".]
While driving alone through the beautiful scenery of Banff and Yoho national parks, a question formed in my mind.
Which of these modes of slowing down a vehicle by an equal amount is likely to minimize the resulting overall increase in...
An old book I have on elementary Statistical Mechanics (Rushbrooke) uses as an especially simple case a system with one energy level. This level is doubly degenerate. The author doesn't give an example of such a system. Can anyone think of one? And would it have entropy ##k \ln 2##?
[My thoughts...
Hi all,
I was recently watching one of Susskind's 'Theoretical Minimum' lectures, in which he says that the entropy of the universe may be measured via the number of observable photons, and that these quantities (photon number and total entropy) are somehow linked. Could anybody with...
Homework Statement
A man with a body temperature of 310.15 K and a mass of 70 kg drinks 0.4536 kg of water at 275 K. Ignoring the temperature change of the man from the water intake (assume the human body is a reservoir always at the same temperature), find the entropy increase of the entire system.
Homework...
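The attempt above is cut off. A sketch of one standard route, assuming a specific heat of water of 4186 J/(kg·K) (a textbook value, not given in the excerpt): the water heats from 275 K to body temperature, while the body, treated as a reservoir, loses the same heat at a fixed 310.15 K.

```python
import math

C_WATER = 4186.0            # specific heat of water, J/(kg*K) (assumed)
m, T_w, T_body = 0.4536, 275.0, 310.15

# Water warms from 275 K to body temperature: dS = integral of m*c*dT/T
dS_water = m * C_WATER * math.log(T_body / T_w)
# The body is a reservoir at fixed temperature: it loses Q = m*c*(T_body - T_w)
Q = m * C_WATER * (T_body - T_w)
dS_body = -Q / T_body
print(round(dS_water + dS_body, 1))   # ≈ 13.2 J/K net increase
```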
Consider the problem of computing the entanglement entropy of two CFTs in the thermofield double state on identical finite intervals in 1+1 dimensions. The Euclidean path integral is then equivalent to computing the 2-point twist correlator on a torus. Given a central charge ##c##, does anyone...
I have been thinking about finding a way to define entropy on a rubik's cube. My idea is to use the number of cubies that are not in their solved position as the macrostate. This works well because there is exactly one way for all the cubies to be in the solved position so it has entropy of...
I'm sorry for my bad English.
I wonder what the differences are between a reversible isothermal expansion and an irreversible isothermal expansion.
Is their entropy change the same?
When I learned the concept of specific heat capacity, I knew that 1J/(K*kg) means that it takes 1 Joule of energy to increase the temperature of a kilogram of matter by one Kelvin, but what does J/K, the unit of entropy, mean?
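One way to read J/K: at constant temperature, an entropy change is heat divided by the absolute temperature at which it is exchanged, so 1 J/K means one joule of heat transferred per kelvin of the temperature at which it is absorbed. A trivial sketch:

```python
# At constant temperature, the entropy received is Q / T:
# joules of heat divided by the kelvin temperature at which it is absorbed.
def entropy_received(Q, T):
    return Q / T

# 373 J absorbed by a reservoir at 373 K -> exactly 1 J/K
print(entropy_received(373.0, 373.0))   # 1.0
```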
I'm interested in the derivation of relativistic Boltzmann equation from entropy after reading Scott Dodelson's wonderful cosmology book. Does anyone know of any good readings for this?
The usual way of doing things in classical mechanics is to assume ##\frac{dN}{dt} = 0## and go from there; but...
So I need an introduction to heat, temperature, and thermodynamics; not a very advanced text, just a clear beginner text that can include math/calculus. I'm just curious about how heat was/is measured, how it's defined, and how it works. Also I want to understand thermodynamics and entropy and how...
Homework Statement
A 3.00-kg block of silicon at 60.0°C is immersed in 6.00 kg of mercury at 20.0°C. What is the entropy increase of this system as it moves to equilibrium? The specific heat of silicon is 0.17 cal/(g·K) and the specific heat of mercury is 0.033 cal/(g·K).
Homework Equations
Q...
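The attempt above is cut off. A sketch of one standard route: with constant heat capacities the equilibrium temperature is the capacity-weighted mean of the initial temperatures, and each body's entropy change is ##\int C\,dT/T##.

```python
import math

# Heat capacities in cal/K: mass in grams times specific heat in cal/(g*K)
C_SI = 3000 * 0.17    # silicon block
C_HG = 6000 * 0.033   # mercury bath
T_SI, T_HG = 333.15, 293.15   # 60.0 °C and 20.0 °C in kelvin

# Constant capacities -> equilibrium temperature is the capacity-weighted mean
T_F = (C_SI * T_SI + C_HG * T_HG) / (C_SI + C_HG)

# dS = C * dT / T integrated for each body, then summed
dS = C_SI * math.log(T_F / T_SI) + C_HG * math.log(T_F / T_HG)
print(round(T_F, 1), round(dS, 2))   # ≈ 322.0 K and ≈ 1.14 cal/K
```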
Homework Statement
Adsorption, coagulation, and flocculation are all important processes to remove or separate target substances from mixtures. In these processes, we can observe a spontaneous decrease of entropy; can we therefore conclude that the second law of thermodynamics doesn't always...
Hello
In relativity, what magnitude is absolute*: temperature or entropy?
*absolute = equal for all observers (= a Lorentz scalar)
Thank you for your time :)
Greetings!
What is the entropy of the weak and strong forces? Can we determine their entropy? If so, I would like to know the formula for determining it. Thanks!
Hey all. I have a question regarding the solution to a question, both shown below (only part of the solution is shown). Specifically the line that states: H(Y|X) = H(Z|X). Why does this equality hold? Expanding and using the definition of entropy, I can see that for the above equality to hold...
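Only part of the solution is visible here, but equalities of the form H(Y|X) = H(Z|X) typically hold when, for each fixed X, the values of Y and Z determine each other (for example, Y = X + Z): conditioning on X makes the map between Z and Y a bijection, and a bijection leaves entropy unchanged. A numerical sketch under that assumed setup (the thread's exact problem is not shown):

```python
import math
from collections import Counter
from itertools import product

# Hypothetical setup: X and Z are independent fair bits and Y = X + Z.
# Given X, the map Z <-> Y is a bijection, so H(Y|X) = H(Z|X).
pairs = [(x, z) for x, z in product([0, 1], repeat=2)]  # each outcome prob 1/4

def cond_entropy(joint):
    """H(B|A) in bits for a list of equally likely (a, b) outcomes."""
    n = len(joint)
    h = 0.0
    for a in set(a for a, _ in joint):
        sub = [b for aa, b in joint if aa == a]
        pa = len(sub) / n
        for count in Counter(sub).values():
            p = count / len(sub)
            h -= pa * p * math.log2(p)
    return h

H_Y_given_X = cond_entropy([(x, x + z) for x, z in pairs])
H_Z_given_X = cond_entropy([(x, z) for x, z in pairs])
print(H_Y_given_X, H_Z_given_X)   # both 1.0 bit
```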
Do systems further away from equilibrium increase entropy faster than a system with a high level of entropy, and does this increase push the universe towards thermal equilibrium faster? Is there anything stopping the universe from reaching thermal equilibrium?
Every explanation of this I have read has been extremely poor.
Imagine we have a MONATOMIC gas, with no internal degrees of freedom. The gas is confined to a box of volume V, and this volume is constant and is not allowed to increase upon adding heat energy.
We add an infinitesimal amount...
Hi all,
Just doing some thermo study and am stuck on a question. I am not sure where to start this one, as normally I am given a property at the exit...?
Any help is appreciated.
Entropy has been described to me many different ways. Overall, I understand it as the measurement of randomness or disorder in a system. This doesn't make sense to me. It seems that it is a term made for humans. For example, if we knew all the information about every particle in the universe...
hi all,
this is about the entropy rate balance for control volumes,
in the case of a throttling process.
A throttling process is an irreversible process, so (s2 − s1) must be greater than zero.
What I want to know is how to prove that (s2 − s1) is greater than zero.
Thank you in advance.
Homework Statement
The evaporation enthalpy of Hg is ##59.3\ \mathrm{kJ\,mol^{-1}}## at its boiling temperature, 356.6 °C. Calculate:
(a) the vaporization entropy of Hg at this temperature
(b) the change in entropy of the surroundings and universe
(c) the vaporization entropy of Hg at 400 °C. I was also...
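For part (a), the standard route is ##\Delta S_{vap} = \Delta H_{vap}/T_b## for the reversible phase change at the boiling point. A sketch of the arithmetic:

```python
dH_vap = 59.3e3        # evaporation enthalpy of Hg, J/mol
T_b = 356.6 + 273.15   # boiling point in kelvin

# Reversible phase change at constant temperature: dS = dH / T
dS_vap = dH_vap / T_b
print(round(dS_vap, 1))   # ≈ 94.2 J/(K*mol)
```

For part (b), at the boiling point the surroundings absorb the same heat at the same temperature, so their entropy change is the negative of this value and the entropy change of the universe is zero for the reversible case.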
Homework Statement
I am suppose to calculate the relative entropy between two sets of data:
Base set
Set 1:
 A   C   G   T
 0   0   0  10
 0   0   0  10
 0   0  10   0
 0  10   0   0
10   0   0   0
 *   *   *   *   // Randomized
 0   0   0  10
 0  10   0...
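Relative entropy here is the Kullback-Leibler divergence ##D(p\|q) = \sum_i p_i \log_2(p_i/q_i)## between each position's frequency row and a base distribution. A sketch assuming a uniform background over A, C, G, T (the actual base set is not shown in the excerpt); a pseudocount option is included because fully conserved rows contain zeros:

```python
import math

def kl_divergence(p, q):
    """Relative entropy D(p || q) in bits; requires q[i] > 0 wherever p[i] > 0."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def normalize(counts, pseudo=0.0):
    """Turn a count row like [0, 0, 0, 10] into probabilities, with optional pseudocounts."""
    total = sum(counts) + pseudo * len(counts)
    return [(c + pseudo) / total for c in counts]

background = [0.25, 0.25, 0.25, 0.25]   # assumed uniform base over A, C, G, T
row = normalize([0, 0, 0, 10])          # a fully conserved position
print(kl_divergence(row, background))   # 2.0 bits, the maximum for 4 symbols
```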
Dear PF Forum,
I have a question to ask,
Can two celestial bodies orbit each other eternally without external influence? Does this process need energy?
Thanks for any answer
Steven
We all know that in the free expansion of a gas under adiabatic conditions, with the temperature kept constant, no work is done by the gas. Moreover, since ##\Delta Q = 0##, we have ##\Delta U = 0## according to the first law of thermodynamics. But we still get an increase of entropy by an amount ##R\ln 2##. Now the question is that...
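The standard resolution is that entropy is a state function, so it can be computed along a reversible isothermal path between the same two endpoints even though the actual free expansion is irreversible. A sketch for one mole doubling its volume:

```python
import math

R = 8.314  # gas constant, J/(mol*K)

# Free expansion into vacuum: no work, no heat, and constant T for an ideal gas,
# yet the volume doubles. Since S is a state function, evaluate dS along a
# reversible isothermal path between the same states: dS = n * R * ln(V2/V1).
n = 1.0
dS = n * R * math.log(2)
print(round(dS, 2))   # ≈ 5.76 J/K per mole
```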
Homework Statement
A vessel containing 500 g of water, at a starting temperature of 15.0 ◦C, is placed in a freezer at −10.0 ◦C, and left to freeze. (i) If the heat capacity of the vessel is negligible, show that the total entropy change for the system of the water and the inside of the...
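The problem statement above is cut off, but the standard bookkeeping (assuming the water ends as ice at the freezer temperature, and with textbook material constants, which are not given in the excerpt) looks like this sketch:

```python
import math

# Assumed textbook constants (not given in the excerpt):
C_W, C_ICE, L_F = 4186.0, 2100.0, 3.34e5   # J/(kg*K), J/(kg*K), J/kg
m = 0.5                                    # kg of water
T1, T0, TF = 288.15, 273.15, 263.15        # 15.0, 0, and -10.0 °C in kelvin

# Water: cool to 0 °C, freeze, then cool the ice to the freezer temperature
dS_water = (m * C_W * math.log(T0 / T1)
            - m * L_F / T0
            + m * C_ICE * math.log(TF / T0))
# Freezer interior: a reservoir at TF absorbing all the heat released
Q = m * C_W * (T1 - T0) + m * L_F + m * C_ICE * (T0 - TF)
dS_freezer = Q / TF
print(round(dS_water + dS_freezer, 1))   # ≈ +31.4 J/K, positive as the second law requires
```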
I realize this question has arisen before in the following thread: https://www.physicsforums.com/threads/difference-between-heat-and-work.461711/ but I felt there may be more room for discussion. I feel that the nature of the effect of heat on physical systems is a rather deep one. If the flow...
Homework Statement
1 kg of silver, initially at 273 K, is heated by a large heat reservoir at 373 K. Calculate the change of entropy in:
a) the silver
b) the reservoir
c) the universe.
Homework Equations
ΔS = ∫dQ/T
The Attempt at a Solution
calculating the change in the silver first
ΔS = ∫dQ/T...
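The attempt above is cut off. A sketch of the full bookkeeping, assuming a specific heat for silver of about 235 J/(kg·K) (a textbook value, not given in the problem):

```python
import math

C_AG = 235.0   # specific heat of silver, J/(kg*K) (assumed)
m = 1.0        # kg
T1, T2 = 273.0, 373.0

dS_silver = m * C_AG * math.log(T2 / T1)   # integrate dQ/T = m*c*dT/T
Q = m * C_AG * (T2 - T1)                   # heat drawn from the reservoir
dS_reservoir = -Q / T2                     # the reservoir stays at 373 K
dS_universe = dS_silver + dS_reservoir
print(round(dS_silver, 1), round(dS_reservoir, 1), round(dS_universe, 1))
# ≈ 73.3 J/K, -63.0 J/K, and +10.3 J/K respectively
```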
Imagine a maximum entropy information system: This system would hold meaningful information, not just random noise, but still be of maximum possible entropy in the sense that you could randomly change the order of the smallest bits of information in it without actually changing the overall...
http://arxiv.org/abs/1503.02981
Four-Dimensional Entropy from Three-Dimensional Gravity
S. Carlip
(Submitted on 10 Mar 2015)
At the horizon of a black hole, the action of (3+1)-dimensional loop quantum gravity acquires a boundary term that is formally identical to an action for three-dimensional...
Homework Statement
[/B]
Problem: In order to simplify your analysis, you will assume alcohol has the same properties as water so you can use the steam tables. You load the 30 gallon still 1/3 full with nearby water at 1 bar and 20 °C and mash (assume the mash has negligible influence on...