Boltzmann distribution

In statistical mechanics and mathematics, a Boltzmann distribution (also called Gibbs distribution) is a probability distribution or probability measure that gives the probability that a system will be in a certain state as a function of that state's energy and the temperature of the system. The distribution is expressed in the form:





$$p_i \propto e^{-\varepsilon_i / kT}$$
where p_i is the probability of the system being in state i, ε_i is the energy of that state, and kT is the product of the Boltzmann constant k and the thermodynamic temperature T. The symbol ∝ denotes proportionality (see § The distribution for the proportionality constant).
The term system here has a very wide meaning; it can range from a single atom to a macroscopic system such as a natural gas storage tank. Because of this, the Boltzmann distribution can be used to solve a very wide variety of problems. The distribution shows that states with lower energy will always have a higher probability of being occupied.
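As a minimal numerical sketch (not part of the original text), the proportionality above becomes an actual probability once each factor e^(-ε_i/kT) is divided by their sum, the partition function Z. The function name, the example energies, and the use of electronvolt units below are illustrative assumptions.

```python
import numpy as np

K_B_EV = 8.617333e-5  # Boltzmann constant in eV/K (rounded CODATA value)

def boltzmann_probabilities(energies_eV, temperature_K):
    """Normalized Boltzmann probabilities p_i = exp(-eps_i / kT) / Z."""
    energies = np.asarray(energies_eV, dtype=float)
    beta = 1.0 / (K_B_EV * temperature_K)
    # Shift by the minimum energy: the shift cancels in the normalization
    # and only serves to avoid overflow when beta is large.
    weights = np.exp(-beta * (energies - energies.min()))
    return weights / weights.sum()  # dividing by Z = sum of the weights

# Two-level example: ground state at 0 eV, excited state at 0.05 eV, T = 300 K
print(boltzmann_probabilities([0.0, 0.05], 300.0))
# -> roughly [0.87, 0.13]: the lower-energy state is more likely
```

The printed output illustrates the statement above: the lower-energy state always receives the larger probability, and the gap widens as the temperature drops.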
The ratio of probabilities of two states is known as the Boltzmann factor and characteristically only depends on the states' energy difference:







$$\frac{p_i}{p_j} = e^{(\varepsilon_j - \varepsilon_i)/kT}$$
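A short sketch of the Boltzmann factor itself (the helper name and example energies are made up for illustration): it checks that the ratio p_i/p_j depends only on the energy difference ε_j − ε_i, since the normalization constant cancels in the ratio.

```python
import numpy as np

K_B_EV = 8.617333e-5  # Boltzmann constant in eV/K (rounded CODATA value)

def boltzmann_factor(eps_i, eps_j, temperature_K):
    """Ratio p_i / p_j = exp((eps_j - eps_i) / kT); energies in eV."""
    return np.exp((eps_j - eps_i) / (K_B_EV * temperature_K))

# Shifting both energies by the same offset leaves the factor unchanged,
# because only the difference eps_j - eps_i enters.
print(boltzmann_factor(0.00, 0.05, 300.0))  # ~6.9
print(boltzmann_factor(1.00, 1.05, 300.0))  # same value
```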
The Boltzmann distribution is named after Ludwig Boltzmann, who first formulated it in 1868 during his studies of the statistical mechanics of gases in thermal equilibrium. Boltzmann's statistical work is borne out in his paper "On the Relationship between the Second Fundamental Theorem of the Mechanical Theory of Heat and Probability Calculations Regarding the Conditions for Thermal Equilibrium".
The distribution was later investigated extensively, in its modern generic form, by Josiah Willard Gibbs in 1902. The generalized Boltzmann distribution is a necessary and sufficient condition for the equivalence between the statistical mechanics definition of entropy (the Gibbs entropy formula



$$S = -k_{\mathrm{B}} \sum_i p_i \log p_i$$
) and the thermodynamic definition of entropy (



$$dS = \frac{\delta Q_{\text{rev}}}{T}$$
, and the fundamental thermodynamic relation). The Boltzmann distribution should not be confused with the Maxwell–Boltzmann distribution. The former gives the probability that a system will be in a certain state as a function of that state's energy; in contrast, the latter is used to describe particle speeds in idealized gases.
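As a rough illustration of the statistical-mechanics side of this equivalence (only a sketch with made-up energy levels and function names; it does not prove the equivalence), the Gibbs entropy formula can be evaluated directly for a Boltzmann distribution.

```python
import numpy as np

K_B = 1.380649e-23  # Boltzmann constant in J/K (exact SI value)

def gibbs_entropy(probabilities):
    """Gibbs entropy S = -k_B * sum_i p_i ln p_i of a discrete distribution."""
    p = np.asarray(probabilities, dtype=float)
    p = p[p > 0]  # by convention 0 * ln(0) contributes nothing
    return -K_B * np.sum(p * np.log(p))

# Boltzmann probabilities of a three-level system at T = 300 K (energies in joules)
T = 300.0
energies = np.array([0.0, 1.0e-21, 2.0e-21])
weights = np.exp(-energies / (K_B * T))
p = weights / weights.sum()
print(gibbs_entropy(p))  # entropy in J/K, of the order of k_B
```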
