Statistical mechanics or statistical thermodynamics^{[note 1]} is a branch of physics that applies probability theory, which contains mathematical tools for dealing with large populations, to the study of the thermodynamic behavior of systems composed of a large number of particles. Statistical mechanics provides a framework for relating the microscopic properties of individual atoms and molecules to the macroscopic bulk properties of materials that can be observed in everyday life, thereby explaining thermodynamics as a result of the classical and quantum-mechanical descriptions of statistics and mechanics at the microscopic level.
Statistical mechanics provides a molecular-level interpretation of macroscopic thermodynamic quantities such as work, heat, free energy, and entropy. It enables the thermodynamic properties of bulk materials to be related to the spectroscopic data of individual molecules. This ability to make macroscopic predictions based on microscopic properties is the main advantage of statistical mechanics over classical thermodynamics. Both theories are governed by the second law of thermodynamics through the medium of entropy. However, entropy in thermodynamics can only be known empirically, whereas in statistical mechanics, it is a function of the probability distribution of the system on its microstates.
Statistical mechanics was initiated in the 1870s with the work of Austrian physicist Ludwig Boltzmann, much of which was collectively published in Boltzmann's 1896 Lectures on Gas Theory.^{[1]} Boltzmann's original papers on the statistical interpretation of thermodynamics, the H-theorem, transport theory, thermal equilibrium, the equation of state of gases, and similar subjects occupy about 2,000 pages in the proceedings of the Vienna Academy and other societies. The term "statistical thermodynamics" was proposed for use by the American thermodynamicist and physical chemist J. Willard Gibbs in 1902. According to Gibbs, the term "statistical", in the context of mechanics, i.e. statistical mechanics, was first used by the Scottish physicist James Clerk Maxwell in 1871. "Probabilistic mechanics" might today seem a more appropriate term, but "statistical mechanics" is firmly entrenched.^{[2]}
Overview
The essential problem in statistical thermodynamics is to calculate the distribution of a given amount of energy E over N identical systems.^{[3]} The goal of statistical thermodynamics is to understand and to interpret the measurable macroscopic properties of materials in terms of the properties of their constituent particles and the interactions between them. This is done by connecting thermodynamic functions to quantum-mechanical equations. Two central quantities in statistical thermodynamics are the Boltzmann factor and the partition function.
Fundamentals
Central to statistical thermodynamics is the formal definition of the entropy of a thermodynamic system from a statistical perspective, called the statistical entropy, defined as:
 $S = k_B \ln \Omega$
where k_{B} is the Boltzmann constant, 1.380649×10^{−23} J K^{−1}, and Ω is the number of microstates corresponding to the observed thermodynamic macrostate.
This equation is valid only if each microstate is equally accessible (each microstate has an equal probability of occurring).
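As a concrete illustration, the statistical entropy can be evaluated directly once the microstate count is known. The following is a minimal sketch in Python; the 100-particle two-state macrostate is a hypothetical example chosen for demonstration:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def statistical_entropy(omega: int) -> float:
    """Statistical entropy S = k_B ln(Omega) for Omega equally probable microstates."""
    return K_B * math.log(omega)

# Hypothetical macrostate: 100 two-state particles with exactly 50 in the "up" state
omega = math.comb(100, 50)  # number of microstates realizing this macrostate
S = statistical_entropy(omega)
# A macrostate realized by a single microstate (omega = 1) has zero entropy
```

Note that the entropy depends only on the count of microstates, not on any detail of the dynamics, which is why the equal-probability caveat above is essential.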
Boltzmann distribution
If the system is large, the Boltzmann distribution (an approximate result) can be used:
 $n_i \propto \exp\left(-\frac{U_i}{k_B T}\right),$
where n_{i} stands for the number of particles occupying level i or the number of feasible microstates corresponding to macrostate i; U_{i} stands for the energy of i; T stands for temperature; and k_{B} is the Boltzmann constant.
If N is the total number of particles or states, the distribution of probability densities follows:
 $$\rho_i \equiv \frac{n_i}{N} = \frac{\exp\left(-\frac{U_i}{k_B T}\right)}{\sum_j \exp\left(-\frac{U_j}{k_B T}\right)},$$
where the sum in the denominator is over all levels.
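The normalization above can be checked numerically. A minimal sketch in Python, assuming arbitrary energy units with k_B T = 1 (the two-level spectrum is purely illustrative):

```python
import math

def boltzmann_probabilities(energies, kT):
    """Normalized Boltzmann distribution: rho_i = exp(-U_i/kT) / sum_j exp(-U_j/kT)."""
    weights = [math.exp(-u / kT) for u in energies]
    Z = sum(weights)  # the denominator (partition function)
    return [w / Z for w in weights]

# Hypothetical two-level system with a gap of one energy unit at kT = 1:
p = boltzmann_probabilities([0.0, 1.0], kT=1.0)
# the lower level is favored by exactly a factor of e
```

The ratio p[0]/p[1] equals exp(ΔU/kT), independent of the normalization, which is the practical content of the proportionality in the previous equation.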
History
In 1738, Swiss physicist and mathematician Daniel Bernoulli published Hydrodynamica which laid the basis for the kinetic theory of gases. In this work, Bernoulli posited the argument, still used to this day, that gases consist of great numbers of molecules moving in all directions, that their impact on a surface causes the gas pressure that we feel, and that what we experience as heat is simply the kinetic energy of their motion.
In 1859, after reading a paper on the diffusion of molecules by Rudolf Clausius, Scottish physicist James Clerk Maxwell formulated the Maxwell distribution of molecular velocities, which gave the proportion of molecules having a certain velocity in a specific range. This was the first-ever statistical law in physics.^{[4]} Five years later, in 1864, Ludwig Boltzmann, a young student in Vienna, came across Maxwell's paper and was so inspired by it that he spent much of his life developing the subject further.
Hence, the foundations of statistical thermodynamics were laid down in the late 1800s by those such as Maxwell, Boltzmann, Max Planck, Clausius, and Josiah Willard Gibbs who began to apply statistical and quantum atomic theory to ideal gas bodies. Predominantly, however, it was Maxwell and Boltzmann, working independently, who reached similar conclusions as to the statistical nature of gaseous bodies. Yet, one must consider Boltzmann to be the "father" of statistical thermodynamics with his 1875 derivation of the relationship between entropy S and multiplicity Ω, the number of microscopic arrangements (microstates) producing the same macroscopic state (macrostate) for a particular system.^{[5]}
Fundamental postulate
The fundamental postulate in statistical mechanics (also known as the equal a priori probability postulate) is the following:
 Given an isolated system in equilibrium, it is found with equal probability in each of its accessible microstates.
This postulate is a fundamental assumption in statistical mechanics: it states that a system in equilibrium does not have any preference for any of its available microstates. Given Ω microstates at a particular energy, the probability of finding the system in a particular microstate is p = 1/Ω.
This postulate is necessary because it allows one to conclude that for a system at equilibrium, the thermodynamic state (macrostate) that results from the largest number of microstates is also the most probable macrostate of the system.
The postulate is justified in part, for classical systems, by Liouville's theorem (Hamiltonian), which shows that if the distribution of system points through accessible phase space is uniform at some time, it remains so at later times.
Similar justification for a discrete system is provided by the mechanism of detailed balance.
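The role of detailed balance can be illustrated with a toy discrete system: under symmetric (doubly stochastic) transition probabilities, any starting distribution relaxes to the uniform one demanded by the postulate. A minimal sketch, assuming a hypothetical three-state system:

```python
# Toy three-state system with symmetric transition probabilities; detailed
# balance with equal energies makes the uniform distribution stationary.
P = [[0.8, 0.1, 0.1],
     [0.1, 0.8, 0.1],
     [0.1, 0.1, 0.8]]

rho = [1.0, 0.0, 0.0]  # start with certainty in microstate 0
for _ in range(200):
    # one step of the Markov evolution: rho_new[i] = sum_j rho[j] * P[j][i]
    rho = [sum(rho[j] * P[j][i] for j in range(3)) for i in range(3)]
# rho relaxes to [1/3, 1/3, 1/3], the equal a priori distribution
```

The transition matrix and step count here are illustrative assumptions; the point is only that symmetry of the rates singles out the uniform distribution as the equilibrium one.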
This allows for the definition of the information function (in the context of information theory):
 $$I = -\sum_i \rho_i \ln\rho_i = \langle -\ln \rho \rangle.$$
When all the probabilities ρ_{i} (rho) are equal, I is maximal, and we have minimal information about the system. When our information is maximal (i.e., one rho is equal to one and the rest to zero, such that we know what state the system is in), the function is minimal.
This information function is the same as the reduced entropic function in thermodynamics.
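The behavior of the information function at its two extremes can be checked directly. A short sketch in Python (the four-state distributions are illustrative):

```python
import math

def info_function(rho):
    """I = -sum_i rho_i ln(rho_i); terms with rho_i = 0 contribute nothing."""
    return -sum(p * math.log(p) for p in rho if p > 0.0)

uniform = [0.25, 0.25, 0.25, 0.25]  # minimal knowledge of the state
certain = [1.0, 0.0, 0.0, 0.0]      # complete knowledge of the state
# info_function(uniform) = ln(4), the maximum; info_function(certain) = 0, the minimum
```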
Mark Srednicki has argued that for quantum mechanical systems, the fundamental postulate can be derived, in the semiclassical limit, assuming only that Berry's conjecture (named after Michael Berry) applies to the system in question.^{[6]}^{[7]} Berry's conjecture is believed to hold only for systems with a chaotic classical limit, and roughly says that the energy eigenstates are distributed as Gaussian random variables. Since all realistic systems with more than a handful of degrees of freedom are expected to be chaotic, this puts the fundamental postulate on firm footing. Berry's conjecture has also been shown to be equivalent to an information theoretic principle of least bias.^{[8]} The generalization of this idea beyond the semiclassical limit is the basis for the Eigenstate thermalization hypothesis,^{[9]} which purports to explain when and why an isolated quantum mechanical system can be accurately described using equilibrium statistical mechanics.
Statistical ensembles
The modern formulation of statistical mechanics is based on the description of the physical system by an ensemble that represents all possible configurations of the system and the probability of realizing each configuration.
Each ensemble is associated with a partition function that, with mathematical manipulation, can be used to extract values of thermodynamic properties of the system. According to the relationship of the system to the rest of the universe, one of three general types of ensembles may apply, in order of increasing complexity:
 Microcanonical ensemble: describes a completely isolated system, having constant energy, as it does not exchange energy or mass with the rest of the universe.
 Canonical ensemble: describes a system in thermal equilibrium with its environment. It may only exchange energy in the form of heat with the outside.
 Grand canonical ensemble: used for open systems, which exchange both energy and mass with the outside.
Summary of ensembles

 Microcanonical ensemble: constant variables E, N, V; microscopic feature: number of accessible microstates $\Omega$; macroscopic function: $S = k_B \ln \Omega$.
 Canonical ensemble: constant variables T, N, V; microscopic feature: canonical partition function $Z = \sum_k e^{-\beta E_k}$; macroscopic function: $F = -k_B T \ln Z$.
 Grand canonical ensemble: constant variables T, $\mu$, V; microscopic feature: grand canonical partition function $\Xi = \sum_k e^{-\beta (E_k - \mu N_k)}$; macroscopic function: $F - \mu N = -k_B T \ln \Xi$.
Microcanonical ensemble
In the microcanonical ensemble, the number of particles, the system's volume, and the system's energy (N, V and E) are fixed. Since the second law of thermodynamics applies to isolated systems, this is the first case investigated: the microcanonical ensemble describes an isolated system.
The entropy of such a system can only increase, so that the maximum of its entropy corresponds to an equilibrium state for the system.
Because an isolated system keeps a constant energy, the total energy of the system does not fluctuate. Thus, the system can access only those of its microstates that correspond to a given value E of the energy. The internal energy of the system is then strictly equal to its energy.
Let Ω(E) be the number of microstates corresponding to this value of the system's energy. The macroscopic state of maximal entropy for the system is the one in which all microstates are equally likely to occur, with probability 1/Ω(E), during the system's fluctuations, and we have for the system's entropy:
 $$S = -k_B\sum_{i=1}^{\Omega (E)} \left[ {1\over{\Omega (E)}} \ln{1\over{\Omega (E)}} \right] = k_B\ln \left(\Omega (E) \right)$$
Canonical ensemble
In the canonical ensemble, the number of particles, the system's volume, and the system's temperature (N, V and T) are fixed. Invoking the concept of the canonical ensemble, it is possible to derive the probability P_{i} that a macroscopic system in thermal equilibrium with its environment will be found in a given microstate with energy E_{i}, according to the Boltzmann distribution:
 $P_i = \frac{e^{-\beta E_i}}{\sum_{j=1}^{j_{\rm max}} e^{-\beta E_j}}$
using the useful definition $\beta = \frac{1}{k_B T}$, known as the thermodynamic beta or inverse temperature.
The temperature T arises from the fact that the system is in thermal equilibrium with its environment. The probabilities of the various microstates must add to one, and the normalization factor in the denominator is the canonical partition function:
 $Z = \sum_{i=1}^{i_{\rm max}} e^{-\beta E_i}$
where E_{i} is the energy of the i^{th} microstate of the system. The partition function is a measure of the number of states accessible to the system at a given temperature. The article canonical ensemble contains a derivation of Boltzmann's factor and the form of the partition function from first principles.
To sum up, the probability of finding a system at temperature T in a particular state with energy E_{i} is
 $P_i = \frac{e^{-\beta E_i}}{Z}.$
Thus the partition function looks like the weight factor for the ensemble.
Thermodynamic connection
The partition function can be used to find the expected (average) value of any microscopic property of the system, which can then be related to macroscopic variables. For instance, the expected value of the microscopic energy E is interpreted as the microscopic definition of the thermodynamic variable internal energy U, and can be obtained by differentiating the partition function with respect to the inverse temperature β. Indeed,
 $\langle E\rangle = \frac{\sum_i E_i e^{-\beta E_i}}{Z} = -\frac{1}{Z} \frac{dZ}{d\beta}$
implies, together with the interpretation of $\langle E\rangle$ as U, the following microscopic definition of internal energy:
 $U := -\frac{d\ln Z}{d\beta}.$
The entropy can be calculated by (see Shannon entropy)
 $\frac{S}{k_B} = -\sum_i p_i \ln p_i = \sum_i \frac{e^{-\beta E_i}}{Z}\left(\beta E_i + \ln Z\right) = \ln Z + \beta U$
which implies that
 $-\frac{\ln Z}{\beta} = U - TS = F$
is the Helmholtz free energy of the system or in other words,
 $Z = e^{-\beta F}.$
Having microscopic expressions for the basic thermodynamic potentials U (internal energy), S (entropy) and F (free energy) is sufficient to derive expressions for other thermodynamic quantities. The basic strategy is as follows. There may be an intensive or extensive quantity that enters explicitly in the expression for the microscopic energy E_{i}, for instance magnetic field (intensive) or volume (extensive). Then, the conjugate thermodynamic variables are derivatives of the internal energy. The macroscopic magnetization (extensive) is the derivative of U with respect to the (intensive) magnetic field, and the pressure (intensive) is the derivative of U with respect to volume (extensive).
The treatment in this section assumes no exchange of matter (i.e. fixed mass and fixed particle numbers). However, the volume of the system is variable which means the density is also variable.
This probability can be used to find the average value, which corresponds to the macroscopic value, of any property, J, that depends on the energetic state of the system by using the formula:
 $\langle J \rangle = \sum_i p_i J_i = \sum_i J_i \frac{e^{-\beta E_i}}{Z}$
where $\langle J \rangle$ is the average value of property J. This equation can be applied to the internal energy, U:
 $U = \sum_i E_i \frac{e^{-\beta E_i}}{Z}.$
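The identity between the direct ensemble average and the derivative of ln Z can be verified numerically. A sketch assuming a hypothetical three-level spectrum in arbitrary units:

```python
import math

def Z(beta, energies):
    """Canonical partition function Z = sum_i exp(-beta * E_i)."""
    return sum(math.exp(-beta * e) for e in energies)

def U_direct(beta, energies):
    """Ensemble average <E> = sum_i E_i exp(-beta * E_i) / Z."""
    return sum(e * math.exp(-beta * e) for e in energies) / Z(beta, energies)

def U_from_Z(beta, energies, h=1e-6):
    """U = -d(ln Z)/d(beta), approximated by a central finite difference."""
    return -(math.log(Z(beta + h, energies)) - math.log(Z(beta - h, energies))) / (2 * h)

levels = [0.0, 1.0, 2.5]  # hypothetical energy levels, arbitrary units
beta = 0.7                # illustrative inverse temperature
# U_direct(beta, levels) and U_from_Z(beta, levels) agree to high precision
```

The finite-difference step h is an implementation choice; analytically the two expressions are identical.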
Subsequently, these equations can be combined with known thermodynamic relationships between U and V to arrive at an expression for pressure in terms of only temperature, volume and the partition function. Similar relationships in terms of the partition function can be derived for other thermodynamic properties, as shown in the following table.^{[note 2]}
 Helmholtz free energy: $F = -\frac{\ln Z}{\beta}$
 Internal energy: $U = -\left( \frac{\partial\ln Z}{\partial\beta} \right)_{N,V}$
 Pressure: $P = -\left( \frac{\partial F}{\partial V} \right)_{N,T} = \frac{1}{\beta} \left( \frac{\partial \ln Z}{\partial V} \right)_{N,T}$
 Entropy: $S = k_B (\ln Z + \beta U)$
 Gibbs free energy: $G = F + PV = -\frac{\ln Z}{\beta} + \frac{V}{\beta} \left( \frac{\partial \ln Z}{\partial V} \right)_{N,T}$
 Enthalpy: $H = U + PV$
 Constant-volume heat capacity: $C_V = \left( \frac{\partial U}{\partial T} \right)_{N,V}$
 Constant-pressure heat capacity: $C_P = \left( \frac{\partial H}{\partial T} \right)_{N,P}$
 Chemical potential: $\mu_i = -\frac{1}{\beta} \left( \frac{\partial \ln Z}{\partial N_i} \right)_{T,V,N_{j \neq i}}$
To clarify, this chemical potential is still a canonical-ensemble expression; it does not presuppose a grand canonical ensemble.
It is often useful to consider the energy of a given molecule to be distributed among a number of modes. For example, translational energy refers to that portion of energy associated with the motion of the center of mass of the molecule. Configurational energy refers to that portion of energy associated with the various attractive and repulsive forces between molecules in a system. The other modes are all considered to be internal to each molecule. They include rotational, vibrational, electronic and nuclear modes. If we assume that each mode is independent (a questionable assumption) the total energy can be expressed as the sum of each of the components:
 $E = E_t + E_c + E_n + E_e + E_r + E_v,$
where the subscripts t, c, n, e, r, and v correspond to translational, configurational, nuclear, electronic, rotational and vibrational modes, respectively. The relationship in this equation can be substituted into the very first equation to give:
 $Z = \sum_i e^{-\beta(E_{ti} + E_{ci} + E_{ni} + E_{ei} + E_{ri} + E_{vi})} = \sum_i e^{-\beta E_{ti}} e^{-\beta E_{ci}} e^{-\beta E_{ni}} e^{-\beta E_{ei}} e^{-\beta E_{ri}} e^{-\beta E_{vi}}.$
If we can assume all these modes are completely uncoupled and uncorrelated, so all these factors are in a probability sense completely independent, then
 $Z = Z_t Z_c Z_n Z_e Z_r Z_v.$
Thus a partition function can be defined for each mode. Simple expressions have been derived relating each of the various modes to various measurable molecular properties, such as the characteristic rotational or vibrational frequencies.
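The factorization can be checked by brute force on a small example: summing the Boltzmann factor over all joint states of independent modes reproduces the product of the per-mode partition functions. The mode spectra below are hypothetical:

```python
import math
from itertools import product

beta = 1.3  # illustrative inverse temperature
modes = [
    [0.0, 1.0],        # hypothetical two-level mode
    [0.5, 1.5, 2.5],   # hypothetical truncated ladder of a second mode
]

# Total Z: sum of exp(-beta * E_total) over every joint state of the system
Z_total = sum(math.exp(-beta * sum(es)) for es in product(*modes))

# Product of the per-mode partition functions
Z_product = math.prod(sum(math.exp(-beta * e) for e in mode) for mode in modes)
# For uncoupled modes the two agree exactly
```

Were the modes coupled (i.e., if the joint energy were not a plain sum), the two quantities would differ, which is why the independence assumption in the text is flagged as questionable.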
Expressions for the various molecular partition functions are shown in the following table.
 Nuclear: $Z_n = 1 \qquad (T < 10^8 \mathrm{\ K})$
 Electronic: $Z_e = W_0 e^{D_e / k_B T} + W_1 e^{-\theta_{e1}/T} + \cdots$
 Vibrational: $Z_v = \prod_j \frac{e^{-\theta_{vj} / 2T}}{1 - e^{-\theta_{vj} / T}}$
 Rotational (linear): $Z_r = \frac{T}{\sigma\theta_r}$
 Rotational (nonlinear): $Z_r = \frac{\sqrt{\pi}}{\sigma}\sqrt{\frac{T^3}{\theta_A \theta_B \theta_C}}$
 Translational: $Z_t = \frac{(2 \pi m k_B T)^{3/2}}{h^3}$
 Configurational (ideal gas): $Z_c = V$

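The vibrational entry in the table is the closed form of a geometric series over harmonic-oscillator levels E_n = (n + 1/2) k_B θ_v, and the equivalence can be checked numerically. The characteristic temperature used below is an illustrative round number of the magnitude typical for diatomic stretches:

```python
import math

def Z_vib_closed(theta, T):
    """Closed form for one harmonic mode: exp(-theta/2T) / (1 - exp(-theta/T))."""
    x = theta / T
    return math.exp(-x / 2) / (1.0 - math.exp(-x))

def Z_vib_sum(theta, T, n_max=500):
    """Direct Boltzmann sum over levels E_n = (n + 1/2) k_B * theta."""
    return sum(math.exp(-(n + 0.5) * theta / T) for n in range(n_max))

theta_v = 3000.0  # hypothetical vibrational characteristic temperature, in K
# At T = 300 K the truncated sum reproduces the closed form to machine precision
```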
These equations can be combined with those in the first table to determine the contribution of a particular energy mode to a thermodynamic property. For example, the "rotational pressure" could be determined in this manner. The total pressure could be found by summing the pressure contributions from all of the individual modes, i.e.:
 $P = P_t + P_c + P_n + P_e + P_r + P_v.$
Grand canonical ensemble
In a grand canonical ensemble, the system's volume, temperature and chemical potential (V, T and μ) are fixed. If the system under study is an open system (in which matter can be exchanged with the environment) and particle number is not conserved, chemical potentials μ_{j}, j = 1, ..., n must be introduced and the canonical partition function replaced with the grand canonical partition function:
 $\Xi(V,T,\mu) = \sum_i \exp\left[\beta \left(\sum_{j=1}^n \mu_j N_{ij} - E_i\right)\right]$
where N_{ij} is the number of particles of the j^{th} species in the i^{th} configuration. Sometimes other variables must be added to the partition function, one for each conserved quantity; most of them, however, can safely be interpreted as chemical potentials. In most condensed matter systems the dynamics are nonrelativistic and mass is conserved; such systems also conserve particle number approximately (metastably), and the mass is then, nonrelativistically, just the sum over species of the number of particles of each type times its mass. For the rest of this article, we will ignore this complication and pretend chemical potentials don't matter.
Let us rework everything using a grand canonical ensemble this time. The volume is left fixed and does not figure in this treatment. As before, j is the index over particle species and i is the index over microstates:
 $U = \sum_i E_i \frac{\exp\left(-\beta \left(E_i - \sum_k \mu_k N_{ik}\right)\right)}{\Xi}$
 $N_j = \sum_i N_{ij} \frac{\exp\left(-\beta \left(E_i - \sum_k \mu_k N_{ik}\right)\right)}{\Xi}.$
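These grand canonical averages can be cross-checked numerically: the particle number obtained by direct averaging must match (1/β) ∂ln Ξ/∂μ. A sketch for a hypothetical single level with unbounded integer occupation (all parameter values are illustrative):

```python
import math

beta, mu, eps = 1.0, -0.2, 0.5  # illustrative inverse temperature, chemical potential, level energy
ns = range(200)                 # occupation numbers of the level (truncated; the tail is negligible)

def Xi(m):
    """Grand canonical partition function: sum over occupations n of exp(-beta*(n*eps - m*n))."""
    return sum(math.exp(-beta * (n * eps - m * n)) for n in ns)

# Direct ensemble average of the particle number
N_direct = sum(n * math.exp(-beta * (n * eps - mu * n)) for n in ns) / Xi(mu)

# N = (1/beta) d(ln Xi)/d(mu), via a central finite difference
h = 1e-6
N_deriv = (math.log(Xi(mu + h)) - math.log(Xi(mu - h))) / (2 * h * beta)
```

The truncation at 200 occupations is safe here because exp(-β(ε - μ)) < 1, so the series converges geometrically.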
 Grand potential: $\Phi_G = -\frac{\ln \Xi}{\beta}$
 Internal energy: $U = -\left( \frac{\partial\ln \Xi}{\partial\beta} \right)_{\mu} + \sum_i \frac{\mu_i}{\beta} \left( \frac{\partial \ln \Xi}{\partial \mu_i} \right)_{\beta}$
 Particle number: $N_i = \frac{1}{\beta} \left( \frac{\partial \ln \Xi}{\partial \mu_i} \right)_{\beta}$
 Entropy: $S = k_B \left( \ln \Xi + \beta U - \beta \sum_i \mu_i N_i \right)$
 Helmholtz free energy: $F = \Phi_G + \sum_i \mu_i N_i = -\frac{\ln \Xi}{\beta} + \sum_i \frac{\mu_i}{\beta} \left( \frac{\partial \ln \Xi}{\partial \mu_i} \right)_{\beta}$

Equivalence between descriptions at the thermodynamic limit
All of the above descriptions differ in the way they allow the given system to fluctuate between its configurations.
In the microcanonical ensemble, the system exchanges no energy with the outside world, and is therefore not subject to energy fluctuations; in the canonical ensemble, the system is free to exchange energy with the outside in the form of heat.
In the thermodynamic limit, which is the limit of large systems, fluctuations become negligible, so that all these descriptions converge to the same description. In other words, the macroscopic behavior of a system does not depend on the particular ensemble used for its description.
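The suppression of fluctuations in the thermodynamic limit can be made quantitative for independent subsystems, where the relative energy fluctuation falls off as 1/√N. A sketch with hypothetical two-level subsystems, in units where k_B = 1:

```python
import math

def energy_stats(N, beta=1.0, gap=1.0):
    """Mean energy and relative fluctuation sigma_E/<E> for N independent
    two-level systems (levels 0 and gap), in units where k_B = 1."""
    p = math.exp(-beta * gap) / (1.0 + math.exp(-beta * gap))  # upper-level occupation
    mean = N * p * gap                                         # <E> grows like N
    sigma = math.sqrt(N * p * (1.0 - p)) * gap                 # sigma_E grows like sqrt(N)
    return mean, sigma / mean

_, rel_small = energy_stats(10)
_, rel_large = energy_stats(1000)
# Growing N by a factor of 100 shrinks the relative fluctuation by a factor of 10
```

This is why the canonical ensemble, whose energy fluctuates, and the microcanonical ensemble, whose energy does not, make the same macroscopic predictions for large systems.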
Given these considerations, the best ensemble to choose for the calculation of the properties of a macroscopic system is that ensemble which allows the result to be derived most easily.
Classical thermodynamics vs. statistical thermodynamics
As an example, from a classical thermodynamics point of view one might ask what it is about a thermodynamic system of gas molecules, such as ammonia NH_{3}, that determines the free energy characteristic of that compound. Classical thermodynamics does not provide the answer. If, for example, we were given spectroscopic data of this body of gas molecules, such as bond length, bond angle, bond rotation, and flexibility of the bonds in NH_{3}, we should see that the free energy could not be other than it is. To prove this true, we need to bridge the gap between the microscopic realm of atoms and molecules and the macroscopic realm of classical thermodynamics. From physics, statistical mechanics provides such a bridge by teaching us how to conceive of a thermodynamic system as an assembly of units. More specifically, it demonstrates how the thermodynamic parameters of a system, such as temperature and pressure, are interpretable in terms of the parameters descriptive of such constituent atoms and molecules.^{[10]}
In a bounded system, the crucial characteristic of these microscopic units is that their energies are quantized. That is, where the energies accessible to a macroscopic system form a virtual continuum of possibilities, the energies open to any of its submicroscopic components are limited to a discontinuous set of alternatives associated with integral values of some quantum number.
See also
Notes
References
 Bibliography
Further reading
External links
 Stanford Encyclopedia of Philosophy.
 Sklogwiki - Thermodynamics, statistical mechanics, and the computer simulation of materials. SklogWiki is particularly orientated towards liquids and soft condensed matter.
 Statistical Thermodynamics - Historical Timeline
 Thermodynamics and Statistical Mechanics by Richard Fitzpatrick
 Lecture Notes in Statistical Mechanics and Mesoscopics by Doron Cohen