Entropy is a fundamental concept in physics, chemistry, information theory, and many other fields. It measures the amount of disorder, randomness, or uncertainty in a system.
What is Entropy?
- In thermodynamics, entropy is a measure of the number of possible microscopic configurations (microstates) that correspond to a system's macroscopic state.
- It quantifies the degree of disorder or randomness in a physical system.
- Entropy is often associated with the second law of thermodynamics, which states that the total entropy of an isolated system can never decrease over time; natural processes therefore tend to move towards greater disorder.
Thermodynamic Definition
- For a reversible process, the change in entropy $\Delta S$ is defined as:

  $$\Delta S = \frac{Q_{\text{rev}}}{T}$$

  where:
  - $Q_{\text{rev}}$ is the heat added reversibly to the system
  - $T$ is the absolute temperature (in kelvin)
- The unit of entropy is the joule per kelvin (J/K); a short numerical sketch follows this list.
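As a quick numerical illustration (the values below are standard textbook figures assumed for this sketch, not taken from the text above), applying $\Delta S = Q_{\text{rev}}/T$ to the reversible melting of 1 kg of ice at 0 °C looks like this in Python:

```python
# Minimal sketch: entropy change for reversibly melting 1 kg of ice at 0 degC.
# The latent heat of fusion (~334 kJ/kg) is an assumed textbook value.

LATENT_HEAT_FUSION = 334e3   # J/kg
T_MELT = 273.15              # K, melting point of ice

mass = 1.0                            # kg of ice
q_rev = mass * LATENT_HEAT_FUSION     # heat added reversibly, in joules
delta_s = q_rev / T_MELT              # Delta S = Q_rev / T

print(f"Delta S ~ {delta_s:.0f} J/K")  # about 1223 J/K
```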
Statistical Mechanics Interpretation
- Ludwig Boltzmann connected entropy to probability with his famous formula:

  $$S = k_B \ln \Omega$$

  where:
  - $S$ is the entropy
  - $k_B$ is Boltzmann's constant ($1.38 \times 10^{-23}\,\text{J/K}$)
  - $\Omega$ is the number of microstates (possible microscopic arrangements) corresponding to the macrostate
- This means entropy measures how many ways the particles in a system can be arranged without changing its overall appearance (see the sketch below).
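As a minimal sketch of the Boltzmann formula (the toy system of independent two-state particles is an assumption made here for illustration), a system of $N$ such particles has $\Omega = 2^N$ microstates, so $S = k_B N \ln 2$:

```python
import math

# Minimal sketch: Boltzmann entropy S = k_B * ln(Omega) for a toy system of
# N independent two-state particles (assumed example), where Omega = 2**N.

K_B = 1.38e-23  # Boltzmann's constant, J/K

def boltzmann_entropy(n_particles: float) -> float:
    """Uses ln(2**N) = N * ln(2) so Omega never has to be computed explicitly."""
    return K_B * n_particles * math.log(2)

print(boltzmann_entropy(100))     # ~9.6e-22 J/K for 100 particles
print(boltzmann_entropy(6.0e23))  # ~5.7 J/K for roughly a mole of particles
```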
Information Theory
- In information theory, entropy measures the uncertainty or information content of a message or random variable.
- Introduced by Claude Shannon, the entropy $H$ of a discrete random variable $X$ with possible values $x_i$ and probabilities $p_i$ is:

  $$H(X) = -\sum_i p_i \log_2 p_i$$

- This entropy tells you how much information (in bits) is expected per message symbol; a short sketch follows.
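As a minimal sketch (the example distributions are chosen here purely for illustration), Shannon entropy can be computed directly from a list of probabilities:

```python
import math

def shannon_entropy(probabilities):
    """H(X) = -sum(p * log2(p)), in bits; terms with p = 0 contribute nothing."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

print(shannon_entropy([0.5, 0.5]))   # fair coin: 1.0 bit
print(shannon_entropy([0.9, 0.1]))   # biased coin: ~0.47 bits
print(shannon_entropy([0.25] * 4))   # uniform over 4 symbols: 2.0 bits
```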
Why is Entropy Important?
- Entropy helps explain why certain physical processes are irreversible (e.g., why heat flows spontaneously from hot to cold).
- It underpins the arrow of time: the direction in which time flows corresponds to increasing entropy.
- In chemistry, entropy combines with enthalpy in the Gibbs free energy to determine whether a reaction is spontaneous (see the sketch after this list).
- In computing and data compression, entropy sets the lower limit on how compactly data can be encoded.
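As a hedged illustration of the Gibbs free energy criterion (the enthalpy and entropy of fusion used below are standard textbook values for melting ice, assumed here), a process is spontaneous when $\Delta G = \Delta H - T\Delta S$ is negative:

```python
# Minimal sketch: spontaneity of ice melting via Delta G = Delta H - T * Delta S.
# Delta H ~ +6.01 kJ/mol and Delta S ~ +22.0 J/(mol*K) are assumed textbook values
# for the fusion of water.

DELTA_H = 6010.0   # J/mol, enthalpy of fusion
DELTA_S = 22.0     # J/(mol*K), entropy of fusion

def gibbs_free_energy(temperature_k: float) -> float:
    """Delta G = Delta H - T * Delta S; negative means spontaneous."""
    return DELTA_H - temperature_k * DELTA_S

for T in (263.15, 273.15, 283.15):  # -10 degC, 0 degC (near equilibrium), +10 degC
    dG = gibbs_free_energy(T)
    verdict = "spontaneous" if dG < 0 else "not spontaneous"
    print(f"T = {T:.2f} K: Delta G = {dG:+.0f} J/mol -> {verdict}")
```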
Examples of Entropy
- Ice melting into water: entropy increases because the water molecules become more disordered.
- Mixing two gases: entropy increases as the particles distribute themselves more randomly (see the sketch below).
- A shuffled deck of cards has higher entropy than an ordered deck.
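As a final hedged sketch (the 50/50 and 90/10 compositions are assumed examples), the ideal entropy of mixing, $\Delta S_{\text{mix}} = -nR\sum_i x_i \ln x_i$, quantifies the gas-mixing example:

```python
import math

# Minimal sketch: ideal entropy of mixing, Delta S_mix = -n * R * sum(x_i * ln(x_i)).
# The compositions below are assumed purely for illustration.

R = 8.314  # gas constant, J/(mol*K)

def entropy_of_mixing(total_moles: float, mole_fractions) -> float:
    """Ideal-gas entropy of mixing in J/K; zero terms are skipped (x * ln(x) -> 0)."""
    return -total_moles * R * sum(x * math.log(x) for x in mole_fractions if x > 0)

print(entropy_of_mixing(1.0, [0.5, 0.5]))  # ~5.76 J/K for 1 mol split evenly
print(entropy_of_mixing(1.0, [0.9, 0.1]))  # ~2.70 J/K; a less even split gains less entropy
```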