What is entropy?

Last updated: April 1, 2026

Quick Answer: Entropy is a thermodynamic measure of disorder and randomness in a physical system; the second law of thermodynamics states that the entropy of an isolated system never decreases over time, increasing toward a maximum at equilibrium.

Key Facts

Definition of Entropy

Entropy is a fundamental concept in thermodynamics and information theory that measures disorder, randomness, or the unavailability of energy in a system. The term, coined by Rudolf Clausius in 1865, combines the Greek "en" (in) with "tropē" (turning, transformation). In the physical sciences, entropy quantifies how much thermal energy is unavailable for conversion into mechanical work in a thermodynamic process.

Thermodynamic Entropy

In thermodynamics, entropy is a state function denoted by the symbol S and measured in joules per kelvin (J/K). It reflects the degree of molecular-level disorder within a substance: as temperature rises, molecular motion increases and entropy grows. The second law of thermodynamics, one of the most important principles in physics, states that the total entropy of an isolated system never decreases over time; the system evolves toward equilibrium, the state of maximum entropy.
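
For reference, here is a minimal sketch of the two standard textbook formulas behind these statements; the notation (δQ_rev, T, k_B, Ω) follows common convention and is not taken from this article.

```latex
% Clausius (thermodynamic) definition: the entropy change for a reversible
% transfer of heat \delta Q_{\mathrm{rev}} at absolute temperature T
dS = \frac{\delta Q_{\mathrm{rev}}}{T}

% Boltzmann (statistical) formula: entropy grows with the number of
% microstates \Omega compatible with the macrostate (k_B is Boltzmann's constant)
S = k_B \ln \Omega
```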

The Second Law of Thermodynamics

This fundamental law explains why natural processes are irreversible and why time appears to move in one direction. A hot cup of coffee naturally cools to room temperature, but the reverse never occurs spontaneously. The directionality arises because the entropy gained by the surroundings as heat disperses into them exceeds the entropy lost by the cooling coffee, so the total entropy still rises. The second law ensures that organized energy naturally disperses and becomes randomized over time.
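
As a rough numerical sketch of that bookkeeping (the mass, heat capacity, and temperatures below are assumed for illustration, not figures from this article), the coffee's entropy drops while the room's entropy rises by more:

```python
import math

# Assumed, illustrative values: 0.3 kg of coffee (treated as water),
# cooling from 80 C to a 20 C room.
m = 0.3            # mass of coffee, kg
c = 4186.0         # specific heat of water, J/(kg*K)
T_hot = 353.15     # initial coffee temperature (80 C), in kelvin
T_room = 293.15    # room temperature (20 C), in kelvin

# Entropy change of the coffee as it cools from T_hot to T_room:
# dS = m * c * ln(T_final / T_initial)  -> negative (the coffee becomes "more ordered")
dS_coffee = m * c * math.log(T_room / T_hot)

# The heat released is absorbed by the room, which is large enough
# that its temperature stays essentially constant at T_room.
Q = m * c * (T_hot - T_room)   # heat transferred to the surroundings, J
dS_room = Q / T_room           # entropy gained by the surroundings

print(f"coffee: {dS_coffee:+.1f} J/K")            # about -234 J/K
print(f"room:   {dS_room:+.1f} J/K")              # about +257 J/K
print(f"total:  {dS_coffee + dS_room:+.1f} J/K")  # net increase, as the second law requires
```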

Information Theory Entropy

In information theory and computer science, entropy measures the average information content or uncertainty in data. Introduced by Claude Shannon in 1948, information entropy quantifies how unpredictable or surprising a message is: a perfectly predictable message has zero entropy, while completely random data has maximum entropy. The concept is crucial in data compression, cryptography, and communication theory, helping engineers design efficient systems.
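
As a small, self-contained sketch (the example strings are invented for illustration), Shannon entropy can be estimated from symbol frequencies using H = -Σ p log2 p, measured in bits per symbol:

```python
from collections import Counter
import math

def shannon_entropy(message: str) -> float:
    """Average information content in bits per symbol: H = -sum(p * log2(p))."""
    counts = Counter(message)
    total = len(message)
    return -sum((n / total) * math.log2(n / total) for n in counts.values())

# A perfectly predictable message carries no information per symbol ...
print(shannon_entropy("aaaaaaaa"))   # 0.0 bits/symbol

# ... while a message using all its symbols equally often is maximally uncertain.
print(shannon_entropy("abcdabcd"))   # 2.0 bits/symbol (4 equally likely symbols)
```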

Practical Examples and Implications

Real-world examples include a melting ice cube (increasing disorder), a deck of cards becoming shuffled (increasing randomness), heat spreading from a hot object to its surroundings (energy dispersal), and a clean room becoming messy (increasing disorder). Entropy explains why perpetual motion machines cannot exist, why some usable energy is lost in every energy conversion, and why the universe tends toward disorder. Understanding entropy is essential in chemistry, physics, and engineering, and for understanding the fundamental nature of reality.

Related Questions

Why does entropy always increase?

Entropy increases because there are vastly more microscopic arrangements (microstates) corresponding to disordered macrostates than to ordered ones. Natural systems spontaneously drift toward the configurations that can be realized in the most ways, and those are overwhelmingly the disorganized ones. This tendency reflects fundamental probability rather than any external force.
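
A quick way to see the counting argument (the coin-flip model below is an assumed toy example, not something from this article) is to compare how many microstates realize an "ordered" macrostate versus a "mixed" one:

```python
from math import comb

N = 100  # number of coins, a toy stand-in for a system with many particles

# Macrostate "all heads": exactly one microstate realizes it.
ordered = comb(N, N)        # = 1

# Macrostate "half heads, half tails": enormously many microstates realize it.
mixed = comb(N, N // 2)     # roughly 1.0e29

print(ordered)   # 1
print(mixed)     # ~10^29
# Random shuffling almost certainly lands in the heavily populated macrostate,
# which is why systems drift toward "disorder" without any force pushing them.
```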

How does entropy relate to the arrow of time?

Entropy provides the most fundamental physical distinction between past and future. The microscopic laws of physics work essentially the same forward and backward in time, yet entropy consistently increases toward the future. This irreversible growth of disorder defines time's direction and explains why certain processes are never observed in reverse.

Can entropy ever decrease?

Entropy can decrease locally within a system if energy is supplied from outside, as when a refrigerator cools food. However, the total entropy of the universe still increases: local decreases require an input of work, and supplying that work produces a larger entropy increase elsewhere.
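
As a back-of-the-envelope sketch of that bookkeeping (the temperatures and heat values are assumed for illustration), a refrigerator that removes heat from its cold interior must dump even more heat into the warm kitchen, so the combined entropy still rises:

```python
# Assumed illustrative values for an idealized refrigerator.
T_cold = 275.0   # inside the fridge (~2 C), in kelvin
T_hot = 295.0    # kitchen air (~22 C), in kelvin
Q_cold = 1000.0  # heat removed from the food, J
W = 200.0        # electrical work driving the compressor, J

Q_hot = Q_cold + W  # heat rejected into the kitchen (energy conservation)

dS_inside = -Q_cold / T_cold   # entropy of the cold interior goes down
dS_kitchen = Q_hot / T_hot     # entropy of the kitchen goes up by more

print(f"inside:  {dS_inside:+.2f} J/K")                 # about -3.64 J/K
print(f"kitchen: {dS_kitchen:+.2f} J/K")                # about +4.07 J/K
print(f"total:   {dS_inside + dS_kitchen:+.2f} J/K")    # net positive, consistent with the second law
```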
