# Deep(ish) Dive: Entropy (Part 1)

While trying to understand certain topics of A-Level Chemistry & Physics, one concept proved to be quite the hurdle: entropy. These posts will detail my current understanding of it – if I do make mistakes do leave a comment! Now, let’s jump straight to the question:

## What is Entropy?

Textbook definitions include…

• a measure of disorder
• a measure of dispersion of energy

…which are not wrong, but don’t give us a lot of information. Let’s do a more ground-up approach.

First, we need to realise that the word entropy is used in 2 different fields: information theory & thermodynamics.
The different types of entropy are similar but do not describe the exact same things!

Here’s a summary:

• Information entropy measures the uncertainty in a message or probability distribution.
• Thermodynamic entropy measures the number of microscopic configurations consistent with a physical system’s macroscopic state.

Both use logarithms – and that’s no coincidence, as we’ll see in a future post.

In this post, we’ll cover thermodynamic entropy as described via statistical mechanics.

In future posts, we’ll look at:

• information entropy to understand why logarithms are used
• the more experimental definition of thermodynamic entropy used in Physics, Chemistry & Engineering!

I may or may not cover more nuances in future posts, but for now, let’s start with…

# Statistical Mechanics

Before defining Entropy, there are 2 concepts we need to understand: microstates & macrostates.

## First, let’s simplify things with an INCOMPLETE analogy:

Imagine we have some particles which can take specific positions in a box divided into halves A & B.
Particles can travel between the halves, & I take a snapshot of the box at a specific instant.
For now, let’s ignore the orientation of particles within each half & only look at which particles are in which half.
In this analogy, each exact arrangement (exactly which particles sit in A & which sit in B) is a microstate, while the overall count (e.g. “2 particles in A, 1 in B”) is a macrostate.

### The NUMBER of microstates for a certain macrostate is known as its multiplicity, Ω.

From this example, you may notice that some macrostates contain MORE microstates than others. We can map them out using a…

### Probability Distribution

This system only has 3 particles – what if there were more?
Let’s try 4…

As you can see, when the number of particles in the system increases,

• more microstates exist
• more macrostates exist
• the multiplicity for uniform macrostates becomes very large while the multiplicity for non-uniform macrostates becomes very small
• the macrostates with uniform distributions start to become much more probable than others
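The pattern above can be made concrete with a short Python sketch (the function name is mine, not from the post): for N distinguishable particles split between halves A & B, the macrostate “k particles in A” has a multiplicity equal to the binomial coefficient C(N, k).

```python
from math import comb

def multiplicities(n):
    """Multiplicity Ω of each macrostate 'k particles in half A'
    for n distinguishable particles in a two-half box."""
    return {k: comb(n, k) for k in range(n + 1)}

dist = multiplicities(4)
total = 2 ** 4                      # every microstate is equally likely
for k, omega in dist.items():
    print(f"{k} in A, {4 - k} in B: Ω = {omega}, P = {omega / total:.4f}")
```

Running this for 4 particles reproduces the idea above: the uniform macrostate (2 in A, 2 in B) has the highest multiplicity (Ω = 6 of the 16 microstates), so it is the most probable.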

If we keep increasing the number of particles up to billions & billions, the probability distribution would look more like this:

The above example relies on an assumption:

• ALL microstates are equally likely.
• All macrostates are NOT equally likely.
• The probability of any macrostate depends ONLY on the number of microstates it contains.

This is an important concept to understand – see if these statements make sense before moving on. If they don’t, here’s a great video to visualise this. There’s also a hilarious quote from Professor Feynman related to this!

With this assumption, we can conclude a few things:

1. A macrostate with more microstates is MORE PROBABLE.
Higher Multiplicity = More Likely to Occur.
2. With a large number of particles, it is MUCH more likely for macrostates with uniform distributions to occur.

In real life, systems include billions of billions of particles – so these conclusions almost always apply!
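To see how quickly this takes over, here’s a quick numerical check (a sketch of my own, not from the original post) of how likely a “roughly uniform” macrostate already is for just 1,000 particles:

```python
from math import comb

n = 1000
total = 2 ** n   # total number of equally likely microstates

# Probability that between 45% and 55% of the particles are in half A
p_roughly_uniform = sum(comb(n, k) for k in range(450, 551)) / total
print(f"P(45-55% in A) = {p_roughly_uniform:.4f}")
```

Even at a mere 1,000 particles, the answer is already above 99%. Real systems have ~10²⁴ particles, at which point the non-uniform macrostates are, for all practical purposes, impossible.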

Hold on to this idea – we’ll return to it at the end.

## Wait! That example was incomplete.

We only considered the position of those particles.

When dealing with thermodynamic systems, the complete definitions are:

• Microstate: one exact specification of every particle’s position, velocity, rotation, energy, etc.
• Macrostate: the overall, measurable thermodynamic state of the system – its pressure, volume, temperature, amount of substance, etc.

For example, imagine 5 moles of a gas.

• An example of 1 Macrostate: the gas having a Pressure of 100 kPa, a Temperature of 298 K, a Volume of ~124 dm³ (from the ideal gas law), & containing 5 moles of atoms.
• Under this macrostate, there are a HUGE number of Microstates: each one is a different list of the exact location, velocity & rotation of each of the 3.01 × 10²⁴ atoms.

Changing the position or velocity of a single particle in this system will alter its microstate, BUT if the thermodynamic properties (P, V, T, n, etc.) of the overall gas remain the same, its macrostate has NOT changed!

Some macrostates have far more microstates than others:

Just like before, macrostates with a HIGHER multiplicity will be MORE PROBABLE.

So what is Thermodynamic Entropy?

## A measure of the number of microstates which result in a certain thermodynamic macrostate.

• For the first example above, there are a HUGE number of microstates (because there are ~10^(10^(10^…)) ways the particles’ positions, velocities, rotations, etc. could be distributed to result in those gas conditions). Thus, we say it has a HIGH ENTROPY.
• Compare this to the same substance at absolute zero. There are far fewer ways the particles could be arranged (because they all must have 0 kinetic energy). This system has a LOW ENTROPY.

How exactly is entropy related to the number of microstates?
The Entropy of a macrostate is directly proportional to the LOGARITHM of its multiplicity.
A common version of the entropy formula is:

## S = k ln Ω

Where
S = entropy
k = Boltzmann Constant, 1.38 × 10⁻²³ J K⁻¹
Ω = multiplicity (number of microstates)

Note: this is the simplified formula described by Boltzmann – it is useful for systems at equilibrium, but might deviate in some systems (see notes at the end of this post)!

For example, a macrostate with a multiplicity of Ω = 10⁹⁹ has an entropy of S = k ln(10⁹⁹) ≈ 3.1 × 10⁻²¹ J K⁻¹.

If you notice, even a large Ω like 10⁹⁹ still results in quite a small entropy value (on the order of 10⁻²¹ J K⁻¹).
In reality, substances have multiplicities more like 10^(10^(10^…)), since there are so many ways to distribute energies & velocities & positions & rotations & … (the list goes on)!
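As a sanity check on these numbers, here’s a minimal sketch. Since Ω = 10⁹⁹ would overflow an ordinary float, we work with ln Ω directly:

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def boltzmann_entropy(ln_omega):
    """S = k ln Ω, written in terms of ln(Ω) to avoid overflowing floats."""
    return k_B * ln_omega

# A single microstate (perfect order): S = k ln 1 = 0
print(boltzmann_entropy(math.log(1)))          # 0.0

# Ω = 10^99  →  ln Ω = 99 ln 10
S = boltzmann_entropy(99 * math.log(10))
print(f"S = {S:.2e} J/K")                      # on the order of 10^-21 J/K
```

Note how even an astronomically large Ω gets crushed down by the logarithm & the tiny Boltzmann constant.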

It’s impossible to identify each microstate, so there are other ways of calculating entropy for real substances. I won’t be covering them here, but there will be resources below if you’d like to learn more.

Here are some standard entropy values of real substances:

You might be wondering,

Why is Entropy measured in logarithms?
We will briefly touch Information Theory in a future post, but for now, it’s good to know that logarithms are useful in some ways:

• Logarithms reduce HUGE Ω into smaller, workable numbers
• Combining 2 systems into 1 MULTIPLIES their multiplicities! Think about probabilities: for 2 events A AND B to both occur, you multiply P(A) & P(B). The same goes for counting microstates.
• Combining 2 systems into 1 therefore ADDS their entropies.

Since log(A) + log(B) = log(AB),
using ln is handy when dealing with multiple systems!
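Here’s a tiny numerical check of that additivity (a sketch with made-up multiplicities):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

omega_A, omega_B = 1e30, 1e20   # made-up multiplicities of two systems

S_A = k_B * math.log(omega_A)
S_B = k_B * math.log(omega_B)

# Combining the systems MULTIPLIES the multiplicities: Ω = Ω_A × Ω_B...
S_combined = k_B * math.log(omega_A * omega_B)

# ...which is the same as ADDING the individual entropies
print(math.isclose(S_combined, S_A + S_B))   # True
```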

Almost done with this post!

You might have heard the statement:

Entropy must always increase in the Universe.

– Physics, I guess

But, WHY must Entropy always increase?
PROBABILITY.

As we have seen before,

### Macrostates with a HIGHER multiplicity will be MORE PROBABLE.

The DIFFERENCE in probability is HUGE:

• you are 99.99999999…% likely to find a gas uniformly distributed in a container
• you are 0.000000000…% likely to find all the gas particles bunched up in a corner
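Those percentages fall straight out of the two-half box from earlier: each particle is independently on either side, so the chance that all N of them bunch up in one half is (1/2)^N. A quick sketch:

```python
def prob_all_in_one_half(n_particles):
    """Each particle lands in half A or half B independently,
    so P(all in half A) = (1/2)^n."""
    return 0.5 ** n_particles

print(prob_all_in_one_half(10))    # ~0.001 – plausible for a tiny system
print(prob_all_in_one_half(100))   # ~8e-31 – effectively never
```

For a real sample of gas (N ~ 10²⁴ particles), the probability is so small it would not happen once in the lifetime of the Universe.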

Thus, it is practically impossible for a system to spontaneously take a macrostate with low entropy. It will always tend towards macrostates with high entropy.

This is the 2nd Law of Thermodynamics.

The Entropy of an isolated system can never decrease over time.

– The 2nd Law of Thermodynamics

This is why, when isolated from their surroundings, systems always evolve towards their highest-entropy macrostate: gases spread out to fill their containers, heat spreads from hot regions to cold ones, & mixtures never spontaneously un-mix.

But, I can compress gas using a pump! I can freeze water into ice! I can rearrange pieces of glass back in place! Doesn’t this decrease entropy?
It DOES decrease entropy – locally.

The problem is that doing these things causes entropy elsewhere to increase:

1. The pump requires you to do work on the gas, & the gas heats up (heating up the surroundings)
2. A refrigerator’s pipes heat up, & the heat escapes to the air
3. Your muscles generate heat as you pick up & rearrange the pieces

No matter what you try, entropy will always increase in the system as a whole. Since we all exist in a huge system called “The Universe”, entropy must always increase in the Universe over time!

The fact that all of these processes involve ‘heat’ is a hint at the relationship between heat & entropy – a topic we will cover next time.

# Notes & Resources

1. Note: The assumption that all microstates are equally probable MAY NOT be true!
• For complex systems in real life (fluids in a chamber, a car engine, your body, etc.), some microstates are more likely if the system is not at equilibrium. This requires a more complete description known as Gibbs Entropy.
• For simple systems at equilibrium, microstates can be considered equally likely. For these, the Boltzmann Entropy formula (S = k ln Ω) suffices. Many sources quote this formula since it’s useful in basic physics problems.
2. Resources:
3. Relevant posts on this site:
4. Other concepts you might be interested in (a lot more in-depth so you might want to revisit these once you have a solid grasp of entropy):