Deep(ish) Dive: Entropy (Part 1)

While I was studying certain topics in A-Level Chemistry & Physics, one concept proved to be quite the hurdle: entropy. These posts will detail my current understanding of it – if I make any mistakes, do leave a comment! Now, let’s jump straight to the question:

What is Entropy?

Textbook definitions include…

  • a measure of disorder
  • a measure of dispersion of energy

…which are not wrong, but don’t give us a lot of information. Let’s take a more ground-up approach.

First, we need to realise that the word entropy is used in 2 different fields: information theory & thermodynamics.
The different types of entropy are similar but do not describe the exact same things!

Here’s the plan:

In this post, we’ll cover thermodynamic entropy as described via statistical mechanics.

In future posts, we’ll look at:

  • information entropy to understand why logarithms are used
  • the more experimental definition of thermodynamic entropy used in Physics, Chemistry & Engineering!

I may or may not cover more nuances in future posts, but for now, let’s start with…


Statistical Mechanics

Before defining Entropy, there are 2 concepts we need to understand:

Microstate: the specific arrangement & conditions of the components of a system at a specific time
Macrostate: the overall condition of an entire system at a given time

First, let’s simplify things with an INCOMPLETE analogy:

Imagine we have some particles which can take specific positions in a box, divided into halves A & B.
Particles can travel between the halves, & I take a snapshot of the box at a specific instant.
For now let’s ignore the orientation of particles within each section, & only look at which particles are in which section.

Each specific arrangement of particles is like a separate microstate. Here are all the possible microstates:



There are 8 microstates in total.
Every microstate which results in the same overall condition of the system is grouped into the same macrostate. Let’s group them by the number of particles in A & B:



There are 4 macrostates in total.

The NUMBER of microstates for a certain macrostate is known as its multiplicity, Ω.
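
If you like code, here’s a tiny Python sketch (my own illustration, using nothing but the standard library) that enumerates these 8 microstates & groups them into the 4 macrostates:

```python
from itertools import product
from collections import Counter

# Each of the 3 particles sits in half 'A' or half 'B' of the box.
N = 3
microstates = list(product("AB", repeat=N))
print(len(microstates), "microstates")  # 8 microstates

# Group microstates into macrostates by the number of particles in A.
multiplicities = Counter(state.count("A") for state in microstates)
for in_A in sorted(multiplicities, reverse=True):
    print(f"{in_A} in A, {N - in_A} in B: multiplicity Ω = {multiplicities[in_A]}")
```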

From this example, you may notice that some macrostates contain MORE microstates than others. We can map them out using a…

Probability Distribution

Macrostate     | Multiplicity, Ω | Probability, P
3 in A, 0 in B | 1               | 1/8 = 12.5%
2 in A, 1 in B | 3               | 3/8 = 37.5%
1 in A, 2 in B | 3               | 3/8 = 37.5%
0 in A, 3 in B | 1               | 1/8 = 12.5%

This system only has 3 particles – what if there were more?
Let’s try 4…

Macrostate     | Ω | P
4 in A, 0 in B | 1 | 6.25%
3 in A, 1 in B | 4 | 25%
2 in A, 2 in B | 6 | 37.5%
1 in A, 3 in B | 4 | 25%
0 in A, 4 in B | 1 | 6.25%
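
These multiplicities are simply binomial coefficients: the number of ways to choose which particles sit in half A. Here’s a quick sketch (again, just an illustration) to reproduce the table above for any number of particles:

```python
from math import comb

def show_distribution(N):
    """Print each macrostate's multiplicity & probability for N particles."""
    total_microstates = 2 ** N
    for in_A in range(N, -1, -1):
        omega = comb(N, in_A)  # ways to choose which particles are in A
        print(f"{in_A} in A, {N - in_A} in B: "
              f"Ω = {omega}, P = {omega / total_microstates:.2%}")

show_distribution(4)
# 4 in A, 0 in B: Ω = 1, P = 6.25%
# 3 in A, 1 in B: Ω = 4, P = 25.00%
# 2 in A, 2 in B: Ω = 6, P = 37.50%
# 1 in A, 3 in B: Ω = 4, P = 25.00%
# 0 in A, 4 in B: Ω = 1, P = 6.25%
```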

As you can see, when the number of particles in the system increases,

  • more microstates exist
  • more macrostates exist
  • the multiplicity for uniform macrostates becomes very large while the multiplicity for non-uniform macrostates becomes very small
  • the macrostates with uniform distributions start to become much more probable than others

If we keep increasing the number of particles up to billions & billions, the probability distribution would look more like this:

A normal distribution!
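
You can check this numerically: as the particle count grows, almost all of the probability piles up in a narrow band around the 50/50 split. A rough sketch (exact binomial sums, so it gets slow for huge N):

```python
from math import comb

def prob_within_band(N, band=0.01):
    """P(the fraction of particles in A is within ±band of 1/2)."""
    lo = int(N * (0.5 - band))
    hi = int(N * (0.5 + band))
    return sum(comb(N, k) for k in range(lo, hi + 1)) / 2 ** N

for N in (100, 1_000, 10_000):
    print(f"N = {N:>6}: P(within 1% of 50/50) ≈ {prob_within_band(N):.2f}")
# N =    100: P(within 1% of 50/50) ≈ 0.24
# N =   1000: P(within 1% of 50/50) ≈ 0.49
# N =  10000: P(within 1% of 50/50) ≈ 0.96
```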

The above example relies on an assumption:

ALL microstates are equally likely.

From this, it follows that macrostates are NOT equally likely – the probability of any macrostate depends ONLY on the number of microstates it contains.

This is an important concept to understand. See if these statements make sense before moving on:

It is EQUALLY PROBABLE for the particles to be in any specific arrangement…

These 2 arrangements are equally likely.
You have an equal probability of finding the gas in either of these exact arrangements.
However,
it is MUCH MORE PROBABLE for the particles to be distributed uniformly
than it is for them to collect on one side of the box


Thus, at any moment, you are more likely to find the gas spread out than collected in one corner!
Got it? If not, here’s a great video to visualise this.
There’s also a hilarious quote from Professor Feynman related to this!

With this assumption, we can conclude a few things:

  1. A macrostate with more microstates is MORE PROBABLE.
    Higher Multiplicity = More Likely to Occur.
  2. With a large number of particles, it is MUCH more likely for macrostates with uniform distributions to occur.

In real life, systems include billions of billions of particles – so these conclusions almost always apply!

Hold on to this idea – we’ll return to it at the end.


Wait! That example was incomplete.

We only considered the position of those particles.

How about a complete picture?
When dealing with thermodynamic systems, the complete definitions are:

Microstate: a set of the microscopic properties (position, velocity, energy level, etc.) of each individual particle of a system
Macrostate: a set of the macroscopic thermodynamic properties (temperature, pressure, volume, density, etc.) possessed by the system as a whole

For example, imagine 5 moles of a gas.

  • An example of 1 Macrostate: the gas having a Pressure of 100 kPa, a Temperature of 298 K, a Volume of ~124 dm³ (the volume follows from the other values via PV = nRT), & containing 5 moles of atoms.
  • Under this macrostate, there are a HUGE number of Microstates: each one is a different list of the exact location, velocity & rotation of each of the 3.01 × 10²⁴ atoms.

Changing the position or velocity of a single particle in this system will alter its microstate, BUT if the thermodynamic properties (P, V, T, n, etc.) of the overall gas remain the same, its macrostate has NOT changed!
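
As a loose illustration (a toy model of my own, not a real physics simulation): treat a microstate as the list of every particle’s velocity, & a macrostate as an aggregate property like the mean kinetic energy. Shuffling which particle has which velocity produces a different microstate but the exact same macrostate:

```python
import math
import random

# Toy microstate: each particle's (1D) velocity, in arbitrary units (m = 1).
velocities = [random.gauss(0.0, 1.0) for _ in range(1000)]

def mean_kinetic_energy(vels):
    """A macroscopic summary of the whole system: mean KE per particle."""
    return sum(0.5 * v * v for v in vels) / len(vels)

before = mean_kinetic_energy(velocities)
random.shuffle(velocities)  # particles trade velocities: a NEW microstate
after = mean_kinetic_energy(velocities)
print(math.isclose(before, after))  # True: the macrostate is unchanged
```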

Some macrostates have far more microstates than others:

This macrostate has a HUGE number of microstates: a substance is at 298 K at a specific pressure, volume, density, etc.
It has a HUGE multiplicity.

This macrostate has only 1 microstate: a substance is at 0 K at a specific pressure, volume, density, etc.
All particles are completely motionless & possess no kinetic energy.
It has a multiplicity of 1.

Just like before, macrostates with a HIGHER multiplicity will be MORE PROBABLE.


So what is Thermodynamic Entropy?

A measure of the number of microstates which result in a certain thermodynamic macrostate.

  • For the first example above, there are a HUGE number of microstates (because there are 10^(10^(10^…)) ways the particles’ positions, velocities, rotations, etc. could be distributed to result in those gas conditions). Thus, we say it has a HIGH ENTROPY.
  • Compare this to the substance at absolute zero. There are far fewer ways the particles could be arranged (they must all have 0 kinetic energy). This system has a LOW ENTROPY.

How exactly is entropy related to the number of microstates?
The Entropy of a macrostate is directly proportional to the LOGARITHM of its multiplicity.
A common version of the entropy formula is:

S = k ln Ω

Where
S = entropy
k = Boltzmann Constant, 1.38 × 10⁻²³ J K⁻¹
Ω = multiplicity (number of microstates)

Note: this is the simplified formula described by Boltzmann – it works well for systems at equilibrium, but can deviate for other systems (see the notes at the end of this post)!

For example,

A system with 10⁹⁹ microstates (Ω = 10⁹⁹) has an entropy of 3.15 × 10⁻²¹ J K⁻¹.
A system with 1 microstate (Ω = 1) has an entropy of 0 J K⁻¹.
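
Here’s that calculation as a minimal sketch (plain Python; the second helper just rewrites ln(10^e) as e × ln 10 so we never have to store a huge Ω):

```python
import math

k = 1.38e-23  # Boltzmann constant, J K^-1

def boltzmann_entropy(omega):
    """S = k ln Ω (for an Ω small enough to fit in a float)."""
    return k * math.log(omega)

print(boltzmann_entropy(1e99))  # ≈ 3.15e-21 J K^-1
print(boltzmann_entropy(1))     # 0.0

# Real multiplicities are far too big to store directly, but we only ever
# need ln Ω -- so for Ω = 10^e we can use ln(10^e) = e * ln(10):
def entropy_from_power_of_ten(e):
    return k * e * math.log(10)

print(entropy_from_power_of_ten(99))  # same answer: ≈ 3.15e-21 J K^-1
```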

If you notice, even a large Ω like 10⁹⁹ still results in quite a small entropy value (on the order of 10⁻²¹ J K⁻¹).
In reality, substances have multiplicities like 10^(10^(10^…)), since there are so many ways to distribute energies & velocities & positions & rotations & … (the list goes on)!

It’s impossible to identify each microstate, so there are other ways of calculating entropy for real substances. I won’t be covering them here, but there will be resources below if you’d like to learn more.

Here are some standard entropy values of real substances:

Substance | Standard Molar Entropy (entropy per mole at standard conditions)
Hydrogen Gas | 131 J K⁻¹ mol⁻¹
Liquid Water | 70 J K⁻¹ mol⁻¹
Oxygen Gas | 205 J K⁻¹ mol⁻¹
Graphite (Carbon) | 6 J K⁻¹ mol⁻¹
Diamond (also Carbon, but a different allotrope) | 2 J K⁻¹ mol⁻¹
NaCl | 72 J K⁻¹ mol⁻¹
Sources: [1], [2]

You might be wondering,

Why is Entropy measured in logarithms?
We will briefly touch on Information Theory in a future post, but for now, it’s good to know that logarithms are useful in a few ways:

  • Logarithms reduce HUGE Ω into smaller, workable numbers
  • Combining 2 systems into 1 MULTIPLIES their multiplicities! Think about probabilities: for 2 events A AND B to both occur, you multiply P(A) & P(B). The same idea applies to counting microstates.
  • Combining 2 systems into 1 ADDS their entropies.

Since log(A) + log(B) = log(AB),
using ln is handy when dealing with multiple systems!
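
To make that concrete: imagine combining a system A (multiplicity Ω_A) with a system B (multiplicity Ω_B). Every microstate of A can pair with every microstate of B, so Ω_total = Ω_A × Ω_B, & therefore:

S_total = k ln(Ω_A × Ω_B) = k ln Ω_A + k ln Ω_B = S_A + S_B

The entropies simply add, just like other ‘extensive’ quantities such as mass or volume.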


Almost done with this post!

You might have heard the statement:

Entropy must always increase in the Universe.

– Physics, I guess

But, WHY must Entropy always increase?
PROBABILITY.

As we have seen before,

Macrostates with a HIGHER multiplicity will be MORE PROBABLE.

The DIFFERENCE in probability is HUGE:

  • you are 99.99999999…% likely to find a gas uniformly distributed in a container
  • you are 0.000000000…% likely to find all the gas particles bunched up in a corner

Thus, it is practically impossible for a system to spontaneously take a macrostate with low entropy. It will always tend towards macrostates with high entropy.
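
To put a rough number on this, assume each of N particles independently ends up in either half of a container; the probability that ALL of them land in the same specific half is (1/2)^N. A quick sketch for a mole-scale N:

```python
import math

N = 3.01e24  # ~5 moles of particles, as in the earlier gas example

# (1/2)^N is far too small to hold in a float, so work with log10 instead:
log10_p = -N * math.log10(2)
print(f"P(all particles in half A) = 10^({log10_p:.3g})")
# P(all particles in half A) = 10^(-9.06e+23)
```

That’s a probability whose exponent alone has 24 digits – for every practical purpose, zero.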

This is the 2nd Law of Thermodynamics.

The Entropy of an isolated system can never decrease over time.

– The 2nd Law of Thermodynamics

This is why, when isolated from surroundings,

Gases will diffuse to fill a larger space
Heat will transfer from hot to cold
(e.g. ice will melt when placed in a warm room)
Pieces of shattered glass will scatter away from their origin

But, I can compress gas using a pump! I can freeze water into ice! I can rearrange pieces of glass back in place! Doesn’t this decrease entropy?
It DOES decrease entropy – locally.

The problem is that doing these things causes entropy elsewhere to increase:

  1. The pump requires you to do work on the gas, & the gas heats up (heating up the surroundings)
  2. A refrigerator’s pipes heat up, & the heat escapes to the air
  3. Your muscles generate heat as you pick up & rearrange the pieces

No matter what you try, entropy will always increase in the system as a whole. Since we all exist in a huge system called “The Universe”, entropy must always increase in the Universe over time!

The fact that all of these processes involve ‘heat’ is a hint at the relationship between heat & entropy – a topic we will cover next time.


Notes & Resources

  1. Note: The assumption that all microstates are equally probable MAY NOT be true!
    • For complex systems in real life (fluids in a chamber, a car engine, your body, etc.), some microstates are more likely than others if the system is not at equilibrium. This requires a more complete description known as Gibbs Entropy (see the formula after this list).
    • For simple systems at equilibrium, microstates can be considered equally likely. For these, the Boltzmann Entropy formula (S = k ln Ω) suffices. Many sources quote this formula since it’s useful in basic physics problems.
  2. Resources:
  3. Relevant posts on this site:
  4. Other concepts you might be interested in (a lot more in-depth so you might want to revisit these once you have a solid grasp of entropy):
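
For the curious, the Gibbs form mentioned in note 1 looks like this:

S = −k Σᵢ pᵢ ln pᵢ

where pᵢ is the probability of the system being in microstate i. If all Ω microstates are equally likely (every pᵢ = 1/Ω), this collapses back to the Boltzmann formula, S = k ln Ω.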
