Entropy Explained, With Sheep

From Melting Ice Cubes to a Mystery About Time

By Aatish Bhatia

Let’s start with a puzzle.

Why does this gif look totally normal...

Ice melting in a glass
Image: Moussa / Public Domain

...but this one look strange?

Ice melting in a glass, played in reverse

The second gif is just the first one played in reverse. But something about it immediately seems off. This just never happens. Ice melts on a warm day, but a glass of water left out will never morph into neatly-stacked cubes of ice.

But here’s the weird thing. Imagine you could zoom in and see the atoms and molecules in a melting cube of ice. If you could film the motion of any particle, and then play that film back in reverse, what you’d see would still be perfectly consistent with the laws of physics. It wouldn’t look unusual at all. The movements of the atoms and molecules in the first gif are every bit as ‘legal’ (in the court of physical law) as those in the second gif. So why is the first gif an everyday occurrence, while the reverse one is impossible?

This isn’t just about ice cubes. Imagine you dropped an egg on the floor. Every atomic motion taking place in this messy event could have happened in reverse. The pieces of the egg could theoretically start on the floor, hurtle towards each other, reform into an egg as they lift off the ground, travel up through the air, and arrive gently in your hand. The movement of every atom in this time-reversed egg would still be perfectly consistent with the laws of physics. And yet, this never happens.

The Origin of Irreversibility

So there’s a deep mystery lurking behind our seemingly simple ice-melting puzzle. At the level of microscopic particles, nature doesn’t have a preference for doing things in one direction versus doing them in reverse. The atomic world is a two-way street.

And yet, for some reason, when we get to large collections of atoms, a one-way street emerges for the direction in which events take place, even though this wasn’t present at the microscopic level. An arrow of time emerges.


Why is this?

You might’ve heard an explanation that goes like this: whenever you drop an egg, or melt an ice cube, or shatter a wine glass, you’ve increased the entropy of the world. You might also have heard the phrase, “entropy always increases”. In other words, things are only allowed to happen in one direction — the direction in which entropy increases.

But this doesn’t answer the question, it just replaces it with a new set of questions.

What is entropy, really? Why does it always keep increasing? Why don’t eggshells uncrack, or wine glasses unshatter? In this piece, my goal is to give you the tools to answer these questions.

Going down this road leads us to some of the biggest unanswered questions about the cosmos: how did our universe begin, how will it end, and why is our past different from our future?

Counting Sheep

So let’s get started. First, I’d like you to picture some sheep.

Confuddled sheep.

Say we’ve got three sheep. These sheep are shuffling about in a farm, pretty much at random. And this farm is split into three plots of land.

There are 10 different ways that these 3 sheep can be arranged in 3 plots of land. Try to find them all, by dragging the sheep around below.

Did you find all 10 arrangements? If you missed any, here they all are:

10 Ways To Arrange 3 Sheep in 3 Plots of Land
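If you'd rather check that count than hunt for arrangements by hand, here's a minimal sketch in Python (my own addition, not part of the original interactive) that lists every way to split 3 sheep across 3 plots:

```python
from itertools import product
from math import comb

SHEEP, PLOTS = 3, 3

# An arrangement is a tuple (sheep in plot 1, sheep in plot 2, sheep in plot 3)
# whose entries add up to the total number of sheep.
arrangements = [counts for counts in product(range(SHEEP + 1), repeat=PLOTS)
                if sum(counts) == SHEEP]

print(len(arrangements))               # 10
print(comb(SHEEP + PLOTS - 1, SHEEP))  # also 10, via the 'stars and bars' formula
```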

So why are we picturing this pastoral scene? Because we can use it to understand the physics of solids.

When you heat a solid, you're adding energy to it. We usually think of energy as something continuous, something that flows. But when you get down to the atomic level, quantum mechanics teaches us that energy comes in discrete chunks.

In the quantum picture, you can think of every atom like a tiny bucket for energy, into which we can place any number of energy packets.

analogy between energy and sheep

Just as the sheep wander about the plots of land in the farm, these packets of energy randomly shuffle among the atoms in the solid. So our imaginary farm is really a model of a solid, with energy (sheep) shuffled between atoms (plots of land).

Back to the farm. We saw that there were 10 ways to arrange 3 sheep among 3 plots of land. But what if, instead, there were more sheep wandering more plots of land? How many arrangements would there be? Find out by pressing the blue buttons below.

That escalated quickly.

You can see that, as you add more sheep or plots of land, the number of possible sheep arrangements grows exponentially.

Translating back to the solid, if we increase the number of packets of energy, or the number of atoms, the number of possible energy arrangements blows up. For a solid with 30 packets of energy distributed among 30 atoms (think 30 sheep on 30 plots of land), there are 59 million billion different ways to arrange the energy. And that’s only 30 atoms. The solids you encounter every day have something like 10^24 (a million billion billion) atoms in them, and a similar number of energy packets. The number of ways to arrange the energy now becomes mind-bogglingly large.
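If you want to check these numbers for yourself, the same 'stars and bars' count from the sketch above does the job; here it is as a reusable function:

```python
from math import comb

def arrangements(packets, atoms):
    """Number of ways to distribute identical energy packets among atoms."""
    return comb(packets + atoms - 1, packets)

print(arrangements(3, 3))    # 10, our three sheep on three plots of land
print(arrangements(30, 30))  # about 5.9e16, i.e. 59 million billion
```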

Entropy Is All About Arrangements

At this point, you might be wondering what this has to do with entropy. Well, entropy is just a fancy word for ‘number of possible arrangements’. Entropy is a count of how many ways you can rearrange the ‘insides’ of a thing (its microscopic internals), while keeping its ‘outward’ (macroscopic) state unchanged. (Technically it’s the log of the number of these arrangements, but that’s just a mathematical convenience and doesn’t affect our discussion.)

So, for example, if you gave me a balloon, I could measure certain things about the gas inside it – its pressure, volume, temperature, and so on. These numbers record the macroscopic state of the gas. Given this handful of numbers, there are umpteen ways in which the gas molecules might be arranged inside the balloon. They could have different positions, and whiz about in different directions, with different speeds. So there’s a massive number of internal, microscopic arrangements (in this case, the positions and velocities of the gas molecules) that all result in the same external state (pressure, volume, and temperature of the gas). Entropy is a count of all these arrangements.

molecules inside a balloon
The entropy of the air inside a balloon counts all the ways that the air molecules can be arranged while maintaining the same overall temperature, pressure, and volume.

Going back to our solid, we saw that as we made it bigger, by adding more atoms, or hotter, by adding more packets of energy, the number of possible energy arrangements blew up. In other words, the entropy increased.
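To make that concrete, here's a small sketch that treats entropy as the log of the arrangement count. I'm using Boltzmann's standard convention, S = k·ln W; the constant just sets the physical units and is my addition, not something this discussion depends on:

```python
from math import comb, log

K_B = 1.380649e-23  # Boltzmann's constant, in joules per kelvin

def entropy(packets, atoms):
    """Entropy as k_B times the natural log of the number of energy arrangements."""
    W = comb(packets + atoms - 1, packets)
    return K_B * log(W)

print(entropy(3, 3))    # ~3.2e-23 J/K
print(entropy(30, 30))  # ~5.3e-22 J/K: more atoms and more energy means more entropy
```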

Notice in the case of the balloon, we were talking about arrangements of gas molecules, while in the case of the solid, we’re talking about arrangements of energy packets (and in the farm, we’re arranging sheep). Entropy is all about arrangements of things, but the actual things being arranged can be different in each case.

Putting The Pieces Together

We can now begin to answer the question we set out with. Why does entropy increase?

To see why, we’re going to take two solids and stick them together. Let’s say each solid has 3 atoms. We’ll start one of them hot, with 6 packets of energy, and the other cold, with no energy. The solids can exchange energy freely between themselves.

Two Einstein Solids Exchanging Energy

In sheep land, we’ve removed the fence between two neighboring farms, each with 3 plots of land, and 6 sheep are roaming freely between them.

What will happen? Click on the farm below to see for yourself.

On the left hand side of the sheep simulation above, you can watch the sheep shuffling between the farms at random. There doesn’t seem to be any pattern to their motion.

On the right hand side, you'll find a graph (a histogram) that keeps track of every 'sheep state' you encounter. (The state here refers to how many sheep are in the top farm, and how many on the bottom.) And if you watch it go for a while, you'll notice something interesting.

It turns out you're most likely to find the sheep more or less evenly split between the two farms (the states labelled 2, 3, or 4 on the graph). Meanwhile, states where the sheep are all in one farm (labelled 0 or 6 on the graph) aren't as likely. (If you don't see this yet, press one of the blue buttons to speed up the simulation.)

So although the sheep are shuffled at random, over time, a pattern emerges. Some states are more likely than others.
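If you're reading this without the interactive, here's a rough stand-in. I'm assuming a simple shuffling rule (at each step, one sheep wanders from a randomly chosen plot to another randomly chosen plot), which may not be exactly what the interactive does, but it produces the same long-run pattern:

```python
import random
from collections import Counter

PLOTS, SHEEP = 6, 6    # plots 0-2 are the top farm, plots 3-5 are the bottom farm
occupancy = [0] * PLOTS
occupancy[0] = SHEEP   # start with all 6 sheep crowded into the top farm

STEPS = 200_000
tally = Counter()
for _ in range(STEPS):
    donor, acceptor = random.sample(range(PLOTS), 2)  # pick two different plots
    if occupancy[donor] > 0:                          # if the donor plot has a sheep,
        occupancy[donor] -= 1                         # one sheep wanders over
        occupancy[acceptor] += 1
    tally[sum(occupancy[:3])] += 1                    # record the 'sheep state': how many are in the top farm

for state in range(SHEEP + 1):
    print(state, tally[state] / STEPS)  # middle states (2, 3, 4) come up far more often than 0 or 6
```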

Why Sheep Spread Out

To see why, let’s count arrangements. Here are all the ways to arrange the sheep between the two farms.

If you add up those blue bars, you'll find they come to 462. So there are 462 ways to arrange the 6 sheep in this situation.

Now, let’s assume the sheep are equally likely to be in any of these 462 arrangements. (Since they move randomly, there's no reason to prefer one arrangement over another.) In that case, we're most likely to find the sheep evenly distributed between the two farms, because this state has the most arrangements (it's the biggest bar in the graph).

In physics-speak, the sheep are most likely to be in the highest entropy state.

Meanwhile, there are only 28 possible arrangements for the state we started off in, with all 6 sheep on the top farm (the bar to the right of the graph). So we're less likely to find the sheep in this lower entropy state.
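These counts are easy to reproduce with the same arrangement-counting formula from earlier; here's a quick sketch:

```python
from math import comb

SHEEP, PLOTS_PER_FARM = 6, 3

def ways(sheep, plots):
    return comb(sheep + plots - 1, sheep)

total = 0
for in_top in range(SHEEP + 1):
    count = ways(in_top, PLOTS_PER_FARM) * ways(SHEEP - in_top, PLOTS_PER_FARM)
    total += count
    print(in_top, count)  # 28, 63, 90, 100, 90, 63, 28

print(total)              # 462 arrangements in all
```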

It’s important to keep in mind that there’s no special guiding force driving these sheep to spread out and increase entropy. It’s just that there are fewer ways to keep the sheep concentrated, and more ways to spread them out.

Comparing a low and high entropy state
There are fewer arrangements where the sheep (energy) are concentrated, and more arrangements where the sheep (energy) are spread out.

Let's translate this back to the physics world. We started off with energy unequally distributed between two solids. Then we put the solids together, and let them exchange energy freely. We found that, over time, the most likely outcome is for both solids to share the energy evenly. The hot object cools down, and the cool object warms up. As this happens, their entropy increases.

Just like for the sheep, there’s no new law of physics that ‘tells’ the energy to spread out, or the entropy to go up. There are simply more ways to spread energy out than to keep it contained, so that's what we should expect to see happen. Higher entropy states are more probable than lower entropy ones.

We’re beginning to see how that ‘one-way street’ emerges.

But, in this tiny system, odd things can happen. Here’s that graph again, showing all the different states.

There’s something curious here. Although the equal energy state is the most likely, it’s still very possible to find all the energy in one solid (the left or right extremes of the graph). In fact, there's about a 1 in 8 chance of this happening. So the entropy in this system can fluctuate, sometimes going up, other times going down.

So when we look at really tiny solids, energy doesn’t always flow from a hot object to a cold one. It can go the other way sometimes. And entropy doesn’t always increase. This isn't just a theoretical point: entropy decreases have actually been observed in experiments on microscopic systems.

Why Big Is Different

So what’s different about big things? To find out, turn up the number of energy packets, and then the number of atoms, and watch what happens to our arrangements graph.


You should see that as you make things bigger, the graph of arrangements becomes more sharply peaked, centered around the most likely (i.e. the highest entropy) state.

The left and right ends of the graph represent states where the energy is mostly contained in one solid, and these become very unlikely. Meanwhile, the middle of the graph represents states where the energy is evenly distributed between both solids, and these become increasingly likely.

Remember that when we stuck two tiny solids together (with 3 atoms each and 6 packets of energy), there was about a 1 in 8 chance of finding all the energy concentrated in one solid. Those aren't terrible odds... you probably wouldn't bet a lot of money against this outcome.

But, when we scale up these solids to having 50 atoms each, and sharing 50 units of energy, the odds of finding all the energy in one solid are about 1 in 133 billion. Now that's starting to look like a much safer bet!
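Here's that odds calculation as a short sketch, comparing the tiny system to the 50-atom one with the same counting formula as before:

```python
from math import comb

def ways(packets, atoms):
    return comb(packets + atoms - 1, packets)

def one_in_how_many(packets, atoms_per_solid):
    """N such that the chance of finding all the energy in one solid is about 1 in N."""
    total = ways(packets, 2 * atoms_per_solid)    # every arrangement of the combined system
    extreme = 2 * ways(packets, atoms_per_solid)  # all packets in the left solid, or all in the right
    return total / extreme

print(one_in_how_many(6, 3))    # ~8.3, the roughly 1-in-8 chance from before
print(one_in_how_many(50, 50))  # ~1.3e11, about 1 in 133 billion
```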

graphs of entropy as number of particles increase
As you add more pieces to your system, its entropy graph becomes steeper and steeper. So you're increasingly likely to find it at a state near the peak.

And that’s just 50 atoms. When we get to an object as big as an ice cube in a glass of water, with something like 10^25 molecules, this entropy graph becomes incredibly sharply peaked, and you're guaranteed to be right near the peak. The odds of seeing entropy decrease are effectively zero, not because any physical law compels it to be so, but because of sheer statistics — there are overwhelmingly more ways for the energy to be spread out than there are ways for the energy to be contained.

We’ve just discovered why “entropy always increases”, a statement known to science as the Second Law of Thermodynamics.

The Solution to Our Mystery

We’ve covered a lot of ground here, so it’s worth pausing to recap. We started by asking why so many things in our lives happen in one direction, but never in reverse. Ice melts, but a glass of water on a warm day never turns into a block of ice. Eggs crack, but never uncrack; wine glasses shatter, but never unshatter.

The mystery is, at the level of atoms and molecules, each of these processes is reversible. But when we get to bigger collections of atoms, a kind of one-way street emerges — a macroscopic irreversibility arises from microscopically reversible parts. Things spontaneously happen in the direction of increasing entropy, never in the opposite direction.

Now we know why. There’s no microscopic law telling any particle which direction to go, just like there’s no shepherd telling the sheep where to go in our imaginary farm. It’s just that there are more ways to spread energy around, and fewer ways to keep energy confined. Increasing entropy is highly likely, decreasing it is basically impossible. It’s just stuff obeying the laws of chance.

Entropy and Our Lives

The workings of our entire planet, including all the processes of life, ride on a wave of increasing entropy.

All life on Earth relies on the energy we get from the sun. Sunlight is made up of concentrated, low-entropy parcels of energy. Our planet chews up this useful energy, uses it for its inner workings, and spits out the remainder as heat — a more spread out (and therefore higher-entropy) form of energy.

The well from which we draw this low-entropy energy is the Sun. Like all stars, our Sun radiates away its concentrated energy, increasing its entropy, and slowly coming into equilibrium with the cold vacuum of space. One day this well will run dry, when the Sun runs cold.

Where Are We Going?

Maybe you see where this is going. Extrapolating to the distant future, all stars will extinguish, all galaxies will radiate away their warmth, and our universe will reach a state of thermal equilibrium, no part of it any warmer or colder than any other part. Our universe will have hit peak entropy.

And equilibrium is a boring place. There can be no life, no machines, and no meaningful change in equilibrium. This doomsday scenario is called the heat death of the universe, and it’s how cosmologists currently expect our universe to end. But don’t lose sleep over it, it won’t happen for another googol (that’s 10^100) years, and we’ll all be dead long before (I really did start this sentence trying to help).

How Did We Get Here?

There’s one last mystery lurking. We learned that the entropy of our universe keeps rising, and this is because higher entropy states are more probable than lower entropy ones. Based on this, we can extrapolate that our universe must have started off in a very improbable state of very low entropy.

Nobody really knows why things began this way (although some folks have their guesses). But thank goodness for it, because everything interesting that has ever happened (and ever will happen) was a consequence of this unlikely beginning.

The story of our universe is that of climbing Mt. Entropy, beginning in the fiery, low-entropy depths of the Big Bang, and making its way to the summit, a cold and barren state of thermal equilibrium. Both the base and the peak of Mt. Entropy are utterly inhospitable to life. But somewhere along those rising slopes, the conditions were just right for complex and wondrous things to emerge, things like trees and jellyfish and heartache and cheesecakes and yes, even melting cubes of ice.