I have written about [entropy before](https://invertedpassion.com/notes-on-entropy/). This note expands the notion of entropy to reflect what I've learned about it since. In that note, I mentioned that there are two types of entropy:

- Information entropy
- Thermodynamic entropy

When it comes to physical systems, thermodynamic entropy is what applies. Colloquially, it is defined as the amount of disorder in a system. More precisely, **it is a measure of the number of (energy) microstates of a system corresponding to a given macrostate (such as temperature).** Here, microstates are the different ways in which a system's components can distribute a given amount of energy among themselves. Macrostates are the different configurations an observer can actually measure. So, if temperature only measures the average jiggling of the molecules of a body, then the higher the temperature, the more ways there are for the molecules to move at that temperature (ways you can't tell apart, because all you read is the macrostate, i.e. the temperature).

This helps answer why entropy always increases in the universe ([the second law of thermodynamics](https://en.wikipedia.org/wiki/Second_law_of_thermodynamics)). Our key assumption about the physics of a system is that it randomly wanders across all of its energy microstates, so we're most likely to find it in the macrostate corresponding to the largest number of indistinguishable microstates. The combined number of microstates of two or more systems is much larger than the simple sum of the microstates of the individual systems (to see that, imagine that for each microstate of system 1, system 2 can be in any of its microstates; so you multiply the microstates of the individual systems to get the microstates of the combined system).

> You can easily see then that **the famous second law of thermodynamics is simply a statistical law.** It emerges from the pure logic of the situation.
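To make this concrete, here's a minimal sketch of the counting involved. It's a toy model of my own choosing (two-state "spin" particles, with Boltzmann's constant set to 1), not anything from the physics literature beyond the standard definitions:

```python
import math
from collections import Counter
from itertools import product

def microstates(n):
    """All 2^n configurations of n two-state ('spin up/down') particles."""
    return list(product([0, 1], repeat=n))

def entropy(omega):
    """Boltzmann entropy S = ln(omega), with k_B set to 1."""
    return math.log(omega)

omega1 = len(microstates(4))   # 16 microstates for system 1
omega2 = len(microstates(3))   # 8 microstates for system 2

# For each microstate of system 1, system 2 can be in any of its own
# microstates, so the counts multiply when the systems are combined...
omega_combined = omega1 * omega2   # 16 * 8 = 128

# ...and since ln(a*b) = ln(a) + ln(b), entropies simply add:
assert math.isclose(entropy(omega_combined),
                    entropy(omega1) + entropy(omega2))

# The macrostate (all the observer can measure) is the total up-spin count.
# The most likely macrostate is the one realized by the most microstates:
counts = Counter(sum(state) for state in microstates(4))
assert max(counts, key=counts.get) == 2   # "2 up" wins, with C(4,2) = 6 ways
```

Multiplicative microstate counts are exactly why entropy is defined with a logarithm: it turns multiplication into addition, so the entropy of two systems together is the sum of their entropies.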
So, as entropy is a function of the number of microstates, when you put two systems in contact, the total number of microstates available to the combined system increases, and hence entropy increases.

### Entropy of the universe was low during the Big Bang

You may have seen a diagram like the following, which shows high entropy as a state where the gas is completely spread out, while low entropy is a state where the gas is somehow confined to a tiny corner.

![[increasing-entropy.jpg]]

(via [here](https://courses.lumenlearning.com/suny-chem-atoms-first/chapter/entropy/))

This is all well and good, but I've always had one confusion. As mentioned in [[Demystifying the arrow of time]], the only reason we feel time flowing from the past to the future (and not the other way around) is that the entropy at the Big Bang was very, very low. This means the universe in the past occupied an extremely special/unlikely state (special, by the laws of probability), and with time, just as we'd expect statistically, it evolved into more and more likely states (hence entropy increases as time flows).

But as mentioned in [[Mysteries of the Big Bang and Cosmic Inflation]], the early universe had an extremely uniform energy density, which we observe today as the extremely uniform 2.7 Kelvin temperature of the [cosmic microwave background](https://en.wikipedia.org/wiki/Cosmic_microwave_background). So the universe at the Big Bang was really like the diffuse gas that's spread uniformly everywhere. But fast forward to today, and we observe clumped matter/energy in the form of stars, galaxies and black holes. What's happening then? If energy was spread uniformly in the past and is clumped now, aren't we saying that the entropy of the universe decreases with time (rather than increasing, which is what we expect, since increasing entropy gives us the arrow of time)?

### Enter Gravity

One aspect that's generally missed in most thermodynamically motivated definitions of entropy is gravity.
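Before bringing gravity into it, the counting argument behind the spread-out-gas picture can be sketched in a few lines. This is a toy model with numbers I picked for illustration (100 distinguishable particles, a box divided into 10 equal cells):

```python
from math import lgamma

def log_microstates(occupancy):
    """ln of the multinomial coefficient N! / (n1! * n2! * ... * nk!):
    the number of ways distinguishable particles can realize a given
    cell-occupancy pattern (the macrostate)."""
    n = sum(occupancy)
    return lgamma(n + 1) - sum(lgamma(k + 1) for k in occupancy)

# Two macrostates of 100 particles in a box divided into 10 equal cells:
uniform = [10] * 10         # spread evenly across the box
corner  = [100] + [0] * 9   # all crammed into one corner cell

ways_uniform = log_microstates(uniform)  # ~212.7, i.e. about e^212.7 ways
ways_corner  = log_microstates(corner)   # 0.0, i.e. exactly one way

# A system wandering randomly over its microstates is therefore
# overwhelmingly more likely to be found spread out than in a corner.
assert ways_corner == 0.0 and ways_uniform > 200
```

With gravity switched off, "uniform" beats "corner" by a factor of roughly e^212 even for just 100 particles, which is the whole content of the diagram above.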
A box of gas has negligible gravity, and most of its energy is the kinetic and vibrational energy of the molecules. If you neglect gravity, then it makes sense to say that the uniformly distributed gas has higher entropy (because there are more ways for the same molecules to be uniformly distributed than to collect in a tiny corner).

However, our early universe was gravity dominated. The energy densities were so immense that you cannot ignore gravity. And when you see what gravity does to matter/energy (which I'm using interchangeably, see [[Demystifying mass and energy]]), **you'd see that uniformly distributed energy has low entropy: its constituents start out with zero velocity/momentum, but gravitational attraction makes the particles gain velocity/momentum, and during collisions a portion of that velocity/momentum is transferred to emitted photons**. [In other words](https://math.ucr.edu/home/baez/entropy.html):

> First of all, you have to remember that a gas cloud heats up as it collapses gravitationally! The clumping means you know more and more about the positions of the atoms in the cloud. But the heating up means you know less and less about their velocities. So there are two competing effects. It's not obvious which one wins!

So, even though the entropy of the clumped energy/matter decreases because of confined positions, the net entropy of the universe (which includes radiated photons with all kinds of velocities) increases.

This video explains this really well.

<iframe width="560" height="315" src="https://www.youtube.com/embed/gJivcAAbE0A" title="YouTube video player" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture" allowfullscreen></iframe>

If uniform energy under gravity is low entropy, and entropy increases when that energy clumps into stars, can you think of what would be the highest entropy state under gravity? Think. I'll give you a moment.

Well, it's a black hole.
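To get a sense of just how much entropy a black hole carries, here's a sketch using the standard Bekenstein-Hawking formula, S = k_B * A * c^3 / (4 * hbar * G), where A is the horizon area of a Schwarzschild black hole (SI constants rounded to four digits):

```python
import math

# Physical constants (SI units)
G     = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2
C     = 2.998e8     # speed of light, m/s
HBAR  = 1.055e-34   # reduced Planck constant, J*s
M_SUN = 1.989e30    # solar mass, kg

def bh_entropy_over_kb(mass_kg):
    """Bekenstein-Hawking entropy S/k_B = A * c^3 / (4 * hbar * G)
    of a Schwarzschild black hole, where A is its horizon area."""
    r_s = 2 * G * mass_kg / C**2      # Schwarzschild radius
    area = 4 * math.pi * r_s**2       # horizon area
    return area * C**3 / (4 * HBAR * G)

s_sun = bh_entropy_over_kb(M_SUN)
print(f"S/k_B for a solar-mass black hole ~ {s_sun:.1e}")  # roughly 1e77

# Entropy scales as mass squared, so clumping matter into ever bigger
# black holes keeps raising entropy: doubling the mass quadruples it.
assert math.isclose(bh_entropy_over_kb(2 * M_SUN) / s_sun, 4.0)
```

For comparison, the thermal entropy of the Sun as an ordinary star is around 10^58 k_B, so collapsing the same mass into a black hole multiplies its entropy by an enormous factor.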
[Black holes are thought to hold the highest amount of entropy per unit volume](https://en.wikipedia.org/wiki/Black_hole_thermodynamics) of any object in our universe, and just like normal stars they radiate away energy, in the form of [Hawking radiation](https://en.wikipedia.org/wiki/Hawking_radiation) (and that radiation is high entropy). (The fact that a black hole cannot be observed from the outside and yet has maximal entropy introduces the [black hole information paradox](https://en.wikipedia.org/wiki/Black_hole_information_paradox).)

### Thoughts on the Big Crunch

The second law says that entropy keeps on increasing with time. Even though we now believe that [[Demystifying the first three minutes of the universe#Future of the universe|our universe will keep on expanding forever]], we can imagine an alternative fate where the universe reverses its expansion at some point and starts shrinking again. This scenario is famously known as the Big Crunch.

Will the entropy of the universe start decreasing as it shrinks? No, it will still increase, because as the universe shrinks, more galaxies will collide with one another, emit high-entropy radiation and eventually start forming black holes. And as black holes have high entropy, the entropy of the universe will keep on increasing.

We can now clearly see the difference between the Big Bang and the Big Crunch. Even though both of these universes will be of similar (tiny) sizes compared to today's universe, the former contained extremely uniform energy, while the latter will have high energy in the form of radiation with randomly distributed energy/momentum. So our universe did start in extremely special conditions, and why that is so is [still a big mystery](https://arxiv.org/abs/1406.3057).

### Entropy is relative

Entropy as a concept is relative in two ways:

1. **Entropy of a system is always relative to who is doing the observation**.
If the observer can distinguish more microstates of a system, the entropy automatically reduces. [Laplace's demon](https://en.wikipedia.org/wiki/Laplace%27s_demon) doesn't see any entropy anywhere.

2. **The absolute amount of entropy is not as useful as the relative amount, comparing two objects or the same object at two different instances in time**.

We rarely have absolute certainty (and if we had it, the concept of entropy wouldn't apply). Since entropy is a measure of uncertainty, it makes more sense to say in which situations one is more or less uncertain than to say how uncertain one is in absolute terms (because one often needs a benchmark, such as one's previous uncertainty, to compare one's current uncertainty against).

This line of thought has led some scientists to suggest that [even the entropy-driven arrow of time is meaningfully defined only relative to us (humans)](https://arxiv.org/abs/1505.01125). So, it could be the case that the entropy of the Big Bang was low because only under such a relational definition of entropy do beings like us exist. Perhaps under other definitions, entropy at the Big Bang was high, and that would make beings like us impossible.

### Is life special when it comes to entropy?

On the face of it, life seems anomalous. In a universe where everything is going from low entropy to high entropy, well-structured biological organisms seem to be going in reverse. But this is an illusion. The case with life is actually similar to the case of gas collapsing into a star. If we count the net entropy difference in the universe due to a living being, we will always find that the entropy is higher if that being existed than if it never existed. That is, the entropy increase due to the thermal radiation emitted by a body, its excrement and other waste products (like CO2) is much higher than the entropy decrease in the cells and molecules comprising that body.
In that sense, there isn't any arbitrary boundary separating living beings from dead physical systems. All systems in our universe contribute to increasing the entropy of the universe. **When the cosmic gas compresses to form a star, entropy increases. When a child is born, entropy increases.**

Entropy keeps on increasing because it was low at the Big Bang, and since entropy is a measure of our uncertainty, we become less and less certain about the precise microstate of the universe with time, as we simply can't keep track of everything. **So, inevitably, no matter what system we're dealing with, the total entropy of the universe has to increase, because we become less certain of the state of the universe with time.**
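This observer-dependence of entropy can be made concrete with one last sketch. It's a toy model of my own: 16 equally likely microstates, with entropy measured in bits as the uncertainty that remains after an observer reads off a macrostate label:

```python
import math
from collections import Counter

# Toy system: 16 equally likely microstates, the cells of a 4x4 grid.
microstates = [(x, y) for x in range(4) for y in range(4)]

def remaining_entropy(label_fn):
    """Average bits of uncertainty about the microstate left after the
    observer reads the macrostate label assigned by label_fn."""
    groups = Counter(label_fn(s) for s in microstates)
    n = len(microstates)
    # A group of k indistinguishable microstates leaves log2(k) bits of
    # uncertainty, weighted by the probability k/n of landing in it.
    return sum((k / n) * math.log2(k) for k in groups.values())

blind  = remaining_entropy(lambda s: 0)          # sees nothing: 4 bits
coarse = remaining_entropy(lambda s: s[0] // 2)  # sees left/right half: 3 bits
demon  = remaining_entropy(lambda s: s)          # Laplace's demon: 0 bits

assert (blind, coarse, demon) == (4.0, 3.0, 0.0)
```

The finer the observer's measurements, the fewer microstates hide behind each macrostate, and the lower the entropy, with Laplace's demon seeing none at all.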