Category: entropy

No, Thermodynamics Does Not Explain Our Perceived Arrow Of Time

“As far as we can tell, the second law of thermodynamics is true: entropy never decreases for any closed system in the Universe, including for the entirety of the observable Universe itself. It’s also true that time always runs in one direction only, forward, for all observers. What many don’t appreciate is that these two types of arrows — the thermodynamic arrow of entropy and the perceptive arrow of time — are not interchangeable.

During inflation, where the entropy remains low and constant, time still runs forward. When the last star has burned out and the last black hole has decayed and the Universe is dominated by dark energy, time will still run forward. And everywhere in between, regardless of what’s happening in the Universe or with its entropy, time still runs forward at exactly that same, universal rate for all observers.

If you want to know why yesterday is in the immutable past, tomorrow will arrive in a day, and the present is what you’re experiencing right now, you’re in good company. But thermodynamics, interesting though it may be, won’t give you the answer. As of 2019, it’s still an unsolved mystery.”

No matter who you are, where you are, or what you’re doing, you’ll always perceive time running forward, from your frame of reference, at exactly the same rate: one second-per-second. The fact that this is true has led many to speculate as to what the cause of time’s arrow might be, and many, having noticed that entropy never decreases in our Universe, place the blame squarely on thermodynamics as the root of our arrow of time.

But that’s almost certainly not the case, and we can demonstrate that fact in a number of ways, including by decreasing entropy in a region and noting that time still moves forwards. The perceived arrow of time is still a mystery.

Ask Ethan: How Dense Is A Black Hole?

“I have read that stellar-mass black holes are enormously dense, if you consider the volume of the black hole to be that space which is delineated by the event horizon, but that super-massive black holes are actually much less dense than even our own oceans. I understand that a black hole represents the greatest amount of entropy that can be squeezed into [any] region of space expressed… [so what happens to the density and entropy of two black holes when they merge]?”

The entropy of a black hole, if you applied only the laws of General Relativity (and nothing else), would simply turn out to be zero. By giving it a quantum description, however, we can get a meaningful formula for its entropy: the Bekenstein-Hawking equation. When two black holes merge, the resulting entropy is greater than even the two pre-existing entropies combined.
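As a rough sketch of why that happens: the Bekenstein-Hawking entropy scales with horizon area, and hence with mass squared, so merging two black holes (here two hypothetical 10-solar-mass ones, ignoring the mass radiated away in gravitational waves) always yields more entropy than the sum of the parts:

```python
# Sketch: Bekenstein-Hawking entropy S = k_B * A * c^3 / (4 * G * hbar),
# with horizon area A = 4*pi*(2GM/c^2)^2, so S scales as M^2.
import math

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8          # speed of light, m/s
hbar = 1.055e-34     # reduced Planck constant, J s
k_B = 1.381e-23      # Boltzmann constant, J/K
M_sun = 1.989e30     # solar mass, kg

def bh_entropy(mass_kg):
    """Bekenstein-Hawking entropy (J/K) of a Schwarzschild black hole."""
    r_s = 2 * G * mass_kg / c**2      # Schwarzschild radius
    area = 4 * math.pi * r_s**2       # event horizon area
    return k_B * area * c**3 / (4 * G * hbar)

s1 = bh_entropy(10 * M_sun)
s2 = bh_entropy(10 * M_sun)
s_merged = bh_entropy(20 * M_sun)    # idealized: no mass lost to waves

# Because S ~ M^2, the merged hole has (2M)^2 = 4M^2 worth of entropy,
# versus M^2 + M^2 = 2M^2 for the separate pair:
print(s_merged / (s1 + s2))  # -> 2.0
```

In reality a few percent of the mass is carried off by gravitational waves, but the quadratic scaling dominates: the merger still ends up with far more entropy than it started with.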

If you think that’s weird, you might suspect that your instinct for density would also be incorrect. Sure, density is just mass divided by volume, but which volume do we use for a black hole? The volume of the event horizon? The volume of a (volume-less) singularity? Something else?

The question of how dense a black hole is has a lot of potential pitfalls, but if we follow the physics closely, we can answer it. Here’s how it’s done.
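One way to make the density question concrete: if we (somewhat arbitrarily) take the volume to be that enclosed by the event horizon, the resulting "density" falls off as 1/M², since the Schwarzschild radius grows linearly with mass. A minimal sketch with illustrative masses — a 3-solar-mass stellar black hole versus a supermassive one of a few billion solar masses:

```python
# Sketch: "density" of a black hole if we use the volume enclosed by
# the event horizon: rho = M / ((4/3) * pi * r_s^3), r_s = 2GM/c^2.
# Because r_s ~ M, this density scales as 1/M^2.
import math

G = 6.674e-11        # m^3 kg^-1 s^-2
c = 2.998e8          # m/s
M_sun = 1.989e30     # kg

def horizon_density(mass_kg):
    """Mass divided by the Euclidean volume inside the event horizon."""
    r_s = 2 * G * mass_kg / c**2
    volume = (4 / 3) * math.pi * r_s**3
    return mass_kg / volume          # kg/m^3

print(horizon_density(3 * M_sun))    # stellar-mass: ~2e18 kg/m^3
print(horizon_density(4e9 * M_sun))  # billions of M_sun: ~1 kg/m^3,
                                     # less dense than water (1000 kg/m^3)
```

The illustrative masses are assumptions for the sketch, but the trend is generic: stellar-mass black holes come out denser than an atomic nucleus, while the largest supermassive ones come out less dense than air.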

Ask Ethan: Why Is The Black Hole Information Loss Paradox A Problem?

“Why do physicists all seem to agree that the information loss paradox is a real problem? It seems to depend on determinism, which seems incompatible with QM.”

There are a few puzzles in the Universe that we don’t yet know the answer to, and they almost certainly are the harbingers of the next great advances. Solving the mysteries of why there’s more matter than antimatter, what dark matter and dark energy are, or why the fundamental particles have the masses they do will surely bring physics to the next level. One much less obvious puzzle, though, is the black hole information loss paradox. It’s true that we don’t yet have a theory of quantum gravity, but we don’t need one to see why this is a problem. When matter falls into a black hole, something ought to happen to keep it from simply losing its information; entropy must not go down. Similarly, when black holes evaporate via Hawking radiation, that information can’t just disappear, either.

So where does it go? Are we poised to violate the second law of thermodynamics? Come find out what the black hole information paradox is all about, and why it compels us to find a solution!

Ask Ethan: How Will Our Universe End?

“When will our universe reach the point of maximum entropy? And what other possibilities exist for our universe in the far future?”

It’s nearly 14 billion years since the hot Big Bang gave rise to our observable Universe, which now contains some 2 trillion galaxies spread across a sphere more than 46 billion light-years in radius. But despite how plentiful the matter in our Universe is, it won’t last forever. The stars will all burn out, and the gas from which new stars form will eventually run out as well. Dark energy will drive the unbound galaxies away, while gravitation will pull the bound ones into a single structure. Over time, ejections and mergers will occur, littering the Universe with isolated masses and leaving enormous black holes embedded in dark matter halos as the last remnants of galaxies. After enough time passes, even those final black holes will decay, leaving only low-energy, ultra-high-entropy radiation behind.

It will take a long time, but this is the ultimate fate of everything in the far future of the Universe!

We Still Don’t Understand Why Time Only Flows Forward

“It’s true that entropy does explain the arrow of time for a number of phenomena, including why coffee and milk mix but don’t unmix, why ice melts in a warm drink but never spontaneously forms (alongside a now-warmer liquid) out of a cool one, and why a cooked scrambled egg never un-cooks back into a raw, separated albumen and yolk. In all of these cases, an initially lower-entropy state (with more available, capable-of-doing-work energy) has moved into a higher-entropy (and lower available energy) state as time has moved forwards. There are plenty of examples of this in nature, including that of a room filled with molecules: one side full of cold, slow-moving molecules and the other full of hot, fast-moving ones. Simply give it time, and the room will be fully mixed with intermediate-energy particles, representing a large increase in entropy and an irreversible reaction.”
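The hot-and-cold room example can be made quantitative. Here is a sketch under simple, assumed conditions (a monatomic ideal gas, equal particle numbers on each side, each half held at fixed volume so only heat flows): each half changes entropy by (3/2)Nk ln(T_final/T_initial), and the total change always comes out positive.

```python
# Sketch: entropy change when equal halves of a monatomic ideal gas at
# two different temperatures exchange heat at fixed volume.
# dS for each half = integral of C_V dT / T = (3/2) N k ln(T_f / T_i).
import math

k_B = 1.381e-23   # Boltzmann constant, J/K
N = 1e22          # atoms per half (illustrative number)

T_hot, T_cold = 400.0, 200.0
T_final = 0.5 * (T_hot + T_cold)   # energy conservation for equal halves

dS = 1.5 * N * k_B * (math.log(T_final / T_hot)
                      + math.log(T_final / T_cold))
# Total: (3/2) N k ln(T_f^2 / (T_hot * T_cold)), positive by AM-GM,
# since the arithmetic mean T_f always exceeds the geometric mean.
print(dS > 0)   # -> True
```

The specific temperatures and particle count are placeholders; the AM-GM inequality guarantees the sign regardless, which is exactly why the mixing is irreversible.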

Why does time flow forwards and not backwards, in 100% of cases, if the laws of physics are completely time-symmetric? From Newton’s laws to Einstein’s relativity, from Maxwell’s equations to the Schrödinger equation, the laws of physics don’t have a preferred direction. Except, that is, for one: the second law of thermodynamics. Any closed system that we look at sees its entropy only increase, never decrease.

Could this thermodynamic arrow of time be responsible for what we perceive as the forward motion of time? Interestingly enough, there’s an experiment we can perform: isolate a system and perform enough external work on it to force the entropy inside to *decrease*, an “unnatural” progression of entropy. What happens to time, then? Does it still run forward?

Find out the answer, and learn whether thermodynamics has anything to do with the passage of time or not!

Are Space, Time, And Gravity All Just Illusions?

“Sound waves emerge from molecular interactions; atoms emerge from quarks, gluons and electrons and the strong and electromagnetic interactions; planetary systems emerge from gravitation in General Relativity. But in the idea of entropic gravity — as well as some other scenarios (like qubits) — gravitation or even space and time themselves might emerge from other entities in a similar fashion. There are well-known, close relationships between the equations that govern thermodynamics and the ones that govern gravitation. It’s known that the laws of thermodynamics emerge from the more fundamental field of statistical mechanics, but is there something out there more fundamental from which gravity emerges? That’s the idea of entropic gravity.”

There are many attempts out there to reconcile the quantum field theories that describe the electromagnetic and nuclear forces with general relativity, which describes the gravitational force. Certain questions, about gravitational properties in strong fields and on small scales, will never be answered otherwise. To make that happen, we’d need a quantum theory of gravity. While string theory is the most popular idea, there are others, such as asymptotic safety, loop quantum gravity, and causal dynamical triangulations. But perhaps the most radical idea came from Erik Verlinde in 2009: that gravity itself is not fundamental, but instead arises from a truly fundamental entity, the entropy of quantum bits of information. Verlinde’s work has been as intriguing as it is controversial, and I myself have spotted a number of problem areas with his results so far, but it’s certainly an idea worth exploring further. At 7 PM ET / 4 PM PT tonight, he delivers the Perimeter Institute’s inaugural public lecture of their 2017-2018 series.

What will he say? And what will I have to say when I weigh in on it? Find out then on our live-blog of Verlinde’s talk tonight! 

Normally one speaks of living things as beings that consume energy to survive and proliferate. This is of course not correct; energy is conserved, and cannot be consumed. Living beings intercept entropy flows; they use low-entropy sources of energy and emit high entropy forms of the same energy (body heat).

Can we burn information as fuel? Consider a really frugal digital memory tape, with one atom used to store each bit:

The position of a single ideal gas atom denotes a bit. If it is in the top half of a partitioned box, the bit is one, otherwise it is zero. The side walls of the box are pistons, which can be used to set, reset, or extract energy from the stored bits. The numbers above the boxes are not a part of the tape, they just denote what bit is stored in a given position.

The tape is a series of boxes, with each box containing one ideal gas atom. The box is split into two equal pieces by a removable central partition. If the atom is in the top half of the box, the tape reads one; if it is in the bottom half, the tape reads zero. The side walls are frictionless pistons that may be used to push the atom around. If we know the atom’s position in the n-th box, we can push the piston on the empty side in, remove the partition, and gradually retract the piston to its original position, destroying our information about where the atom is but extracting useful work.

Extracting energy from a known bit is a three-step process: compress the empty half of the box, remove the partition, and retract the piston, extracting PdV work from the ideal gas atom. (One may then restore the partition to return to an equivalent, but more ignorant, state.) In the process, one loses one bit of information (which side of the partition is occupied).
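The PdV work in that retraction step can be computed directly: for a single ideal-gas atom expanding isothermally from V/2 to V, the integral of P dV = (kT/V) dV gives kT ln 2 per known bit — the same quantity as Landauer’s bound on the cost of erasure. A sketch at an assumed room temperature:

```python
# Sketch: work extracted by one atom expanding isothermally from V/2
# to V. With PV = kT for a single atom, W = kT * ln(V / (V/2)) = kT ln 2.
import math

k_B = 1.381e-23   # Boltzmann constant, J/K
T = 300.0         # assumed room temperature, K

W_per_bit = k_B * T * math.log(2)
print(W_per_bit)  # ~2.9e-21 J of work per known bit, at 300 K
```

Tiny per bit, but nonzero: this is the exchange rate at which the engine in the next paragraph trades away information for useful work.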

A memory tape can therefore be used to power an engine. If the engine knows or can guess the sequence written on the tape, it can extract useful work in exchange for losing that information.

Reference: 

Statistical Mechanics: Entropy, Order Parameters, and Complexity, J. Sethna

@error-patience-victory replied to post

Would you recommend number 6 to first-year physics students?

I would; I think that if you’re familiar with the concept of entropy in thermodynamics, you’ll be able to understand the general idea. I guess the part that might be a bit obscure is quantum information (I at least did quantum mechanics in year 2; I don’t know about other universities). I’ve discussed entropy and information with someone on here before, if you want to have a look — there are a few references I posted that I think are pretty cool.