Category: entropy


Ask Ethan: Why Is The Black Hole Information Loss Paradox A Problem?

“Why do physicists all seem to agree that the information loss paradox is a real problem? It seems to depend on determinism, which seems incompatible with QM.”

There are a few puzzles in the Universe that we don’t yet know the answer to, and they almost certainly are the harbingers of the next great advances. Solving the mysteries of why there’s more matter than antimatter, what dark matter and dark energy are, or why the fundamental particles have the masses they do will surely bring physics to the next level when we figure them out. One much less obvious puzzle, though, is the black hole information loss paradox. It’s true that we don’t yet have a theory of quantum gravity, but we don’t need one to see why this is a problem. When matter falls into a black hole, something ought to happen to keep it from simply losing its information; entropy must not go down. Similarly, when black holes evaporate, a la Hawking radiation, that information can’t just disappear, either.

So where does it go? Are we poised to violate the second law of thermodynamics? Come find out what the black hole information paradox is all about, and why it compels us to find a solution!


Ask Ethan: How Will Our Universe End?

“When will our universe reach the point of maximum entropy? And what other possibilities exist for our universe in the far future?”

It’s nearly 14 billion years since the hot Big Bang gave rise to our observable Universe, which now consists of some 2 trillion galaxies spread out across a sphere over 46 billion light years in radius. But despite how plentiful the matter in our Universe is, it won’t last forever. The stars will all burn out, and the gas that forms new stars will eventually run out. Dark energy will drive the unbound galaxies away, while gravitation will pull the bound ones into a single structure. Over time, ejections and mergers occur, littering the Universe with isolated masses and setting up enormous black holes embedded in dark matter halos as the last remnants of galaxies. After enough time passes, the final black holes decay, leaving only low-energy, ultra-high-entropy radiation behind.

It will take a long time, but this is the ultimate fate of everything in the far future of the Universe!


We Still Don’t Understand Why Time Only Flows Forward

“It’s true that entropy does explain the arrow of time for a number of phenomena, including why coffee and milk mix but don’t unmix, why ice melts into a warm drink but never spontaneously arises along with a warm beverage from a cool drink, and why a cooked scrambled egg never resolves back into an uncooked, separated albumen and yolk. In all of these cases, an initially lower-entropy state (with more available, capable-of-doing-work energy) has moved into a higher-entropy (and lower available energy) state as time has moved forwards. There are plenty of examples of this in nature, including of a room filled with molecules: one side full of cold, slow-moving molecules and the other full of hot, fast-moving ones. Simply give it time, and the room will be fully mixed with intermediate-energy particles, representing a large increase in entropy and an irreversible reaction.”

Why does time flow forwards and not backwards, in 100% of cases, if the laws of physics are completely time-symmetric? From Newton’s laws to Einstein’s relativity, from Maxwell’s equations to the Schrödinger equation, the laws of physics don’t have a preferred direction. Except, that is, for one: the second law of thermodynamics. Any closed system that we look at sees its entropy only increase, never decrease.

Could this thermodynamic arrow of time be responsible for what we perceive as the forward motion of time? Interestingly enough, there’s an experiment we can perform: isolate a system and perform enough external work on it to force the entropy inside to *decrease*, an “unnatural” progression of entropy. What happens to time, then? Does it still run forward?

Find out the answer, and learn whether thermodynamics has anything to do with the passage of time or not!


Are Space, Time, And Gravity All Just Illusions?

“Sound waves emerge from molecular interactions; atoms emerge from quarks, gluons and electrons and the strong and electromagnetic interactions; planetary systems emerge from gravitation in General Relativity. But in the idea of entropic gravity — as well as some other scenarios (like qbits) — gravitation or even space and time themselves might emerge from other entities in a similar fashion. There are well-known, close relationships between the equations that govern thermodynamics and the ones that govern gravitation. It’s known that the laws of thermodynamics emerge from the more fundamental field of statistical mechanics, but is there something out there more fundamental from which gravity emerges? That’s the idea of entropic gravity.”

There are many attempts out there to reconcile the quantum field theories that describe the electromagnetic and nuclear forces with general relativity, which describes the gravitational force. Certain questions, about gravitational properties in strong fields and on small scales, will never be answered otherwise. In order to make that happen, we’d need a quantum theory of gravity. While string theory is the most popular idea, there are others, such as asymptotic safety, loop quantum gravity, and causal dynamical triangulations. But perhaps the most radical idea came from Erik Verlinde in 2009: that gravity itself is not fundamental, but rather arises from a truly fundamental entity, the entropy of quantum bits of information. Verlinde’s work has been both intriguing and controversial, and I myself have spotted a number of problem areas with his results so far, but it’s certainly an idea worth exploring further. At 7 PM ET / 4 PM PT tonight, he delivers the Perimeter Institute’s inaugural public lecture of their 2017-2018 series.

What will he say? And what will I have to say when I weigh in on it? Find out then on our live-blog of Verlinde’s talk tonight! 

Is information something real?

Normally one speaks of living things as beings that consume energy to survive and proliferate. This is of course not correct; energy is conserved, and cannot be consumed. Living beings intercept entropy flows; they use low-entropy sources of energy and emit high entropy forms of the same energy (body heat).

Can we burn information as fuel? Consider a really frugal digital memory tape, with one atom used to store each bit:

The position of a single ideal gas atom denotes a bit. If it is in the top half of a partitioned box, the bit is one, otherwise it is zero. The side walls of the box are pistons, which can be used to set, reset, or extract energy from the stored bits. The numbers above the boxes are not a part of the tape, they just denote what bit is stored in a given position.

The tape is a series of boxes, with each box containing one ideal gas atom. The box is split into two equal pieces by a removable central partition. If the atom is in the top half of the box, the tape reads one; if it is in the bottom half, the tape reads zero. The side walls are frictionless pistons that may be used to push the atom around. If we know the atom’s position in the n-th box, we can move the piston on the empty side in, remove the partition, and gradually retract the piston to its original position, destroying our information about where the atom is but extracting useful work.

Extracting energy from a known bit is a three-step process: compress the empty half of the box, remove the partition, and retract the piston, extracting P dV work from the ideal gas atom. (One may then restore the partition to return to an equivalent, but more ignorant, state.) In the process, one loses one bit of information (which side of the partition is occupied).
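As a rough check of the numbers (a sketch, assuming the single atom behaves as an ideal gas held at temperature T by its surroundings), the work recovered in that final isothermal expansion from V/2 back to V is:

```latex
W = \int_{V/2}^{V} P \,\mathrm{d}V'
  = \int_{V/2}^{V} \frac{k_B T}{V'} \,\mathrm{d}V'
  = k_B T \ln 2
```

At room temperature that comes to roughly 3 × 10⁻²¹ J per bit, the same k_B T ln 2 that appears in Landauer’s bound on the cost of erasing a bit.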

A memory tape can therefore be used to power an engine. If the engine knows or can guess the sequence written on the tape, it can extract useful work in exchange for losing that information.
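To get a feel for the scale, here is a minimal sketch in Python (assuming room temperature, an ideal tape, and perfectly reversible pistons; the function names are illustrative, not from any library):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K


def work_per_known_bit(temperature_kelvin: float) -> float:
    """Work from isothermal expansion of one atom from V/2 to V: k_B * T * ln 2."""
    return K_B * temperature_kelvin * math.log(2)


def work_from_tape(bits: str, temperature_kelvin: float = 300.0) -> float:
    """Total work from a tape whose contents the engine already knows.

    Each known bit is worth k_B * T * ln 2, whether it reads 0 or 1;
    what matters is that the engine knows which side of the partition
    the atom is on before compressing the empty half of the box.
    """
    return len(bits) * work_per_known_bit(temperature_kelvin)


if __name__ == "__main__":
    tape = "1011001110"  # ten known bits
    print(f"Per bit at 300 K: {work_per_known_bit(300.0):.3e} J")
    print(f"From {len(tape)} known bits: {work_from_tape(tape):.3e} J")
```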

Reference: 

J. P. Sethna, Statistical Mechanics: Entropy, Order Parameters, and Complexity


@error-patience-victory replied to post

would you recommend number 6 to first year Physics students?

I would; I think that if you’re familiar with the concept of entropy in thermodynamics, you’ll be able to understand the general idea. I guess the part that might be a bit obscure is quantum information (I at least did quantum mechanics in year 2; I don’t know about other universities). I’ve discussed entropy and information with someone on here before; if you want to have a look, there are a few references I posted that I think are pretty cool.