What Was It Like When The First Stars Began Illuminating The Universe?
“After the Big Bang, the Universe was dark for millions upon millions of years; once the glow of the Big Bang faded away, there was nothing that human eyes could have seen. But when the first wave of star formation happens, growing in a cosmic crescendo across the visible Universe, starlight struggles to get out. The fog of neutral atoms permeating all of space absorbs most of it, but gets ionized in the process. Some of this reionized matter will become neutral again, emitting light when it does, including the 21-cm line, over timescales of ~10 million years.
But it takes far more than the very first stars to truly turn on the lights in the Universe. For that, we need more than just the first stars; we need them to live, burn through their fuel, die, and give rise to so much more. The first stars aren’t the end; they’re the beginning of the cosmic story that gives rise to us.”
We like to think of the Universe’s evolution as a story that follows a particular order: first we had the Big Bang, then things expanded and cooled, then gravitation pulled things into clumps, then stars formed, lived, and died, and now here we are. But in reality, things are messier than that! The very first stars didn’t immediately spread light throughout the Universe, but instead had a cosmic ocean of neutral atoms to contend with: one that they weren’t energetic enough or numerous enough to break through. The first stars in the Universe fought a battle against the clumping, neutral, atomic matter that surrounded them… and lost.
This Is Why Dark Energy Must Exist, Despite Recent Reports To The Contrary
“We do not do science in a vacuum, completely ignoring all the other pieces of evidence that our scientific foundation builds upon. We use the information we have and know about the Universe to draw the best, most robust conclusions we have. It is not important that your data meet a certain arbitrary standard on its own, but rather that your data can demonstrate which conclusions are inescapable given our Universe as it actually is.
Our Universe contains matter, is at least close to spatially flat, and has supernovae that allow us to determine how it’s expanding. When we put that picture together, a dark energy-dominated Universe is inescapable. Just remember to look at the whole picture, or you might miss out on how amazing it truly is.”
Twenty years ago, the supernova data came back with an extraordinary surprise: it looked like the Universe wasn’t just expanding, but that the expansion rate was increasing as we head further into the future. While there were many dark energy skeptics at the start, an increasing flow of improved data from many independent lines of evidence, all pointing to the same conclusion, has led to a cosmological consensus: dark energy dominates the Universe today. Last week, a story made waves, as Subir Sarkar and collaborators published their second paper (the first was in 2016) claiming that the evidence from supernovae is not good enough to support the existence of dark energy, and that our cosmological foundation for it is extraordinarily shaky.
Hubble: Andromeda Is Big, Massive, And Full Of The Stars Our Milky Way Is Missing
“The low-density, outer halo contains stars just as ancient as the Milky Way’s oldest: 13+ billion years of age. Andromeda has stellar streams populating that halo, with a third of those stars just 6-8 billion years old.
This means a major act of galactic cannibalism recently occurred.”
Every few years, a new study comes out claiming that the Milky Way may rival the Andromeda galaxy for the status of largest galaxy within our Local Group. Nonsense! Andromeda is practically double the diameter, contains anywhere from 2.5 to 5 times as many stars, and now there’s evidence that it gobbled up a number of massive galaxies relatively recently. Not only does it have stars just as old as the oldest we’ve ever found in the Milky Way, but we now have evidence that, 6-8 billion years ago, it devoured a large member of our Local Group entirely, with about a third of Andromeda’s halo stars having formed at around that time. When the Milky Way-Andromeda merger finally comes, there can be no doubt that the remnants of Andromeda will dominate whatever’s left.
Ask Ethan: Does The Measurement Of The Muon’s Magnetic Moment Break The Standard Model?
“[There’s a notable] difference between theory and experiment [for the muon’s magnetic moment]. Is the fact that the [uncertainties are large] more meaningful than the >3 sigma significance calculation? The Mercury precession must have a very small sigma, but is cited as a big proof of relativity. What is a good measure of significance for new physics results?”
Whenever theoretical predictions and experimental results disagree, that’s surely a sign of something interesting. If we’re extremely lucky, it might be a sign of new fundamental physics, which could mean new laws of nature, new particles, new fields, or new interactions. Any of these would be revolutionary, and it’s certainly the great hope of anyone who works on these projects: to peel back the curtain of reality and find the next layer inside. But there are two other possibilities, far more conservative and mundane, that must be ruled out first. One is an error, either on the theoretical or experimental side, that has simply been overlooked. The other is even more subtle: an effect from a known physical cause that’s at the heart of this discrepancy, one we hadn’t thought we needed to include until now.
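The question in the excerpt, whether large uncertainties undercut a quoted ">3 sigma" figure, can be made concrete. Here is a minimal sketch of how a significance is typically computed when theory and experiment each carry their own uncertainty; the function name and the numbers are illustrative, not the actual muon g-2 values:

```python
import math

def significance_in_sigma(theory, sigma_theory, experiment, sigma_experiment):
    """Discrepancy between two values, in units of their combined uncertainty."""
    # Independent uncertainties add in quadrature.
    combined = math.sqrt(sigma_theory**2 + sigma_experiment**2)
    return abs(experiment - theory) / combined

# Illustrative numbers only: a discrepancy of 14 units, with two 3-unit
# uncertainties, works out to about 3.3 sigma.
print(significance_in_sigma(0.0, 3.0, 14.0, 3.0))  # ≈ 3.3
```

Note that large individual uncertainties don’t invalidate a significance figure; they are already folded into the denominator. That’s why a tiny discrepancy with tiny error bars (like Mercury’s precession) can be every bit as significant as a larger discrepancy with larger ones.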
“But adding more and more modifications to your theory — making your model objectively more complicated — will of course have the power to offer you a better fit to the data. In general, the number of new free parameters your idea introduces should be far smaller than the number of new things it purports to explain. The great power of science is in its ability to predict and explain what we see in the Universe. The key is to do it as simply as possible, but not to oversimplify it any further than that.
Bad scientific theories abound, rife with unnecessary complications, extra sets of parameters, and unconstrained, ill-motivated speculations. Unless there’s a reality check coming, in the form of experimental or observational data, it isn’t worth wasting your time on.”
When you look at any phenomenon in the Universe, one of the major goals of scientific investigation is to understand its cause. If we see something occur, we want to know what made it happen. Quantitatively, we want to understand what processes were at play, and how they caused an effect of the exact magnitude that we observed. And finally, we want to know what to expect for systems we have not yet observed, and to make predictions about what behavior we’re likely to see in novel situations we may encounter in the future. Ideas are a dime a dozen, coming from professional physicists, philosophers, and amateur enthusiasts alike, but most of them make for lousy scientific theories.
Normally one speaks of living things as beings that consume energy to survive and proliferate. This is of course not correct; energy is conserved, and cannot be consumed. Living beings intercept entropy flows; they use low-entropy sources of energy and emit high-entropy forms of the same energy (body heat).
Can we burn information as fuel? Consider a really frugal digital memory tape, with one atom used to store each bit:
The position of a single ideal gas atom denotes a bit. If it is in the top half of a partitioned box, the bit is one, otherwise it is zero. The side walls of the box are pistons, which can be used to set, reset, or extract energy from the stored bits. The numbers above the boxes are not a part of the tape, they just denote what bit is stored in a given position.
The tape is a series of boxes, with each box containing one ideal gas atom. The box is split into two equal pieces by a removable central partition. If the atom is in the top half of the box, the tape reads one; if it is in the bottom half, the tape reads zero. The side walls are frictionless pistons that may be used to push the atom around. If we know the atom’s position in the n-th box, we can move the other side wall in, remove the partition, and gradually retract the piston to its original position, destroying our information about where the atom is but extracting useful work.
Extracting energy from a known bit is a three-step process: compress the empty half of the box, remove the partition, and retract the piston, extracting PdV work from the ideal gas atom. (One may then restore the partition to return to an equivalent, but more ignorant, state.) In the process, one loses one bit of information (which side of the partition is occupied).
A memory tape can therefore be used to power an engine. If the engine knows or can guess the sequence written on the tape, it can extract useful work in exchange for losing that information.
Statistical Mechanics, J. Sethna
This is an old post that is bouncing around, but it’s not actually true. The memory tape is not powering the engine; the gas in the chamber is. Or, if your gas is in thermal contact with a heat reservoir at some temperature T, you’re extracting work from the reservoir (k_B T ln 2 of work per bit).
What you have done is run an engine without a cold reservoir. Which is interesting! But the quantity that ties this to information is entropy, not information itself. We can’t burn information as fuel, but we can use information to extract work we’d otherwise not be able to. This seems maybe really pedantic, but it’s super important. The correspondence between information and entropy is quite deep.
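The k_B T ln 2 figure follows directly from the isothermal PdV work in the three-step process described above. A minimal sketch (the function name and the choice of room temperature are mine, for illustration):

```python
import math

BOLTZMANN = 1.380649e-23  # k_B, in J/K

def work_per_known_bit(temperature):
    # Isothermal expansion of a one-atom ideal gas from V/2 to V:
    #   W = ∫ P dV = ∫ (k_B T / V) dV = k_B T ln(V / (V/2)) = k_B T ln 2
    return BOLTZMANN * temperature * math.log(2)

# At room temperature (300 K), each known bit yields about 2.87e-21 J,
# drawn from the heat reservoir, not from the tape itself.
print(work_per_known_bit(300.0))
```

The code makes the correction above explicit: the tape only tells the piston which way to push; every joule of extracted work comes out of the reservoir at temperature T.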
Since this post is getting some attention, comments and questions, I suggest having a look at the Sethna reference cited above, which will explain the matter better than any tumblr post.
Scientists Admit, Embarrassingly, We Don’t Know How Strong The Force Of Gravity Is
“The gravitational constant of the Universe, G, was the first constant to ever be measured. Yet more than 350 years after we first determined its value, it is truly embarrassing how poorly known it is compared to all the other constants. We use this constant in a whole slew of measurements and calculations, from gravitational waves to pulsar timing to the expansion of the Universe. Yet our ability to determine it is rooted in small-scale measurements made right here on Earth. The tiniest sources of uncertainty, from the density of materials to seismic vibrations across the globe, can weave their way into our attempts to determine it. Until we can do better, there will be an inherent, uncomfortably large uncertainty anywhere the gravitational phenomenon is important. It’s 2018, and we still don’t know how strong gravity actually is.”
Of all the fundamental constants in the Universe, such as Planck’s constant, the speed of light, or the mass of the electron, only one of them can lay claim to being the first one to be identified and measured to any degree of accuracy. That is G, the gravitational constant, first determined decades before Newton’s work: in the mid-17th century. Yet even today, scientists performing the experiments can’t agree on whether it’s 6.672 or 6.676 (or somewhere in between) × 10^-11 N·m²/kg². Experiments are coming out all the time, claiming precisions of just a few parts-per-million, yet they disagree with one another at the level of nearly a part in a thousand, making G the least well-determined fundamental constant of all.
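The mismatch between claimed precision and actual agreement is easy to quantify. A quick sketch, using the two competing values quoted above:

```python
# Competing modern measurements of G, in N·m²/kg² (the values quoted above).
g_low, g_high = 6.672e-11, 6.676e-11

# Fractional spread, relative to the midpoint of the two values.
midpoint = (g_low + g_high) / 2
spread = (g_high - g_low) / midpoint

# spread ≈ 6e-4: roughly 600 parts-per-million of disagreement, i.e. nearly
# a part in a thousand, even though individual experiments claim few-ppm precision.
print(spread)
```

A disagreement roughly a hundred times larger than the quoted per-experiment precision is exactly why G remains the least well-determined fundamental constant.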
What Was It Like When There Were No Stars In The Universe?
“It takes just half a million years to take all the normal matter in the Universe and have it be completely neutral, but 100-to-200 times as long before that neutral matter can collapse down enough to form the very first star in the Universe. Until that happens, the only light to see will be the leftover glow from the Big Bang, which falls to low enough energies to make it invisible after just 3 million years. For 47-to-97 million years, the entire Universe is truly dark. But as the first star ignites, “let there be light” is finally, once again, a part of our cosmic history.”
When you hear the term dark ages, you generally think about a time when whatever illuminated humanity’s existence ceased to do so. Well, the Universe itself had its own dark ages. After neutral atoms first formed, there was still a hefty, visible glow of leftover radiation from the Big Bang, but the expansion of the Universe finally made it invisible after about 3 million years. Yet the first stars in our cosmic history, emerging from the largest, rarest density fluctuations of all, wouldn’t arrive until the Universe was 50-to-100 million years old. These cosmic dark ages are real, and vital to our existence. There’s an incredible story to them, and a reason why no visible, energetic light could have existed during this time.
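The 47-to-97 million year window quoted above is just a bit of bookkeeping. A sketch with the round numbers from the passage (all times measured from the Big Bang, in millions of years):

```python
# Round numbers from the passage, in millions of years (Myr) after the Big Bang.
atoms_turn_neutral = 0.5          # neutral atoms form ~500,000 years in
glow_becomes_invisible = 3.0      # the Big Bang's glow redshifts out of view

# The passage quotes 100-to-200 times the neutral-atom timescale before the
# very first star can form:
first_star_earliest = atoms_turn_neutral * 100   # 50 Myr
first_star_latest = atoms_turn_neutral * 200     # 100 Myr

# The truly dark window runs from when the glow fades until the first star ignites.
dark_span_min = first_star_earliest - glow_becomes_invisible  # 47 Myr
dark_span_max = first_star_latest - glow_becomes_invisible    # 97 Myr
print(dark_span_min, dark_span_max)  # 47.0 97.0
```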
This Is Why There Are No Alternatives To The Big Bang
“For more than 50 years, no alternative has been able to deliver on all four counts. No alternative can even deliver the Cosmic Microwave Background as we see it today. It isn’t for lack of trying or a lack of good ideas; it’s because this is what the data indicates. Scientists don’t believe in the Big Bang; they conclude it based on the full suite of observations. The last adherents to the ancient, discredited alternatives are at last dying away. The Big Bang is no longer a revolutionary endpoint of the scientific enterprise; it’s the solid foundation we build upon. Its predictive successes have been overwhelming, and no alternative has yet stepped up to the challenge of matching its scientific accuracy in describing the Universe.”
The last adherents to alternative theories to the Big Bang are at last dying away. Advocates of tired-light, steady-state, or plasma cosmologies no longer arise from within the scientific ranks, for one reason: these ideas cannot even explain the Cosmic Microwave Background observations, much less the full suite of the four major cornerstones of the Big Bang. When all we had were Hubble’s data and the evidence for the expanding Universe, it was a great idea to explore all the conceivable alternatives. Now that the data has come in, the alternatives have been scientifically falsified, and the Big Bang is the foundation we use as the base for our future theorizing.
NASA Images Show A Record Recovery From History’s Worst National Park Wildfire
“In 1988, 36% of the land in Yellowstone National Park — 793,880 acres — burned in one giant conflagration.
A combination of lightning strikes, human-caused fires, and parched conditions created the out-of-control blaze.
By the time the cool, wet weather arrived in late autumn, tens of millions of trees had been destroyed, along with innumerable plants.
41% of the burned area experienced crown fires, obliterating the forests there.
Yet natural regrowth and regeneration began immediately.”
Wildfires, both natural (from lightning strikes) and human-caused (from negligent behavior), have been particularly severe throughout the American west in the past few years. But despite it all, the greatest fire in recorded history to occur in any National Park remains the 1988 Yellowstone fire. In that year, over a third of the park’s land (nearly 800,000 acres) burned, through a combination of ground fires and crown fires. Yet now, 30 years after the fact, the landscape has recovered almost completely through natural processes. The plans we have for managing the land through events like these are top-notch; all that’s required is the funding and personnel to execute them properly.