This was so difficult for me to understand because the author tried _so hard_ to make this sound poetic.
Here's what I got..
- For a long time we thought that any information (matter/light) that goes into a black hole is lost forever and is "corrupted".
- Hawking believed this for a long time and said "God not only plays dice, but he often throws them where they can't be seen." No one really knows _how_ the black hole actually "corrupted" the information, but there were some nutty theories.
- 30 years later (in 2004) Hawking changed his mind and said that information can actually be retrieved from a black hole.
- A dude named Andrew Strominger recently discovered that black holes have this "soft hair" property that can be "read" to theoretically "see" what is inside the black hole.
- Hawking's last paper says that he thinks the information inside will be re-emitted when the black hole evaporates.
TL;DR: Hawking for a long time thought matter/information that went into a black hole was lost forever - and then changed his mind about it.
FWIW, my understanding is that black holes are the physics equivalent of a cryptographic mixing function as used in e.g. ChaCha20: reversible in the strict sense, but missing any single bit of the output completely randomises the recovered input.
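To make that analogy concrete, here's a toy sketch (mine, using the ChaCha quarter-round as a stand-in mixing step; not real cryptography and not a claim about black-hole dynamics): the map is exactly invertible, but damage one bit of the output and the inversion gives back something sharing only about half its bits with the true input.

    MASK = 0xFFFFFFFF
    rotl = lambda x, n: ((x << n) | (x >> (32 - n))) & MASK
    rotr = lambda x, n: ((x >> n) | (x << (32 - n))) & MASK

    def qr(a, b, c, d):                          # ChaCha quarter-round (invertible)
        a = (a + b) & MASK; d = rotl(d ^ a, 16)
        c = (c + d) & MASK; b = rotl(b ^ c, 12)
        a = (a + b) & MASK; d = rotl(d ^ a, 8)
        c = (c + d) & MASK; b = rotl(b ^ c, 7)
        return a, b, c, d

    def qr_inv(a, b, c, d):                      # exact inverse of qr
        b = rotr(b, 7) ^ c;  c = (c - d) & MASK
        d = rotr(d, 8) ^ a;  a = (a - b) & MASK
        b = rotr(b, 12) ^ c; c = (c - d) & MASK
        d = rotr(d, 16) ^ a; a = (a - b) & MASK
        return a, b, c, d

    state = (0x01234567, 0x89ABCDEF, 0xDEADBEEF, 0x00000042)
    out = state
    for _ in range(16):                          # mix forward 16 rounds
        out = qr(*out)

    # Flip a single bit of the output, then invert: the "recovered" input typically
    # differs from the true input in roughly half of its 128 bits.
    damaged = (out[0] ^ 1,) + out[1:]
    back = damaged
    for _ in range(16):
        back = qr_inv(*back)
    diff = sum(bin(x ^ y).count("1") for x, y in zip(state, back))
    print(diff, "of 128 bits differ")

The point of the sketch is just the avalanche property: invertibility in principle does not help once any part of the output is unknown.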
Assuming Hawking radiation exists (which seems very likely based on what we know about relativistic and quantum physics), it must carry quantum information in the form of position/momentum, photon polarization, etc, and it's not clear where else that information could possibly come from. (Orthogonal-basis measurements can sort of generate classical information out of nothing, but not in a sense that's useful here.)
If only that were true; that'd be no problem at all.
The problem is procedural, and has to do with slicing up spacetime-filling fields into field-values on spacelike hypersurfaces (values-surfaces). I'll focus on one procedure -- there are others that have their place as well.
In a spacetime without any black holes at all, we can take any such values-surface whereupon all the values are specified, and from that we can recover all the values of the spacetime-filling fields everywhere in the spacetime. This is the https://en.wikipedia.org/wiki/Initial_value_formulation_(gen...
The important thing about the initial value formulation is that we can perturb a single field-value on our chosen values-surface, trace the consequences to neighbouring values-surfaces, and their neighbouring values-surfaces, and eventually recover the whole set of spacetime-filling fields everywhere in the spacetime. Indeed, one family of slicings, https://en.wikipedia.org/wiki/Hamiltonian_constraint#Hamilto... lends itself to https://en.wikipedia.org/wiki/Canonical_quantum_gravity (CQG). CQG works everywhere in the absence of strong gravity, and even provides a clear definition of strong gravity in terms of renormalization: http://www.preposterousuniverse.com/blog/2013/06/20/how-quan... (Below I'll generalize this to the Effective Field Theory (EFT)).
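A toy sketch of that recoverability in one space dimension (entirely my own illustration, nothing from the linked pages): specify the field and its time derivative on one slice, march the wave equation to later slices, then run the same update backwards and recover the original slice.

    import numpy as np

    N, steps = 200, 500
    dx, dt, c = 1.0, 0.5, 1.0                    # c*dt/dx = 0.5 satisfies the CFL condition
    x = np.arange(N) * dx

    u0 = np.exp(-0.01 * (x - 50.0) ** 2)         # field values on the chosen slice
    v0 = np.zeros(N)                             # time derivative on the chosen slice

    lap = lambda u: np.roll(u, 1) - 2 * u + np.roll(u, -1)
    r2 = (c * dt / dx) ** 2

    # First step from the initial data, then leapfrog to later slices.
    u_prev, u = u0.copy(), u0 + dt * v0 + 0.5 * r2 * lap(u0)
    for _ in range(steps):
        u_prev, u = u, 2 * u - u_prev + r2 * lap(u)

    # The update is time-symmetric: swap the last two slices and run it backwards.
    u_prev, u = u, u_prev
    for _ in range(steps):
        u_prev, u = u, 2 * u - u_prev + r2 * lap(u)

    print(np.max(np.abs(u - u0)))                # tiny (floating-point roundoff): the slice is recovered

Black holes are precisely what breaks this tidy picture, as described below.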
If we have no black holes, and no early singularity, the effective theory is almost certainly correct everywhere in the space-time. (Here I won't even consider the early universe problem; there is a problematic ultradense phase in the Hot Big Bang model that requires beyond-the-standard-model physics, which wrecks the values-surfaces of the standard-model fields before we even get to strong gravity.)
If we add black holes, but without Hawking Radiation (that is, they only ever grow), then on each hypersurface we have to "cut out" the field-values at the boundary of any region containing strong gravity. These regions are, crucially, well inside the event horizons of massive black holes. That is, the EFT does not end at horizons, it ends near gravitational singularities.
While there are some annoyances, for most reasonable slicings, we can still recover the full spacetime-filling fields everywhere in spacetime. The field-values that enter the horizon are trapped within the horizon, and eventually they are trapped within our "cut out" region. As our black holes never evaporate, those field values have no impact on future slices. We have, however, found ourselves with a new constraint that picks out a direction of time: the future is the direction in which the "cut out" has no impact, but the past is one in which the "cut out" emits field-values. That's the main source of annoyance, and stresses the "initial" part of "initial values". Picking out just any surface will only guarantee you recovery of the future successor surfaces; in most cases you cannot even in principle recover the past values-surfaces, with the result that you also cannot recover the whole set of spacetime-filling fields. THIS is the incompatibility between quantum mechanics and general relativity.
(In practice, researchers -- including Hawking in his original Hawking Radiation paper -- choose to study "eternal" black holes that never grow or shrink, so that the field-values are always recoverable everywhere outside the horizon. However, because the black hole doesn't grow, you have to play some tricks to deal with matter that crosses the horizon. Those tricks lead to the negative-energy particles in Hawking's paper and in many popularizations of Hawking Radiation. In a more realistic model, one would let the black hole grow or shrink, and do away with the need for negative energy altogether, although it would not have been tractable for Hawking to take that more realistic approach in the fancifully named "Black hole explosions" paper of 1974, https://www.nature.com/articles/248030a0 ).
Let's condense the point made above: we cannot reconstruct the full past of a black hole that forms by gravitational collapse of matter. (This gives rise to the black hole uniqueness theorems and in particular https://en.wikipedia.org/wiki/No-hair_theorem ).
Without black hole evaporation, we can still predict the full future.
If we add black hole evaporation via thermal Hawking Radiation, we have a new problem that breaks the future predictability as well. Black holes at every time in their history from initial collapse to final evaporation emit Hawking quanta fully determined by their no-hair parameters[1]. In a typical black hole, the mass parameter is the driving term. If one starts with an initial values-surface just before strong gravity appears, then the very next (future) values-surface probably has Hawking quanta. The spectrum of the Hawking quanta is statistical: it is, in quantum field theory terms, a mixed state. But the spectrum of all the quanta in the fields just before strong gravity arises is a pure state. In more relaxed terms, we have full knowledge of the pure state, but we can only talk in terms of statistics for the mixed state.
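To make the pure-vs-mixed language concrete, here is a minimal numerical illustration (a generic two-level toy example of mine, nothing black-hole-specific): the purity Tr(rho^2) equals 1 when the state is known in every detail and drops below 1 when only statistics are available.

    import numpy as np

    # Pure state: a definite superposition |+> = (|0> + |1>)/sqrt(2); full knowledge.
    plus = np.array([1.0, 1.0]) / np.sqrt(2)
    rho_pure = np.outer(plus, plus.conj())

    # Mixed state: a 50/50 statistical ensemble of |0> and |1>; only probabilities known.
    rho_mixed = np.diag([0.5, 0.5])

    purity = lambda rho: np.trace(rho @ rho).real
    print(purity(rho_pure))    # 1.0 -> pure state
    print(purity(rho_mixed))   # 0.5 -> mixed state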
The problem persists across the whole of the future spacetime: a Hawking quantum can fly off to infinity, and for realistic fields (e.g. the standard model), it may interact with other matter at arbitrarily large distances from the black hole. (Hawking Radiation was initially modelled with all matter represented as a non-interacting scalar field; the field-values of the Hawking scalars propagate to infinity, but don't really matter all that much in the model. But if a small-mass black hole emits an electron-positron pair, the electron could fly off and meet a proton some time in the future, and we would probably want to know about a proton gas being neutralized, since it may then begin to collapse gravitationally, whereas in the absence of Hawking electrons it likely would not. Although the initial model was very restricted, these sorts of implications were almost immediately clear: large scale effects can be triggered by Hawking radiation, and as Hawking radiation is inherently probabilistic, we have a cosmic Schroedinger's Cat problem.)
So, back to your words:
> cryptographic mixing function
Hawking radiation converts a pure state into a mixed state. A cryptographic mixing function converts a pure state into a pure state in a way which is hard to trace.
Now, back to this article. Hawking et al. decided to break the no-hair theorem, and to decorate black holes in such a way that you can still recover the past of a (never-evaporating, always-growing, no Hawking radiation) whole spacetime from a values-surface on which there is already strong gravity. Additionally, the same mechanism allows one to recover the whole future of the spacetime from a values-surface on which there is strong gravity (and thus Hawking radiation). The downside is that one has to have the full set of values on the fields with strong-gravity, and those will (under the idea in the OP paper) include extremely low energy "soft hair" particles (the OP paper does not decide whether "soft hair" is just photons, or may be the whole set of standard model particles; as with the original 1976 paper, Hawking and his coauthors consider a restricted representation of all matter in the spacetime).
So in a way, what they are doing is introducing a "cryptographic mixing function" to avoid producing a mixed state. You get determinism everywhere (instead of determinism before strong gravity, and probability after) in initial-values formalisms, by doing away with the no-hair theorem (which raises questions about the uniqueness of theoretical black hole models like Schwarzschild and Kerr).
It is an interesting idea that deserves further study (and will get it), but it is too early to make bets on whether it will be fully successful at repairing the "damage" that strong gravity does to the EFT.
Moreover, it is not an answer to the question, "what happens in strong gravity", and in particular does not prevent the formation of a gravitational singularity inside a black hole. It also has nothing to say about what happens at extremely high energies (much higher than the electroweak scale) in the early hot, dense universe.
However, just making the EFT work in a wider variety of spacetimes is a fine goal!
- --
[1] The Hawking radiation when a black hole initially forms by gravitational collapse of matter is pretty extreme and is relevant to the early black hole and its immediate environment. It's hard enough to take into account that the difficulty gets its own name: the backreaction problem. The gravitational backreaction (much less matter interactions) of hairs produced at young black holes is not mentioned in the OP paper by Hawking et al. :/
This is a significantly more... more response than I was expecting, thank you.
I do have a couple of quibbles, though:
> Hawking radiation converts a pure state into a mixed state. A cryptographic mixing function converts a pure state into a pure state in a way which is hard to trace.
This is actually specifically what I meant by "the physics equivalent of"; that is, a quantum-computational mixing function that converts an arbitrary, possibly mixed state to another, probably[0] mixed state, such that a hypothetical extra-physical observer with full knowledge of both states would see the result as uniformly pseudorandom from the range of all possible output states.
Also, I take as a sort of provisional axiom[1] that physics is time-reversible (not necessarily symmetric, although probably CPT symmetric) and therefore cannot throw away or generate any new (quantum-mechanical/qubit-based, so orthogonal-basis measurements don't count) information. (Given this and something like the Bekenstein bound, that an evaporating black hole must be leaking its information to somewhere, and its radiation must be getting its information content from somewhere, are seemingly trivial.)
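For concreteness (these are the standard textbook statements, not the parent's wording): the Bekenstein bound for a region of radius R containing total energy E, and the Bekenstein-Hawking entropy of a horizon of area A, are

    S \le \frac{2 \pi k_B R E}{\hbar c}
    \qquad\text{and}\qquad
    S_{BH} = \frac{k_B c^3 A}{4 G \hbar}

so as an evaporating hole's area shrinks, so does the entropy it can hold, which is the "must be leaking its information to somewhere" intuition.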
0: In the thermodynamics/statistical mechanics "almost certainly" sense.
1: similar to and as serious as Conservation of Energy or "Scientists are made out of atoms and cannot cause magical non-(unitary/linear/differentiable/local/CPT symmetric/Liouville uniform/deterministic/etc) events by looking at things."
> evaporating black hole must be leaking its information to somewhere
The information about the contents of the BH during its formation and growth is in the region of strong gravity. Classically, it's squashed into the gravitational singularity; fully classically the singularity is always hidden behind an event horizon, so it does no harm to predicting events outside the horizon.
However, we now add a quantum field theory to the picture.
The origin of Hawking radiation is the relative acceleration between observers before the formation of the strong-gravity region and observers after it; the accelerated (later) observers see particles where the non-accelerated (early) observers see none. The particles appear in the dynamical spacetime around (but outside) the horizon. The reason they are there is (roughly) that the creation and annihilation operators that line up in "unstretched" vacuum separate in "stretched" vacuum, and the annihilation operators miss the created particles (that is, the annihilation happens at the right spatial coordinate, but too early or too late: the created particle is elsewhere). The analogy with Unruh radiation, which appears for accelerated observers in flat spacetime but not for unaccelerated observers in the same spacetime, is not accidental. In the Unruh case, the acceleration mechanism (say, a rocket engine) is the reason the accelerated observer sees the extra particles. In the Hawking case, the acceleration mechanism for later observers is the dynamically collapsing spacetime.
If nothing exits the horizon of a black hole (at least until final evaporation; and for that we are stuck with not knowing enough about the behaviour of quantum fields in strong gravity) then the only parameters available at any instant in the (QED-filled) dynamical spacetime that is the origin of Hawking radiation are mass (1 component), charge (1 component), angular momentum (3 components), linear momentum (3 components), and spatial position (3 components). The last six components fall away for some families of observers with a suitable choice of spatial coordinates. ("Instant" in this context is a coordinate time defining a spacelike hypersurface, and one has lots of freedom there). You get a handful of extra components (individual "charges") as you go from QED to the standard model.
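For reference, the textbook formulas behind the last two paragraphs (standard results, not claims specific to this thread): a uniformly accelerating observer sees a thermal Unruh bath, and the Hawking temperature is the same expression with the horizon's surface gravity kappa in place of the acceleration a; for Schwarzschild, kappa, and hence the whole thermal spectrum, depends only on the mass.

    T_U = \frac{\hbar a}{2 \pi c k_B}, \qquad
    T_H = \frac{\hbar \kappa}{2 \pi c k_B}, \qquad
    \kappa_{\mathrm{Schw}} = \frac{c^4}{4 G M}
    \;\Rightarrow\;
    T_H = \frac{\hbar c^3}{8 \pi G M k_B}

For Kerr-Newman, kappa is a function of M, J and Q alone, which is the "only parameters available" point above in precise form.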
There have been attempts to break this picture by inter alia having things never enter the horizon in the first place, by implanting extra information in the spacetime around the black hole ("hair"), and by locking up all the infalling matter into a crystal that preserves details of the matter's microscopic states either forever or until evaporation is almost entirely complete. It is extremely hard to do this without introducing unlikely observables.
> Conservation of Energy
... is not a global symmetry of a dynamically collapsing spacetime. You only get conservation of energy locally within a suitably small region of spacetime (which can be quite large far from the collapse, assuming asymptotic flatness).
> time-reversible
Locally. This is most sharply obvious in strong curvature.
> CPT symmetric
This is a problem with unitary time evolution of any quantum system in this setting; CPT doesn't enter into it. There is neither antimatter nor chirality in the model non-interacting scalar field that exposes the information loss problem for a collapsing black hole. ("Negative energy" is only a trick used when one wants to use a static background instead of a dynamical one; it does not interact at all with its pair-partner or other "negative energy" quanta; there is no local symmetry, it is the global symmetries of the Schwarzschild solution that are being preserved through the trick. You entangle the real Hawking quanta with false quanta instead of entangling the real Hawking quanta with the spacetime (which would change the metric, which is exactly what one is trying to avoid in some studies)).
Indeed, the problem is mostly centred on "time" in the first sentence of the previous paragraph. There is no unique slicing of a general curved 4-spacetime into 3-spaces, and if one does it wrong, one gets problems (see ref to Giddings 2006 below). This is in some ways an argument that black hole information loss is mainly about the https://en.wikipedia.org/wiki/Problem_of_time .
Isn’t that what he called Hawking radiation? Didn’t think it was news at this point.
It’s funny how we talk of information “not being lost” since it’s emitted as radiation... would we be able to decipher anything? If not, it’s still lost.
Hawking Radiation, in its original form, did not allow anyone to learn the quantum information about what went into the black hole — if it was made entirely of photons; or entirely of antimatter; or made of any mix of photons, normal matter, and antimatter… all would be indistinguishable.
What’s news is that it isn’t all lost, that the starting state has any influence at all on what comes out. I am only an enthusiastic amateur at this, but I get the impression this is as surprising as the discovery of Hawking Radiation in the first place.
> but I get the impression this is as surprising as the discovery of Hawking Radiation in the first place
It should be noted that we have yet to experimentally observe Hawking radiation (it is so low-power we don't have the means to observe it). People talk about Hawking radiation as though it is a discovered phenomenon -- it isn't. It's a prediction based on our understanding of quantum mechanics and some thinking about how it interacts with event horizons -- and observing it would help reinforce our understanding of the underlying theories.
Indeed no existing stellar-mass-or-above black hole is a net emitter of radiation in the current cosmological era. They swallow cosmic microwave background photons at a much higher rate than they emit Hawking radiation and thus grow slowly even if infalling matter is not present. Indeed, they're the best heat sinks in existence. The universe will be billions of times its current age before the CMB has redshifted enough to become cooler than black holes.
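A quick back-of-the-envelope check of that claim (the Hawking-temperature formula is standard; the little Python script and the numbers are mine):

    import math

    hbar, c, G, kB = 1.0546e-34, 2.9979e8, 6.674e-11, 1.3807e-23   # SI units
    M_sun, T_cmb = 1.989e30, 2.725                                  # kg, kelvin

    def hawking_temperature(M):
        """Temperature (K) of a Schwarzschild black hole of mass M (kg): T = hbar c^3 / (8 pi G M kB)."""
        return hbar * c**3 / (8 * math.pi * G * M * kB)

    print(hawking_temperature(M_sun))                       # ~6e-8 K: far colder than the 2.7 K CMB
    # Mass at which a hole would be exactly as cold as today's CMB:
    print(hbar * c**3 / (8 * math.pi * G * kB * T_cmb))     # ~4.5e22 kg, less than the Moon's mass

So anything of stellar mass or above absorbs far more CMB energy than it emits; only a hole lighter than about 4.5e22 kg (below the Moon's mass) would currently be a net emitter.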
Hypothetical primordial black holes [1] could be much smaller and hotter, and indeed Hawking radiation could provide one way to detect those that have survived to the present day, but as of now none have been found.
ETA: I don't think we disagree at all. I slightly misread your second sentence before hitting "reply". I'll keep this reply in place because I think it amplifies and adds to your point.
Hawking radiation is always present around a dynamical black hole -- it is produced by the dynamical spacetime itself[1]. (All black holes that aren't eternal -- that includes any that form by gravitational collapse of matter -- are dynamical, and thus have Hawking radiation, even while they're growing.)
In principle we should be able to detect Hawking radiation as black holes first form, since it will backreact with the black hole, and probably even interact with the collapsing matter. Studying BH-forming supernovae and the like will lead to discoveries in this difficult area of https://en.wikipedia.org/wiki/Semiclassical_gravity as hot Hawking radiation is in principle directly observable, and there will be indirect traces.
The problem is that BH formation typically happens in a bright environment. The candidate black holes we know about aren't that young and as a result the Hawking gas will be cold enough to have negligible impact: basically no interaction with nearby matter, basically no backreaction on the black hole itself, and much colder than the surrounding environment (infalling matter including the CMB gas) and thus in practice impossible to detect directly with telescopes.
- --
[1] Well, more precisely, given Einstein-Maxwell electrovacuum and general relativity with a black hole metric, Hawking radiation is inevitable. Hawking's original work dealt with a static spacetime (i.e., an eternal, unchanging black hole) and used negative energy quanta as a trick to proxy for a dynamic spacetime. Using a dynamical black hole (i.e., one that grows and shrinks), one does not need negative energy quanta at all, much less a mechanism which tosses only those halves of pairs into the BH (in order to keep the metric unchanged from pure static Schwarzschild).
> The problem is that BH formation typically happens in a bright environment.
That must be the understatement of the week. Love it! I guess observations of failed supernovae and possible direct-collapse black holes could shed some light (hah!) on the matter.
Unfortunately, you are. The information that is lost is what went into the black hole in the first place.
It's not strictly a quantum problem: if black holes have no hair, then we cannot tell by looking at a spherically symmetric non-rotating black hole if it was formed by one spherical shell of infalling matter of mass M, or two concentric spherical shells of infalling matter of mass M/2, or three of M/3, etc. When we add electromagnetism to the picture, we get Hawking radiation with a temperature inversely proportional to the black hole mass; but that mass does not encode the number of shells or their composition, just their total mass.
When we add in quantum electrodynamics, we find that Hawking radiation has a thermal spectrum (so, cold photons for a stellar-mass black hole, but when the black hole is very small you'll get electrons and positrons too; and potentially the whole zoo of particles if we use the full standard model as the quantum field theory). But we could start with a black hole formed by squashing together neutral composites (positronium, atoms) and with some probability get out nothing but photons: no massive particles at all. With some smaller probability we get mostly photons but also electrons and positrons. The main problem is that we are stuck talking probabilistically about the spectrum of Hawking quanta even if we know every single detail of what we threw into the black hole; there is no unitary evolution from known-in-every-detail state to known-in-every-detail state. The "every detail" part is the information that is lost.
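To spell out why that is a genuine breakdown rather than mere practical ignorance, here is a small numpy sketch (my own illustration) of the relevant lemma: unitary evolution preserves purity, so no unitary map can carry a known-in-every-detail pure state into a merely statistical mixed state.

    import numpy as np

    rng = np.random.default_rng(0)
    A = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
    U, _ = np.linalg.qr(A)                       # a random 4x4 unitary

    psi = rng.normal(size=4) + 1j * rng.normal(size=4)
    psi /= np.linalg.norm(psi)
    rho = np.outer(psi, psi.conj())              # pure state: Tr(rho^2) = 1

    rho_out = U @ rho @ U.conj().T               # unitary evolution
    print(np.trace(rho @ rho).real)              # 1.0
    print(np.trace(rho_out @ rho_out).real)      # still 1.0: purity is unchanged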
There are a variety of ways one can try to deal with the conversion of "we know every detail" (a pure state, quantum mechanically) to "we can only talk probabilistically" (a mixed state, quantum mechanically), and some are listed in the wikipedia page you link to. Hawking's final paper is yet another approach, and throws away the idea that black holes have no hair; that is, a black hole cannot be described with a small number of parameters (dominated by mass and angular momentum) but rather develops an enormous number of parameters encoded as perturbations of the vacuum. Those perturbations in turn influence the spectrum of the Hawking quanta in such a way that it is fully predictable -- even though it looks like a thermal bath, the vacuum perturbations ("soft hairs") fully determine it. It is an idea worth further investigation, but it is not much more compelling than several alternatives.
One problem is that when we take an exact analytical black hole solution to the Einstein Field Equations of general relativity, we have "no hair" as a mathematical theorem. If we perturb around such a solution we generate observables that closely match what we see of candidate astrophysical black holes in the sky. Hawking wants to treat astrophysical black holes as even more different from the theoretical models, and while that's not a crazy idea, it's also not very parsimonious as many many many more perturbations ("hairs") are necessary than the minimum required to match the observed systems, and it's not clear that a "no hair" black hole must be measurably different from a "soft hair" black hole.