

Sunday, September 25, 2011

A Gravitational Wave Observatory Operating Beyond the Quantum Shot-Noise Limit

[L to R] Hartmut Grote, Roman Schnabel, Henning Vahlbruch

Authors: Hartmut Grote, Roman Schnabel, Henning Vahlbruch

Affiliation: Institute for Gravitational Physics, Leibniz Universität Hannover and Max-Planck-Institute for Gravitational Physics (Albert-Einstein-Institute, AEI), Hannover, Germany

Quantum-squeezed light has now moved beyond its development phase in the confines of the laboratory and is improving the sensitivity of the German/British gravitational-wave (GW) observatory GEO600, located close to Hannover (Germany). This is what our recent publication in Nature Physics reports [1]. Arguably for the first time, a technology that exploits the special features of quantum mechanics has been put into a real application in metrology. The idea that squeezed states of light might be valuable in the field of GW detection is already 30 years old [2], but has only now been realized.

Past 2Physics article by this group:
April 03, 2008: "Squeezed Light – the first real application starts now"
by Roman Schnabel and Henning Vahlbruch

Gravitational waves are predicted by Einstein’s general theory of relativity and are generated, for example, by black-hole binary systems. In principle, they can even be observed on Earth, by kilometer-scale Michelson-type laser interferometers measuring the changes in distance between mirrors suspended in vacuum. However, so far they have not been observed directly. Fig. 1 shows one of the suspended mirrors of the GW observatory GEO600 (mirror at bottom right). More details about GEO600 can be found in Ref. [3].

Fig. 1: One of the four suspended 5 kg test masses of the GEO600 gravitational wave detector (mirror at bottom right), together with its reaction and suspension-point masses. Changes in the 600 m distance between the test masses are now measured with the new quantum-squeezed laser light (image courtesy of Harald Lück, AEI).

In the past, GEO600's measurement sensitivity at frequencies above several hundred hertz was limited by the vacuum (zero-point) fluctuations of the electromagnetic field. Now, GEO600 incorporates an additional laser – a squeezed-light laser – which was previously presented in Ref. [4]. This new laser operates below its laser threshold and is based on parametric down-conversion of 532 nm light, thereby producing quantum-correlated, quasi-monochromatic photon pairs at 1064 nm (each pump photon splits into two photons of twice the wavelength).

Fig. 2: View into the GEO600 central building. The squeezed-light laser is shown in the front. Its optical table is surrounded by several vacuum chambers containing suspended interferometer optics such as the mirror shown in Fig.1.

This rather dim laser mode is matched into the GEO600 laser interferometer, where it interferes with the observatory’s ordinary 3 kW laser beam. As a result, the vacuum fluctuations at the photodiode in GEO600’s output port are reduced (“squeezed”). GEO600 now operates with its best-ever sensitivity, which is 50% higher than before at signal frequencies above 1 kHz, as shown in Fig. 3. This success has finally established squeezed light as a key technology for future GW astronomy.

Fig. 3: Nonclassical reduction of the GEO600 instrumental noise (calibrated to strain) using squeezed vacuum states of light. The black trace shows the observatory noise spectral density without the injection of squeezed light. Injection of squeezed vacuum states into the interferometer leads to a broadband noise reduction of up to 50% (3.5 dB in power; red trace). The peaks are not due to gravitational waves: they appear at well-known frequencies and are mainly due to violin modes of the mirrors’ pendulum suspensions.
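
As a quick consistency check on the numbers quoted above (a minimal sketch, not code from the GEO600 analysis), the relation between a squeezing level given in decibels of noise power and the resulting sensitivity gain is:

    # Minimal sketch: convert a squeezing level in dB (noise power) into the
    # corresponding noise-amplitude and sensitivity factors quoted in the text.
    import math

    def sensitivity_gain(squeezing_db):
        """Shot-noise power is reduced by 10**(-dB/10); strain noise scales as its
        square root, so sensitivity (1/noise amplitude) improves by the inverse."""
        residual_power = 10 ** (-squeezing_db / 10.0)
        residual_amplitude = math.sqrt(residual_power)
        return 1.0 / residual_amplitude

    print(sensitivity_gain(3.5))      # ~1.50, i.e. the ~50% improvement above 1 kHz
    print(20 * math.log10(3.0))       # ~9.5 dB would correspond to the 200% gain mentioned below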

Over the past years, GEO600 has been developed into one of the most sensitive measuring devices ever built. Up to one hundred scientists from Germany, the UK and other countries have contributed [5]. All of them are also members of the LIGO Scientific Collaboration (LSC) [6]. A number of new technologies have been developed, some of which have by now also been implemented in the other gravitational wave observatories, namely the US LIGO and the Italian/French Virgo projects. In this course of steady improvements, “classical” techniques were driven to their extremes, and GEO600 eventually became so sensitive that the squeezing technology became worth the effort.

Squeezed light had been generated in several research laboratories around the world before; however, it is a rather involved technique, and moving it from laboratory conditions to an environment of continuous operation is difficult. These aspects explain why squeezed light is only now being used for the first time in a GW observatory. Over the past five years, about a dozen physicists have worked on the squeezed-laser development in Hannover to enable this leap [7]. A recent review article summarizes the progress on squeezed light generation over the past years [8].

We are convinced that squeezed light will be used in all GW observatories around the globe in the near future. The squeezing technology is certainly not exhausted yet: we believe that improvements of up to 200% are feasible with current technology.

References
[1] The LIGO Scientific Collaboration, "A gravitational wave observatory operating beyond the quantum shot-noise limit", Nature Physics, doi:10.1038/nphys2083 (Published online September 11, 2011). Abstract. Free Download.
[2] C. M. Caves, "Quantum-mechanical noise in an interferometer". Phys. Rev. D 23, 1693 (1981). Abstract.
[3] Willke, B. et al., "The GEO 600 gravitational wave detector", Class. Quantum Grav. 19, 1377 (2002). Abstract; Grote, H. et al., "The GEO 600 status", Class. Quantum Grav. 27, 084003 (2010). Abstract.
[4] H. Vahlbruch, A. Khalaidovski, N. Lastzka, C. Gräf, K. Danzmann, R. Schnabel, "The GEO600 squeezed light source", Class. Quantum Grav. 27, 084027 (2010). Article.
[5] The GEO600 team: www.geo600.org
[6] The LIGO scientific collaboration: www.ligo.org
[7] The Quantum Interferometry Group at the AEI: www.qi.aei-hannover.de
[8] R. Schnabel, N. Mavalvala, D. E. McClelland, P. K. Lam, "Quantum metrology for gravitational wave astronomy", Nature Communications, 1:121, doi: 10.1038/ncomms1122 (2010). Abstract.



Sunday, September 18, 2011

Controlling Complexity in Cuprates for New Quantum Devices

Antonio Bianconi

Author: Antonio Bianconi

Affiliation: Department of Physics, Sapienza University of Rome, Italy

The quantum state of matter made of a macroscopic quantum condensate able to resist the decoherence attacks of high temperature was discovered by Alex Müller and Georg Bednorz, specifically in ceramic conductors. Today we know many different high-temperature superconducting materials, from ceramics to diborides, from iron pnictides to chalcogenides, that share a common spatial material architecture. As shown in Figure 1, they are heterostructures at the atomic limit, made of superconducting atomic layers (11) intercalated by different spacer layers (12).

Figure 1: Schematic picture of a generic high-temperature superconductor made of superconducting layers (11) intercalated by spacer layers (12). The dopants are inserted into the spacer layers (12) to control the critical temperature.

High-temperature superconductivity (HTS) emerges in these lamellar materials when defects called dopants -- atomic substitutions, extra (interstitial) oxygen atoms or missing (vacancy) oxygen atoms -- are injected into the spacer layers (12). The dopants roam around in the spacer layers of the material at high temperature, and they freeze into ordered or random patterns when the samples are cooled.

The high-temperature superconducting properties of these ceramic and intermetallic materials depend on the complex interplay of multiple electronic components that -- below the critical temperature -- form multigap superconductors made of a mixture of quantum condensates coupled by shape resonances (also known as Feshbach or Fano resonances). These complex electronic phases occur on a “low energy scale” of tens to hundreds of millielectronvolts, i.e. at the energy scale of thermal fluctuations at room temperature (k_BT ≈ 25 meV).
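
A one-line check of the thermal-energy scale quoted above (standard constants only, nothing specific to cuprates):

    k_B = 1.380649e-23        # Boltzmann constant, J/K
    eV = 1.602176634e-19      # electron volt, J
    print(k_B * 300 / eV * 1e3)   # ~25.9 meV at T = 300 K, the room-temperature scale quoted above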

It remained a puzzle for many years, because at such a fine energy scale the distribution of dopants is expected to modify the electronic structure. Learning how to control the dopant distribution will therefore open a new field in materials science, “superconductors by design”, since it will allow us to manipulate the multiple superconducting gaps and the critical temperature.

For many years most scientists assumed a homogeneous distribution of dopants, in spite of indications of complexity that emerged in the early nineties in the works of Jorgensen, who observed Tc increase as mobile oxygen ions self-organized over a time scale of months. This was supported by the observation of photo-induced superconductivity, i.e. an increase of Tc upon shining laser light on the sample, and by our findings in 1994 that lattice fluctuations in the spacer layers are related to a structural nanoscale phase separation of distorted and undistorted stripes in the superconducting CuO2 plane, modulating the superconductor's electronic structure.

Ceramics have been around as long as human civilization, for many thousands of years. Our ancestors knew how to fire pots, and different civilizations are characterized by innovations in this technology. We have started to think that imaging and controlling the spatial organization of the dopants will allow us to manipulate these new ceramics at the nanoscale to get new functional materials. In the latest issue of Nature Materials, Poccia et al [1] show that these superconducting ceramics can indeed be manipulated on the nanoscale with X-rays to get complex materials by design.

Last year [2] we used a new microscopy method, developed thanks to advances in focusing the X-ray beam emitted by synchrotron radiation facilities down to the nanoscale. This allowed us to unveil the sample complexity by recording thousands of X-ray diffraction patterns for each crystal. The technique provided a real-space map of the k-space order of oxygen interstitials in a cuprate where the oxygen interstitials are mobile. The fractal structure that emerged is typical of complex systems -- ranging from social networks such as Facebook and opinion dynamics to networks of protein interactions in living matter -- which show "scale-free" statistical distributions. We discovered that the best superconductivity was obtained when the microstructure was most ‘connected’ (see Figure 2), meaning that it is possible to trace a path with the same nanostructure (exhibited by oxygen atoms) over a large distance. If we zoom in on the material’s structure at increasing levels of magnification, its appearance remains the same.

Figure 2: A fractal network of connected channels of ordered oxygen defects in a ceramic copper oxide promotes superconductivity. The green and red spheres represent the paired electrons responsible for superconductivity.
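
The "scale-free" statistics mentioned above can be illustrated with synthetic numbers (this sketch uses an arbitrary exponent purely for illustration; it is not the diffraction data of [1,2]): a power-law distribution of ordered-domain sizes, P(s) ~ s^(-alpha), looks the same under rescaling, so its tail shrinks by a constant factor each time the size doubles.

    # Illustrative only: synthetic power-law ("scale-free") domain-size samples.
    import numpy as np

    rng = np.random.default_rng(0)
    alpha = 2.6                                   # hypothetical exponent, for illustration
    u = rng.random(100_000)
    sizes = (1.0 - u) ** (-1.0 / (alpha - 1.0))   # Pareto-like samples with s >= 1

    for s0 in (10, 20, 40):
        print(s0, np.mean(sizes > s0))            # tail falls by ~2**-(alpha-1) per doubling of s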


The physics of these materials was therefore expected to show fluctuations over states in a complex, rough potential landscape, as in some soft materials, with time evolution over multiple time scales. It has taken us years to learn how to change the internal structure of a copper oxide superconductor via simple heat treatments – an approach employed by ceramicists over millennia to modify oxide materials.

To see whether the fractal pattern was important, we interfered with it by heating and then quickly cooling the superconductor. Crystals with stronger fractal patterns performed better as superconductors at higher temperatures than those with weaker fractal patterns.

Ordering takes place on a time scale of months, and little was known about the details of this process.

We discovered that illumination with X-rays causes a small-scale rearrangement of the oxygen atoms in the material, resulting in high-temperature superconductivity. We have been attracted by this "beautiful example of a non-equilibrium, disordered system finding equilibrium": in these ceramics the X-rays bring the oxygen interstitials into equilibrium, whereas they would usually cause radiation damage.

Illuminating a disordered sample with X-rays fast-forwards the ordering dynamics, and X-ray diffraction (XRD) was used to monitor the evolution of order, both parallel (a and b axes) and perpendicular (c axis) to the CuO2 layers. The XRD data reveal that the initial sample has small, almost isotropic ordered islands, which act as nuclei. They initially combine, then grow predominantly in the a-b plane, and finally along the c axis. The details of this out-of-equilibrium domain-growth process shed light on previously unknown statistical-physics features of these complex systems.

Figure 3: The illumination of the ceramic material by the X-ray beams of synchrotron radiation, shown in the upper part of the figure, allows superconducting planar circuits to be written, such as those depicted in the lower image showing a magnification of the sample area. Here, solid lines indicate electrical connections while semicircles denote superconducting junctions, whose states are indicated by red arrows. (Credit: UCL Press Office)


At this point we have learned how to manipulate the order of oxygen interstitials on the nanoscale with X-rays. The X-ray beam is used like a pen to draw shapes (dots and lines) in two dimensions. Using this new approach, we show that it is feasible to write superconducting structures with dimensions down to nanometers and to erase them by applying heat treatments. This new tool allows us to write and erase superconducting circuits with high precision using just a few simple steps, without the chemicals ordinarily used in device fabrication. The ability to rearrange the underlying structure of a material has wider applications in similar compounds containing metal atoms and oxygen, ranging from fuel cells to catalysts.

Our validation of a one-step, chemical-free technique to generate superconductors opens up exciting new possibilities for electronic devices, particularly for rewriting superconducting logic circuits, which could be key to solving many of the world's great computational challenges. We want to create computers on demand to tackle such problems, with applications from genetics to logistics. A discovery like this [1] brings such a paradigm shift in computing technology one step closer.

References:
[1] Nicola Poccia, Michela Fratini, Alessandro Ricci, Gaetano Campi, Luisa Barba, Alessandra Vittorini-Orgeas, Ginestra Bianconi, Gabriel Aeppli, Antonio Bianconi. "Evolution and control of oxygen order in a cuprate superconductor". Nature Materials, DOI: 10.1038/nmat3088 (Published online August 21, 2011). Abstract.
[2] Michela Fratini, Nicola Poccia, Alessandro Ricci, Gaetano Campi, Manfred Burghammer, Gabriel Aeppli, Antonio Bianconi, "Scale-free structural organization of oxygen interstitials in La2CuO4+y", Nature 466, 841–844 (2010), doi:10.1038/nature09260. Abstract.


[More information about this work can be obtained at Superstripes Press]



Sunday, September 11, 2011

The Quantum von Neumann Architecture: A Programmable Quantum Machine Based on a Quantum Random Access Memory and a Quantum Central Processing Unit

Matteo Mariantoni

Author: Matteo Mariantoni

Affiliation: Department of Physics, University of California at Santa Barbara, USA

A classical computer is based on both hardware, i.e., a suitable set of micro-sized wires typically patterned on a silicon chip, and software, i.e., a sequence of operations or code. Such operations are realized as electrical signals ‘running’ through the wires of the hardware. The combination of hardware and software is termed a computational architecture.

In the mid 1940's, John von Neumann, J. Presper Eckert, and John Mauchly revolutionized the abstract concept of a universal Turing machine by proposing the eponymous ‘von Neumann architecture.’ Their implementation, which involves a central processing unit and a memory to hold data and instructions, provided a practical approach to constructing a classical computer, a design that is at the heart of almost every computing device made today.

In the past decade, researchers in various fields, from nuclear magnetic resonance to quantum optics, from trapped ions to semiconductor- and superconductor-based quantum circuits, have been able to create and control in the lab many of the building blocks of what, in the near future, could become a quantum computer. Just as a classical computer is based on bits, a quantum computer is based on quantum bits (qubits), with states 0 and 1. One of the main differences between classical bits and qubits is that a qubit can be prepared in a so-called superposition state, where both states 0 and 1 are possible at the same time. In addition, in a quantum computer pairs of qubits can be prepared in combinations of two-qubit states called entangled states. The immense power of a quantum computer resides in the combination of superposition states and entangled states. This combination will eventually allow us to perform certain calculations much faster than with any classical computer and to solve problems that would otherwise be impossible by classical means.
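
As an illustration of these two ingredients (generic textbook states written in NumPy, not the states of the actual superconducting device), a single-qubit superposition and a two-qubit entangled state look like this:

    # A single-qubit superposition and a two-qubit entangled (Bell) state.
    import numpy as np

    ket0 = np.array([1.0, 0.0])
    ket1 = np.array([0.0, 1.0])

    plus = (ket0 + ket1) / np.sqrt(2)             # superposition (|0> + |1>)/sqrt(2)

    bell = (np.kron(ket0, ket0) + np.kron(ket1, ket1)) / np.sqrt(2)   # (|00> + |11>)/sqrt(2)

    # Entanglement check: tracing out one qubit leaves a maximally mixed state.
    rho = np.outer(bell, bell.conj()).reshape(2, 2, 2, 2)
    rho_A = np.einsum('ijkj->ik', rho)            # partial trace over the second qubit
    print(rho_A)                                  # 0.5 * identity, the signature of maximal entanglement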

One of the critical challenges of quantum computing is to assemble together in a single machine all the hardware components needed for a quantum computer and to program these components using quantum codes, thus allowing us to implement a quantum-mechanical computational architecture. In particular, such an architecture should be scalable and immune from computational errors. This would represent a so-called scalable fault-tolerant quantum-mechanical architecture.

At the University of California, Santa Barbara (UCSB), in the group headed by John Martinis and Andrew Cleland, we use superconducting quantum circuits as qubits. These are wires, typically made of aluminum, that, once cooled below a temperature of approximately -272 degrees Celsius, become superconducting, drastically reducing dissipation and noise. When cooled further, down to almost absolute zero, our superconducting wires start showing quantum mechanical behavior. In such a state, an immense number of electrons moves collectively, as part of a single entity. Two different ‘positions’ of this immense number of electrons moving together can then be used to create the two states 0 and 1 of a qubit.

In the past years, at UCSB as well as in other labs worldwide, we have shown that it is possible to prepare and control systems with a few qubits (up to three). In particular, we have demonstrated superposition and entangled states, and we have been able to perform simple quantum logic gates using one and two qubits.

However, qubits alone are insufficient to implement a quantum-mechanical analogue of the von Neumann architecture: a quantum memory is needed. In the experiment reported in the journal Science [4], we were able to fabricate a fairly complex quantum circuit comprising all the elements of a quantum von Neumann machine, integrated on a single chip. Our quantum machine includes a set of two qubits that can exchange quantum information through a so-called quantum bus. We can address each qubit and prepare it in a superposition state, and we can entangle the two qubits via the bus. The two qubits and the quantum bus represent the quantum central processing unit (quCPU) of our machine.

The quantum von Neumann machine (image of the real device): Two superconducting qubits (enclosed in the two central squares) are coupled through a quantum bus (center meander line). Quantum information can be stored in two quantum memories (two lateral meander lines). A zeroing register is included in the two central squares. Photo credit: Erik Lucero

Most importantly, we were able to provide each qubit with a quantum memory. A key characteristic of our quantum memories, which are also based on superconducting wires, is that they can hold quantum information for a much longer time than the corresponding qubits. In this manner, as soon as quantum information has been processed by the quCPU, it can safely be stored in the memories. The quCPU can then be used again to process more quantum information, while the original quantum information is kept in the memories. The memories can store the original quantum information for a time long enough that, if that information is needed later in the computation, it can be read out and re-used by the quCPU at any desired time. We also provided our machine with the quantum-mechanical equivalent of a delete button, a so-called zeroing register, where used-up quantum information can be dumped from the quCPU, freeing it up. Our memories and zeroing register thus realize a true quantum random access memory (quRAM). We term the combination of quCPU and quRAM the quantum von Neumann machine.

We tested our quantum von Neumann machine by running a proof-of-concept quantum code that makes use of both qubits, the coupling bus, the memories, and the zeroing register in a single long sequence. In addition, we ran two key algorithms for quantum information processing: the quantum Fourier transform and a Toffoli gate. The quantum Fourier transform is probably the most important block in Shor’s algorithm for the factorization of large numbers, while Toffoli gates are at the basis of measurement-free quantum error correction. The latter is necessary to realize a fault-tolerant quantum-mechanical architecture.
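
For readers unfamiliar with these two operations, here are their standard matrix definitions (a sketch of the abstract gates only, not the microwave pulse sequences used on the chip):

    # The n-qubit quantum Fourier transform and the Toffoli (CCNOT) gate as matrices.
    import numpy as np

    def qft_matrix(n_qubits):
        """QFT on n qubits: F[j, k] = exp(2*pi*1j*j*k/N) / sqrt(N), with N = 2**n."""
        N = 2 ** n_qubits
        j, k = np.meshgrid(np.arange(N), np.arange(N), indexing='ij')
        return np.exp(2j * np.pi * j * k / N) / np.sqrt(N)

    F = qft_matrix(2)
    print(np.allclose(F @ F.conj().T, np.eye(4)))     # True: the QFT is unitary

    toffoli = np.eye(8)
    toffoli[[6, 7], :] = toffoli[[7, 6], :]           # flip the target only when both controls are 1
    print(np.allclose(toffoli @ toffoli, np.eye(8)))  # True: the Toffoli gate is its own inverse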

The UCSB team of researchers was led by myself, Matteo Mariantoni, Elings Prize Fellow and postdoctoral fellow in the Department of Physics, Andrew N. Cleland, professor of physics, and John M. Martinis, professor of physics. Our UCSB quantum computing team, composed of numerous students and postdocs, contributed greatly to the infrastructure used in the experiments and to the development of the concept of the quantum von Neumann architecture.

I believe that our quantum-mechanical implementation of the von Neumann architecture will serve as a guideline in the further development of quantum computing, not only with superconducting quantum circuits, but also with trapped ions and semiconductor devices.

Dr. Matteo Mariantoni was supported in this work by an Elings Prize Fellowship in Experimental Science from UCSB’s California NanoSystems Institute. The work was performed under funding from the Army Research Office and by the Intelligence Advanced Research Projects Activity (IARPA). Devices were made at the UCSB Nanofabrication Facility, a part of the NSF-funded National Nanotechnology Infrastructure Network.

References:
[1] M. Mariantoni, H. Wang, R. C. Bialczak, M. Lenander, E. Lucero, M. Neeley, A. D. O’Connell, D. Sank, M. Weides, J. Wenner, T. Yamamoto, Y. Yin, J. Zhao, J. M. Martinis & A. N. Cleland, "Photon shell game in three-resonator circuit quantum electrodynamics". Nature Physics, 7, 287-293 (2011). Abstract.
[2] M. Neeley, R. C. Bialczak, M. Lenander, E. Lucero, M. Mariantoni, A. D. O’Connell, D. Sank, H. Wang, M. Weides, J. Wenner, Y. Yin, T. Yamamoto, A. N. Cleland & J. M. Martinis, "Generation of three-qubit entangled states using superconducting phase qubits", Nature, 467, 570–573 (2010). Abstract.
[3] L. DiCarlo, M. D. Reed, L. Sun, B. R. Johnson, J. M. Chow, J. M. Gambetta, L. Frunzio, S. M. Girvin, M. H. Devoret & R. J. Schoelkopf, "Preparation and measurement of three-qubit entanglement in a superconducting circuit", Nature, 467, 574-578 (2010). Abstract.

[4] Matteo Mariantoni, H. Wang, T. Yamamoto, M. Neeley, Radoslaw C. Bialczak, Y. Chen, M. Lenander, Erik Lucero, A. D. O’Connell, D. Sank, M. Weides, J. Wenner, Y. Yin, J. Zhao, A. N. Korotkov, A. N. Cleland, John M. Martinis, "Implementing the Quantum von Neumann Architecture with Superconducting Circuits", Science, DOI: 10.1126/science.1208517 (published online September 1, 2011). Abstract.



Sunday, September 04, 2011

Black Hole Evaporation Rates without Spacetime

Samuel L. Braunstein

Author: Samuel L. Braunstein

Affiliation: Professor of Quantum Computation, University of York, UK


Why black holes are so important to physics

In the black hole information paradox, Hawking pointed out an apparent contradiction between quantum mechanics and general relativity so fundamental that some thought any resolution might lead to new physics. For example, it has recently been suggested that gravity, inertia and even spacetime itself may be emergent properties of a theory relying on the thermodynamic properties across black hole event horizons [1]. All these paradoxes and prospects for new physics ultimately rely on thought experiments to piece together more detailed calculations, each of which by itself gives only a part of the full picture. Our work "Black hole evaporation rates without spacetime" adds another such calculation [2], which may help focus future work.

The paradox, a simple view

In its simplest form, we may state the paradox as follows: in classical general relativity, the event horizon of a black hole represents a point of no return - a perfect semi-permeable membrane. Anything can pass the event horizon without even noticing it, yet nothing can escape, not even light. Hawking partly changed this view by using quantum theory to show that black holes radiate their mass away as ideal thermal radiation. Therefore, if matter collapsed to form a black hole, which itself then radiated away entirely as formless radiation, the original information content of the collapsing matter would have vanished. Now, information preservation is fundamental to unitary evolution, so its failure in black hole evaporation would signal a manifest failure of quantum theory itself. This "paradox" encapsulates a profound clash between quantum mechanics and general relativity.

To help provide intuition about his result, Hawking presented a heuristic picture of black hole evaporation in terms of pair creation just outside a black hole's event horizon. The usual description of this process has one member of the pair carrying negative energy as it falls past the event horizon into the black hole. The other member carries sufficient energy to allow it to escape to infinity, appearing as Hawking radiation. Overall, energy is conserved, and the black hole loses mass by absorbing negative energy. This heuristic mechanism actually strengthens the "classical causal" structure of the black hole's event horizon as a perfect semi-permeable (one-way) membrane. The paradox seems unassailable.

Scratching the surface of the paradox

This description of Hawking radiation as pair creation is seemingly ubiquitous (virtually any web page providing an explanation of Hawking radiation will invoke pair creation).

Nonetheless, there are good reasons to believe this heuristic description may be wrong [3]. Put simply, every created pair will be quantum mechanically entangled. If the members of each pair are distributed to either side of the event horizon, the so-called rank of entanglement across the horizon will increase with each and every quantum of Hawking radiation produced. Thus, one would conclude that even as the black hole's mass decreases through Hawking radiation, its internal (Hilbert space) dimensionality actually increases.
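
To make the counting explicit (an illustrative idealization; the argument does not require the pairs to be exactly maximally entangled): if each created pair is modelled as a maximally entangled two-level pair, then after N emitted quanta the joint state across the horizon is

    |\Psi_N\rangle = \bigotimes_{i=1}^{N} \tfrac{1}{\sqrt{2}} \left( |0\rangle_{\mathrm{in},i} |0\rangle_{\mathrm{out},i} + |1\rangle_{\mathrm{in},i} |1\rangle_{\mathrm{out},i} \right),

whose Schmidt rank across the horizon is 2^N. The interior must therefore support an ever larger entangled Hilbert space even as the black hole's mass shrinks.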

For black holes to be able to eventually vanish, the original Hawking picture of a perfectly semi-permeable membrane must fail at the quantum level. In other words, this "entanglement overload" implies a breakdown of the classical causal structure of a black hole. Whereas previously entanglement overload had been viewed as an absolute barrier to resolving the paradox [3], we argue [2,4] that the above statements already point to the likely solution.

Evaporation as tunneling

The most straightforward way to evade entanglement overload is for the Hilbert space within the black hole to "leak away". Quantum mechanically we would call such a mechanism tunneling. Indeed, for over a decade now, such tunneling, out and across the event horizon, has proved a useful way of computing black hole evaporation rates [5].

Spacetime free conjecture

In our paper [2] we suggest that evaporation across event horizons operates by Hilbert space subsystems from the black hole interior moving to the exterior. This may be thought of as some unitary process which samples the interior Hilbert space, picks out some subsystem, and ejects it as Hawking radiation. Our manuscript primarily investigates the consequences of this conjecture applied specifically to the event horizons of black holes.

At this point a perceptive reader might ask how and to what extent our paper sheds light on the physics of black hole evaporation. First, the consensus appears to be that the physics of event horizons (cosmological, black hole, or those due to acceleration) is universal. In fact, it is precisely because of this generality that one should not expect this Hilbert space description of evaporation at event horizons to bear the signatures of the detailed physics of black holes. In fact, as explained in the next section we go on to impose the details of that physics onto this evaporative process. Second, sampling the Hilbert space at or near the event horizon may or may not represent fair sampling from the entire black hole interior. This issue is also discussed below (and in more detail in the paper [2]).

Imposing black hole physics

We rely on a few key pieces of physics about black holes: the no-hair theorem and the existence of Penrose processes. We are interested in a quantum mechanical representation of a black hole. At first sight this may seem preposterous in the absence of a theory of quantum gravity. Here, we propose a new approach that steers clear of gravitational considerations. In particular, we derive a quantum mechanical description of a black hole by ascribing various properties to it based on the properties of classical black holes. (This presumes that any quantum mechanical representation of a black hole has a direct correspondence to its classical counterpart.) In particular, like classical black holes our quantum black hole should be described by the classical no-hair properties of mass, charge and angular momentum. Furthermore, these quantum mechanical black holes should transform amongst each other just as their classical counterparts do when absorbing or scattering particles, i.e., when they undergo so-called Penrose processes. By imposing conditions consistent with these classical properties of a black hole we obtain a Hilbert space description of quantum tunneling across the event horizons of completely generic black holes. Crucially, this description of black hole evaporation does not involve the detailed curved spacetime geometry of a black hole. In fact, it does not require spacetime at all. Finally, in order to proceed to the next step of computing the actual dynamics of evaporation, we need to invoke one more property of a black hole: that of its enormous dimensionality.

Tunneling probabilities

The Hilbert space dimensionalities needed to describe a black hole are vast (at least 10^(10^77) for a stellar-mass black hole). For such dimensionalities, random matrix theory tells us that the statistical behavior of tunneling (as a sampling of Hilbert space subsystems) is excellently approximated by treating tunneling as a completely random process. This immediately imposes a number of symmetries onto our description of black hole evaporation. We can now completely determine the tunneling probabilities as a function of the classical no-hair quantities [2]. These tunneling probabilities are nothing but the black hole evaporation rates. In fact, these are precisely the quantities that are computed using standard field theoretic methods (that all rely on the curved black hole geometry). Thus, the calculation of tunneling probabilities provides a way of validating our approach and making our results predictive.
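
The scale of that number can be checked with the standard Bekenstein-Hawking entropy (an order-of-magnitude sketch using textbook constants, not a calculation from [2]):

    # Bekenstein-Hawking entropy of a solar-mass black hole and the implied Hilbert-space size.
    import math

    hbar, G, c, k_B = 1.0545718e-34, 6.674e-11, 2.998e8, 1.380649e-23   # SI units
    M_sun = 1.989e30                                                     # kg

    A = 16 * math.pi * (G * M_sun / c**2) ** 2        # Schwarzschild horizon area, m^2
    S_over_kB = A * c**3 / (4 * G * hbar)             # entropy in units of k_B
    print(math.log10(S_over_kB))                      # ~77, so S/k_B ~ 10^77
    print(math.log10(S_over_kB / math.log(10)))       # dimension ~ exp(S/k_B), i.e. roughly 10^(10^77)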

The proof of the pudding: validation and predictions

Our results reproduce Hawking's thermal spectrum (in the appropriate limit), and reproduce his relation between the temperature of black hole radiation and the black hole's thermodynamic entropy.
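
For reference, the standard semi-classical expressions involved here are the Hawking temperature and the Bekenstein-Hawking entropy of a Schwarzschild black hole of mass M (textbook results, quoted only to fix notation, not new output of [2]):

    T_H = \frac{\hbar c^3}{8\pi G M k_B}, \qquad S_{BH} = \frac{k_B c^3 A}{4 G \hbar} = \frac{4\pi G k_B M^2}{\hbar c},

where A is the horizon area. They satisfy the first-law relation dE = T_H dS_{BH} with E = M c^2, which is the temperature-entropy relation referred to above.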

When Hawking's semi-classical analysis was extended by field theorists to include backreaction from the outgoing radiation on the geometry of the black hole, a modified non-thermal spectrum was found [5]. The incorporation of backreaction comes naturally in our quantum description of black hole evaporation (in the form of conservation laws). Indeed, our results show that black holes satisfying these conservation laws are not ideal but "real black bodies" that exhibit a non-thermal spectrum and preserve thermodynamic entropy.

These results support our conjecture for a spacetime free description of evaporation across black hole horizons.

Our analysis not only reproduces these famous results [5] but extends them to all possible black hole and evaporated-particle types in any (even extended) gravity theories. Unlike field theoretic approaches, we do not need to rely on one-dimensional WKB methods, which are limited to the analysis of evaporation along radial trajectories and produce results only to lowest orders in ℏ.

Finally, our work quite generally predicts that a direct functional relation exists between the irreducible mass associated with a Penrose process and a black hole's thermodynamic entropy. This in turn implies a breakdown of Hawking's area theorem in extended gravity theories.


And the paradox itself

The ability to focus on event horizons is key to the progress we have made in deriving a quantum mechanical description of evaporation. By contrast, the physics deep inside the black hole is more elusive. If unitarity holds globally, then our spacetime-free conjecture can be used to describe the entire time course of the evaporation of a black hole and to learn how the information is retrieved (see e.g. [6]). Specifically, in a unitarily evaporating black hole there should exist some thermalization process such that, after what has been dubbed the black hole's global thermalization (or scrambling) time, information that was encoded deep within the black hole can reach or approach its surface, where it may be selected for evaporation as radiation. Alternatively, if the interior of the black hole is not unitary, some or all of this deeply encoded information may never reappear within the Hawking radiation. Unfortunately, any analysis relying primarily on physics at or across the horizon cannot shed light on the question of unitarity (which lies at the heart of the black hole information paradox).

The bigger picture

At this stage we might take a step back and ask the obvious question: does quantum information theory really bear any connection with the subtle physics associated with black holes and their spacetime geometry? After all, we do not yet have a proper theory of quantum gravity. However, whatever form such a theory may take, it should still be possible to argue, either from the Hamiltonian constraint of describing an initially compact object with finite mass, or by appealing to holographic bounds, that the dynamics of a black hole must be effectively limited to a finite-dimensional Hilbert space. Moreover, one can identify the most likely microscopic mechanism of black hole evaporation as tunneling. Formally, these imply that evaporation should look very much like our sampling of Hilbert space subsystems from the black hole interior for ejection as radiation [2,4,6]. Although finite, the dimensionalities of the Hilbert space are immense, and from standard results in random unitary matrix theory, together with global conservation laws, we obtain a number of invariances. These invariances completely determine the tunneling probabilities without needing to know the detailed dynamics (i.e., the underlying Hamiltonian). This result establishes the Hilbert space description of black hole evaporation as a powerful tool. Put even more strongly, one might interpret the analysis presented here as a quantum gravity calculation without any detailed knowledge of a theory of quantum gravity except the presumption of unitarity [2].

Hints of an emergent gravity

Verlinde recently suggested that gravity, inertia, and even spacetime itself may be emergent properties of an underlying thermodynamic theory [1]. This vision was motivated in part by Jacobson's 1995 surprise result that the Einstein equations of gravity follow from the thermodynamic properties of event horizons [7]. For Verlinde's suggestion not to collapse into some kind of circular reasoning we would expect the physics across event horizons upon which his work relies to be derivable in a spacetime free manner. It is exactly this that we have demonstrated is possible in our manuscript [2]. Our work, however, provides a subtle twist: Rather than emergence from a purely thermodynamic source, we should instead seek that source in quantum information.


In summary, this work [2,4]:
  • shows that the classical picture of black hole event horizons as perfectly semi-permeable almost certainly fails quantum mechanically
  • provides a microscopic spacetime-free mechanism for Hawking radiation
  • reproduces known results about black hole evaporation rates
  • authenticates random matrix theory for the study of black hole evaporation
  • predicts the detailed black hole spectrum beyond WKB
  • predicts that black hole area must be replaced by some other property in any generalized area theorem for extended gravities
  • provides a quantum gravity calculation based on the presumption of unitarity, and
  • provides support for suggestions that gravity, inertia and even spacetime itself could come from spacetime-free physics across event horizons

References
[1] E. Verlinde, "On the origin of gravity and the laws of Newton", JHEP 04 (2011) 029. Abstract.
[2] S.L. Braunstein and M.K. Patra, "Black Hole Evaporation Rates without Spacetime", Phys. Rev. Lett. 107, 071302 (2011). Abstract. Article (pdf).
[3] H. Nikolic, "Black holes radiate but do not evaporate", Int. J. Mod. Phys. D 14, 2257 (2005). Abstract; S.D. Mathur, "The information paradox: a pedagogical introduction", Class. Quantum Grav. 26, 224001 (2009). Abstract.
[4] Supplementary Material to [2] at http://link.aps.org/supplemental/10.1103/PhysRevLett.107.071302.
[5] M.K. Parikh and F. Wilczek, "Hawking Radiation As Tunneling", Phys. Rev. Lett. 85, 5042 (2000). Abstract.
[6] S.L. Braunstein, S. Pirandola and K. Życzkowski, "Entangled black holes as ciphers of hidden information", arXiv:0907.1190.
[7] T. Jacobson, "Thermodynamics of Spacetime: The Einstein Equation of State", Phys. Rev. Lett. 75, 1260 (1995). Abstract.
