

Sunday, December 11, 2011

“Dressing” Atoms with Laser Allows High Angular Momentum Scattering : Could Reveal Ways to Observe Majorana Fermions

Ian Spielman (photo courtesy: Joint Quantum Institute, USA)

Scientists at the Joint Quantum Institute (JQI, a collaborative enterprise of the 'National Institute of Standards and Technology' and the University of Maryland) have for the first time engineered and detected the presence of high-angular-momentum collisions between atoms at temperatures close to absolute zero. Previous experiments with ultracold atoms featured essentially head-on collisions. The JQI experiment, by contrast, is able to create more complicated collisions between atoms using only lasers that dramatically influence their interactions in specific ways.

Such light-tweaked atoms can be used as proxies to study important phenomena that would be difficult or impossible to study in other contexts. The team's most recent work, appearing in Science [1], demonstrates a new class of interactions thought to be important to the physics of superconductors that could be used for quantum computation.

Particle interactions are fundamental to physics, determining, for example, how magnetic materials and high temperature superconductors work. Learning more about these interactions or creating new “effective” interactions will help scientists design materials with specific magnetic or superconducting properties. Because most materials are complicated systems, it is difficult to study or engineer the interactions between the constituent electrons. Researchers at JQI build physically analogous systems using supercooled atoms to learn more about how materials with these properties work.

The key to the JQI approach is to alter the atoms’ environment with laser light. They “dress” rubidium atoms by bathing them in a pair of laser beams, which force the atoms to have one of three discrete values of momentum. In the JQI experiment, rubidium atoms comprise a Bose-Einstein condensate (BEC). BECs have been collided before. But the observation of high-angular-momentum scattering at such low energies is new.

The paper in 'Science Express' [1] touches on a variety of technical issues that require some explanation:

Collisions

One of the cardinal principles of quantum science is that matter must be simultaneously thought of as both particles and waves. When the temperature of a gas of atoms is lowered, the wavelike nature of the atom emerges, and the idea of position becomes fuzzier. While an atom at room temperature might spread over a hundredth of a nm, atoms at nano-kelvin temperatures have a typical wavelength of about 100 nm. This is much larger than the range of the force between atoms, only a few nm. Atoms generally collide only when they meet face to face.
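
As a rough back-of-the-envelope check of these numbers (a sketch with illustrative round values, not figures from the paper), the thermal de Broglie wavelength lambda = h/sqrt(2*pi*m*kB*T) of a rubidium-87 atom can be evaluated at room temperature and in the micro- to nano-kelvin range:

# Thermal de Broglie wavelength of Rb-87 at several temperatures.
# lambda_dB = h / sqrt(2*pi*m*kB*T); constants and temperatures are illustrative.
import math

h  = 6.626e-34        # Planck constant, J s
kB = 1.381e-23        # Boltzmann constant, J/K
m  = 87 * 1.661e-27   # mass of Rb-87, kg

for T in (300.0, 1e-6, 100e-9):   # room temperature, 1 microkelvin, 100 nanokelvin
    lam = h / math.sqrt(2 * math.pi * m * kB * T)
    print("T = %.3g K  ->  lambda = %.3g nm" % (T, lam * 1e9))
# Output: ~0.01 nm at 300 K (a hundredth of a nm), hundreds of nm in the uK-nK range.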

However, to study certain interesting quantum phenomena, such as searching for Majorana particles---hypothetical particles that might provide a robust means of encoding quantum information---it is desirable to engineer inter-atomic collisions beyond these low-energy, head-on type. That’s what the new JQI experiment does.

Partial Waves

Scattering experiments date back to the discovery of the atomic nucleus 100 years ago, when Ernest Rutherford shot alpha particles into a foil of gold. Since then other scattering experiments have revealed a wealth of detail about atoms and sub-atomic matter such as the quark substructure of protons.

A convenient way of picturing an interaction between two particles is to view their relative approach in terms of angular momentum. Quantized angular momentum usually refers to the motion of an electron inside an atom, but it necessarily pertains also to the scattering of the two particles, which can be thought of as parts of a single quantum object.

If the value of the relative angular momentum is zero, then the scattering is designated as “s-wave” scattering. If the pair of colliding particles has one unit of angular momentum, the scattering is called p-wave scattering. Still higher-order scattering scenarios are referred to by further letters: d-wave, f-wave, g-wave, and so on. This model is referred to as the partial-waves view.

In high-energy scattering, the kind performed at accelerators, these higher angular-momentum scattering scenarios are important and help to reveal structural information about the particles. In atomic scattering at low temperatures, the s-wave interactions completely swamp the higher-order scattering modes. For ultralow-temperature s-wave scattering, when two atoms collide, they glance off each other (back to back) at any and all angles equally. This isotropic scattering doesn’t reveal much about the nature of the matter undergoing collision; it’s as if the colliding particles were hard spheres.
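
To make the connection between partial waves and the angular pattern concrete, here is a minimal Python sketch (with made-up amplitudes, purely for illustration): the scattered density is proportional to |sum_l c_l P_l(cos theta)|^2, so a pure s-wave term is flat in angle, while adding d-wave (l = 2) and g-wave (l = 4) terms produces an anisotropic, lobed halo of the kind described below.

# Angular distribution |f(theta)|^2 for different partial-wave mixtures.
# A pure s-wave amplitude gives an isotropic pattern; adding d- and g-wave
# amplitudes (the coefficients here are invented) makes it anisotropic.
import numpy as np
from numpy.polynomial.legendre import legval

theta = np.linspace(0.0, np.pi, 181)
x = np.cos(theta)

def density(coeffs):
    """|sum_l c_l P_l(cos theta)|^2 for a list of partial-wave amplitudes c_l."""
    return np.abs(legval(x, coeffs))**2

s_only = density([1.0])                      # l = 0 only: flat in angle
s_d_g  = density([1.0, 0.0, 0.6, 0.0, 0.3])  # s + d + g mixture (illustrative)

print("s-wave only : max/min density =", s_only.max() / s_only.min())  # 1.0
print("s + d + g   : max/min density =", s_d_g.max() / s_d_g.min())    # > 1, lobed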

This has now changed. The JQI experiment is the first to create conditions under which d-wave and g-wave scattering modes can be seen in otherwise long-lived ultracold systems.

Quantum Collider

Ian Spielman and his colleagues at the National Institute of Standards and Technology (NIST) chill Rb atoms to nano-kelvin temperatures. The atoms, around half a million of them, have a density about a millionth that of air at room temperature. Radiofrequency radiation places each atom into a superposition of quantum spin states. Then two lasers at optical frequencies impart momentum (forward-going and backward-going motion) to the atoms.

Schematic drawing of collision between two BECs (the gray blobs) that have been “dressed” by laser light (brown arrows) and an additional magnetic field (green arrow). The fuzzy halo shows where atoms have been scattered. The non-uniform projection of the scattering halo on the graph beneath shows that some of the scattering has been d-wave and g-wave [image courtesy: JQI]

If this were a particle physics experiment, we would say that these BECs-in-motion were quantum beams, beams with energies that come in multiples of the energy kick delivered by the lasers. The NIST “collider” in Gaithersburg, Maryland, is very different from the CERN collider in Geneva, Switzerland. In the NIST atom trap the particles have kinetic energies of a hundred pico-electron-volts rather than the trillion-electron-volt energies used at the Large Hadron Collider.
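
For a sense of these energy scales, the following back-of-the-envelope snippet (typical numbers, not the experiment's exact parameters) estimates the kinetic energy a rubidium-87 atom acquires from a two-photon recoil at a near-infrared wavelength; it comes out at several tens of pico-electron-volts, the scale quoted above.

# Two-photon recoil energy of a Rb-87 atom, E = (2*h/lambda)^2 / (2*m),
# compared with the LHC beam energy.  Numbers are illustrative.
h   = 6.626e-34         # Planck constant, J s
m   = 87 * 1.661e-27    # mass of Rb-87, kg
lam = 790e-9            # laser wavelength, m (typical for Rb transitions)

p = 2 * h / lam                   # two-photon recoil momentum, kg m/s
E = p**2 / (2 * m)                # kinetic energy, J
print("recoil energy ~ %.1e eV" % (E / 1.602e-19))   # ~6e-11 eV, i.e. tens of peV
print("LHC beam energy ~ 7e12 eV")                    # ~23 orders of magnitude larger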

At JQI, atoms are prepared in their special momentum states, and the collisions begin. Outward-scattered atoms are detected after the BEC clouds are released from the trap. If the atoms hadn’t been dressed, the collisions would have been s-wave in nature and the scattered atoms would have been observed uniformly around the scattering zone.

The effect of the dressing is to screen the atoms from s-wave scattering, in a way analogous to what happens in some solid materials, where the interaction between two electrons is modified by the presence of trillions of other electrons nearby. In other words, the laser dressing effectively increased the range of the inter-atom force such that higher-partial-wave scattering was possible, even at the lowest energies.

In the JQI experiment, the observed scattering patterns for atoms emerging from the collisions were proof that d-wave and g-wave scattering had taken place. “The way in which the density of scattered atoms is distributed on the shell reflects the partial waves,” said Ian Spielman. “A plot of scattered-density vs. spherical polar angles would give the sort of patterns you are used to seeing for atomic orbitals. In our case, this is a sum of s-, p-, and d- waves.”

Simulating Solids Using Gases

Ultracold atomic physics experiments performed with vapors of atoms are excellent for investigating some of the strongly-interacting quantum phenomena usually considered in the context of condensed matter physics. These subjects include superconductivity, superfluids, the quantum Hall effect, and topological insulators, and some things that haven’t yet been observed, such as the “Majorana” fermions.

Several advantages come with studying these phenomena in the controlled environment of ultracold atoms. Scientists can easily manipulate the landscape in which the atoms reside using knobs that adjust laser power and frequency. For example, impurities that plague real solids can be controlled and even removed, and, as this new JQI experiment shows, the scattering of atoms can now (with the proper “dressing”) reveal higher-partial-wave effects. This is important because the exotic quantum effects mentioned above often manifest themselves under exactly these higher angular-momentum conditions.

“Our technique is a fundamentally new method for engineering interactions, and we expect this work will stimulate new directions of research and be of broad interest within the physics community, experimental and theoretical,” said Spielman. “We are modifying the very character of the interactions, and not just the strength, by light alone.”

On To Fermions

The JQI team, including Nobel Laureate William Phillips, is truly international, with scientists originating in the United Kingdom (lead author Ross Williams), Canada (Lindsay LeBlanc), Mexico (Karina Jiménez-García), and the US (Matthew Beeler, Abigail Perry, William Phillips and Ian Spielman).

The researchers now will switch from observing bosonic atoms (with a total spin value of 1) to fermion atoms (those with a half-integral spin). Combining the boson techniques demonstrated here with ultracold fermions offers considerable promise for creating systems which are predicted to support the mysterious Majorana fermions. “A lot of people are looking for the Majorana fermion,” says lead author and JQI postdoctoral fellow Ross Williams. “It would be great if our approach helped us to be the first.”

Reference
[1] R. A. Williams, L. J. LeBlanc, K. Jiménez-García, M. C. Beeler, A. R. Perry, W. D. Phillips, I. B. Spielman, "Synthetic partial waves in ultracold atomic collisions", Science Express (December 7, 2011). DOI: 10.1126/science.1212652. Abstract.



Sunday, November 13, 2011

A New Scheme for Photonic Quantum Computing

[From Left to Right] Nathan K. Langford, Sven Ramelow and Robert Prevedel


Authors: Nathan K. Langford, Sven Ramelow and Robert Prevedel

Affiliation: Institute for Quantum Optics and Quantum Information (IQOQI), Austria;
Vienna Center for Quantum Science and Technology, Faculty of Physics, University of Vienna, Austria

Quantum computing is a fascinating and exciting example of how future technologies might exploit the laws of quantum physics [1]. Unlike a normal computer (“classical” computer), which stores information in 0s and 1s (called “bits”), a quantum computer stores information in quantum bits (“qubits”), states of quantum systems like atoms or photons. In principle, a quantum computer can solve the exact same problems as classical computers, so why do we think they could be so fantastic? It all comes down to speed – that is, in the context of computing, how many elementary computational steps are required to find an answer.

Past 2Physics articles by Robert Prevedel:
October 23, 2011: "Heisenberg’s Uncertainty Principle Revisited"
by Robert Prevedel
June 08, 2007: "Entanglement and One-Way Quantum Computing"
by Robert Prevedel and Anton Zeilinger


For many different types of problems, classical computers are already fast – meaning that reasonable problems can be solved in a reasonable time and the time required for a “larger” problem increases only slowly with the size of the problem (known as “scaling”). For example, once you know how to add 17 and 34, it’s not that much more difficult to add 1476 and 4238. For such problems, quantum computers can’t really do any better. Some types of problems, however, can be solved much faster on a quantum computer than on a classical computer. In fact, quantum computers can actually perform some tasks that are utterly impossible for any conceivable classical computer. The most famous example is Shor’s algorithm for finding the prime factors of a large integer [2], a problem which lies at the heart of many important computing tasks. It’s straightforward to work out that the factors of 21 are 3 and 7, but it’s already much harder to work out that the factors of 4897 are 59 and 83, and modern data protection (RSA encryption) relies on this problem becoming effectively impossible on a classical computer for really big numbers (say, 50 or 100 decimal digits long). But that would not be true for a quantum computer. It turns out that quantum computers could achieve an enormous speed-up, because of the unique quantum features of superposition and entanglement.
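
As a toy illustration of why factoring becomes hard for classical machines (a textbook sketch, not part of the work described here), the Python function below factors by trial division; the work grows with the square root of N, i.e. exponentially in the number of digits, whereas Shor's quantum algorithm scales only polynomially.

# Trial-division factoring: checks divisors up to sqrt(n), so the running time
# grows exponentially with the number of digits of n.
import math

def smallest_factor(n):
    """Return the smallest nontrivial factor of n, or n itself if n is prime."""
    for d in range(2, math.isqrt(n) + 1):
        if n % d == 0:
            return d
    return n

print(smallest_factor(21))    # -> 3   (21 = 3 x 7)
print(smallest_factor(4897))  # -> 59  (4897 = 59 x 83)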

Shor’s algorithm is a great example of the revolutionary potential for technologies built on quantum physics. The problem with such technologies, however, is that quantum systems are incredibly hard to control reliably. Classical computers are an astonishingly advanced technology: classical information can be stored almost indefinitely and the elementary computational gates which manipulate the information work every time. “It just works.” [3] By contrast, quantum information is incredibly fragile – you can destroy it literally by looking at it the wrong way! This places extremely stringent demands on what is required to control it and make it useable. In 1998, David DiVincenzo outlined a set of minimum sufficient criteria required to build a scaleable quantum computer [4] and since then experimentalists from all corners of physics have been working to fulfil them.

One of the most promising architectures for quantum information processing (QIP) and in particular quantum computing is to encode information in single photons. Because they generally interact very weakly with their environment, provided they are not absorbed accidentally, they can be used to store and transmit information without it being messed up. But this strength also creates its own problems, which arise when you want to create, manipulate or measure this information. Because a single photon doesn't interact much with atoms or other photons, it is very hard to do these things efficiently. And efficiency is the key to the whole idea of quantum computing, because the enormous quantum speed-up can only be achieved if the basic building blocks work efficiently. This is the biggest challenge for photonic QIP: current schemes for preparing single photons are inefficient and linear-optics gates are inherently probabilistic [5]. For example, pioneering work by Knill, Laflamme and Milburn showed how to overcome these problems in principle [6], but at an enormous cost in physical resources (gates, photons, etc.) which makes their approach almost completely infeasible in practice. The main goal of our approach is to make photons talk to each other efficiently.

In a recent paper [7], we introduce a new approach to photonic QIP – coherent photon conversion (CPC) – which is based on an enhanced form of nonlinear four-wave mixing and fulfils all of the DiVincenzo criteria. In photonic QIP experiments, nonlinear materials are commonly used to provide probabilistic sources of single-photon states. By shining in a strong laser beam, the nonlinear interaction will very occasionally cause a laser photon (usually around one in a billion) to split into two photons, making a very inefficient source of heralded photons. Instead, we looked at what would happen if we used a single photon instead of a strong laser beam. Surprisingly, we found that, if you can make the interaction strong enough, it should be possible to make the efficiency of photon splitting rise to 100% – something that is impossible with a laser input. In fact, we found that the same type of interaction can be used to provide a whole range of “deterministic” tools (tools that work with 100% efficiency), including entangling multiphoton gates, heralded multiphoton sources and efficient detectors – the basic building blocks required for scaleable quantum computing. Some of these are shown and briefly described in Fig. 1.
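
A toy way to see the key point, under the simplifying assumption that the conversion can be treated as a coherent two-level exchange between "one pump photon" and "one photon pair": the conversion probability then oscillates as sin^2(g*t) and reaches 100% at t = pi/(2g), in contrast to the fixed, tiny per-photon probability of an ordinary down-conversion source. The coupling strength below is an arbitrary illustrative value.

# Toy model of deterministic photon conversion: a single pump photon undergoing
# coherent (Rabi-like) oscillation into a photon pair, P(t) = sin^2(g*t).
import numpy as np

g = 1.0                           # effective nonlinear coupling (arbitrary units)
t = np.linspace(0.0, np.pi, 200)  # interaction time, in units of 1/g
P_pair = np.sin(g * t)**2         # probability that the photon has split

print("maximum conversion probability:", P_pair.max())        # -> 1.0 (100%)
print("optimal interaction time t = pi/(2g) =", np.pi / (2*g))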

Figure 1: Fulfilling the DiVincenzo criteria with CPC. (a) A deterministic photon-photon interaction (controlled-phase gate) based on a novel geometric phase effect. (b) A scaleable element for deterministic photon doubling, which can be used in a photon-doubling cascade (c) to create arbitrarily large multiphoton states.

Perhaps the most remarkable thing about CPC is that it should be possible to build a single all-optical device, with four I/O ports, which can provide all of these building blocks just by varying what type of light is sent into each port. And because of the underlying four-wave mixing nonlinearity, it could be compatible with current telecommunications technology and perhaps even be built entirely on a single photonic chip. This could make it much easier to build more complex networks.

Figure 2: Nonlinear photonic crystal fibre pumped by strong laser pulses (7.5 ps at 532 nm) to create the nonlinearity required for CPC.

To demonstrate the feasibility of our proposed approach, we performed a first series of experiments using off-the-shelf photonic crystal fibres (see Fig. 2) to demonstrate the nonlinear process underlying the CPC scheme. The next step required is to identify what can be done to optimise the nonlinear coupling, both by improving the materials and by engineering a better design. While deterministic operation has yet to be achieved, our results show that this should be feasible with sophisticated current technology, such as with chalcogenide glasses, which are highly nonlinear and can be used to make both optical fibres and chip-based integrated waveguides [8].

Finally, we hope that CPC will be a useful technique for implementing coherent, deterministic multiphoton operations both for applications in quantum-enhanced technologies and for fundamental tests involving entanglement and large-scale quantum systems. Interestingly, the general idea of “coherent photon conversion” can also be implemented in physical systems other than photons, such as in optomechanical, electromechanical and superconducting systems where the intrinsic nonlinearities available are even stronger.

References
[1] R.P. Feynman, "Simulating Physics with Computers". International Journal of Theoretical Physics, 21, 467–488 (1982). Article(PDF).
[2] P. W. Shor. "Algorithms for quantum computation: Discrete logarithms and factoring" (In Proceedings of the 35th Annual Symposium on Foundations of Computer Science, page 124, Los Alamitos, 1994. IEEE Computer Society Press). Abstract.
[3] Steve Jobs (2011). YouTube Video.
[4] D.P. DiVincenzo, D. Loss, "Quantum information is physical". Superlattices and Microstructures, 23, 419–432 (1998). Abstract. arXiv:cond-mat/9710259.
[5] P. Kok, W. J. Munro, K. Nemoto, T.C. Ralph, J.P. Dowling, G.J. Milburn, "Linear optical quantum computing with photonic qubits". Review of Modern Physics, 79, 135–174 (2007). Abstract.
[6] E. Knill, R. Laflamme, G.J. Milburn, "A scheme for efficient quantum computation with linear optics". Nature, 409, 46–52 (2001). Abstract.
[7] N. K. Langford, S. Ramelow, R. Prevedel, W. J. Munro, G. J. Milburn & A. Zeilinger. "Efficient quantum computing using coherent photon conversion". Nature 478, 360-363 (2011). Abstract.
[8] B.J. Eggleton, B. Luther-Davies, K. Richardson, "Chalcogenide photonics". Nature Photonics, 5, 141–148 (2011). Abstract.



Sunday, October 30, 2011

Gas Phase Optical Quantum Memory

[From left to right] Ben Sparkes, Mahdi Hosseini, Ping Koy Lam and Ben Buchler


Authors: Ben Buchler, Mahdi Hosseini, Ben Sparkes, Geoff Campbell and Ping Koy Lam

Affiliation: ARC Centre for Quantum Computation and Communication Technology, Department of Quantum Science, The Australian National University, Canberra, Australia

In the early days of quantum mechanics, the Heisenberg uncertainty principle was seen as something of a problem. It limits the ways in which it is possible to measure the state of things and, as such, imposes an in-principle limit on how well we can manipulate and harness measurements for technological applications. More recently, however, there has been an outbreak of proposals suggesting that Heisenberg uncertainty and other quantum mechanical principles can be harnessed for advanced applications in the information sciences. Of these, quantum key distribution (QKD) is the most advanced. This technique allows the sharing of a secret key between remote parties over an open communication channel. The crucial point is that if someone tries to eavesdrop on the transmission of this key, the communication channel is disrupted due to the uncertainty principle. Only a clean communication line will allow the sharing of a key, and in this way any key that is generated is guaranteed to be perfectly secure. QKD has been demonstrated in optical fibres over distances of more than 60 km [1].

Unfortunately, beyond about 100 km the losses in optical fibres, or indeed any transmission medium, mean that it becomes very slow or even impossible to share a key.
One possible method to fix this problem is to build a quantum repeater [2]. These devices, which are yet to be demonstrated, will extend the range of quantum communication beyond the current limit. Integral to proposed repeaters is some kind of memory capable of storing, and recalling on demand, quantum states of light [3]. To build an ideal optical quantum memory, you need to capture a state of light with 100% efficiency without actually measuring it, since the quantum back-action from a measurement would disrupt the state. Then you have to recall it without adding noise or losing any of the light.

Our approach to building a quantum memory relies on reversible absorption in an ensemble of atoms. This is a photon-echo technique known as a “gradient echo memory” (GEM). In this scheme we organise our ensemble such that the absorption frequency of atoms varies linearly along the length of our memory cell. This is done using an applied field, such as a magnetic gradient, to shift the resonant frequency of the atoms (see Fig. 1).

Figure 1: The GEM scheme. a: A pulse, in this example a modulated pulse, is incident on the memory cell, which has a gradient in the atomic absorption frequencies. b: The pulse is stored in the cell and, due to the gradient, the atomic coherence has a spatial profile that is the Fourier transform of the pulse shape. c: After flipping the gradient, the pulse is recalled.


The bandwidth of the applied broadening can be matched to the bandwidth of the incoming light pulse. After the light is absorbed into the ensemble, the atoms dephase at a rate proportional to the spread in absorption frequencies. All that is required to recall the light pulse is to reverse the gradient. This reverses the relative frequency detunings meaning that the atomic dephasing is time-reversed and the ensemble will rephase. When this happens, the light is recalled in the forward direction.
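
The dephasing-and-rephasing logic can be caricatured in a few lines of Python (arbitrary units, not a model of the actual atoms): each atom gets a detuning proportional to its position, the phases fan out after absorption, and flipping the sign of the gradient at t = T makes the collective coherence revive at t = 2T.

# GEM caricature: linear detuning gradient, dephasing, gradient flip, echo.
import numpy as np

N      = 2000
delta  = np.linspace(-1.0, 1.0, N)    # detunings, proportional to position z
T_flip = 5.0                          # time at which the gradient is reversed
t      = np.linspace(0.0, 2.5 * T_flip, 501)

# accumulated phase of each atom: +delta*t before the flip, unwinding after it
phase = np.where(t[:, None] < T_flip,
                 delta * t[:, None],
                 delta * (2 * T_flip - t[:, None]))
coherence = np.abs(np.exp(1j * phase).mean(axis=1))   # |<exp(i*phi)>| over atoms

i_echo = np.argmin(np.abs(t - 2 * T_flip))
print("coherence just after absorption:", coherence[0])       # ~1
print("coherence at the echo time 2T  :", coherence[i_echo])  # ~1 (rephased)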

This protocol works well in 2-level atoms, as described. Experiments with cryostatic rare-earth doped solid-state crystals have shown recall efficiencies up to 69% without added noise [4]. This experiment was the first to beat the crucial 50% limit. Above this percentage, you can be sure that a hypothetical, all-powerful eavesdropper who, in principle, could have collected all of the missing light, will have less than 50% of the original information. This means that any eavesdropper has less information about the stored state than you do. In the absence of added noise, the 50% barrier corresponds to the “no-cloning limit”.

Our GEM experiments work in a 3-level atomic system in a warm gas cell. This has several advantages: i) there are many suitable 3-level systems; ii) gas cells can be bought off the shelf and require a small heater rather than a large cryostat; and iii) the simplicity of the setup means that we can rapidly try out new protocols. With three levels, the scheme is exactly as illustrated in Fig. 1, except that the upper and lower levels are now two hyperfine ground states, which are coupled using a strong “control” beam as shown in Fig. 2.

Figure 2: The atomic level scheme

The control beam brings new flexibility to our scheme. By switching the control field off, we can suppress recall from the memory. If we have multiple pulses stored in the memory then we can thus choose which ones to recall at which time – i.e. we have a random access memory for pulses of light [5]. The Fourier nature of the memory also allows us to stretch and compress the bandwidth of the pulses, shift their frequency and recall different frequency components at different times [6]. The efficiency of our system is also the highest ever demonstrated for a quantum memory prototype with up to 87% recall [7]. We have also verified the “quantumness” of our memory by quantifying the added noise [8]. We found, by using conditional variance and signal transfer measurements, that our system easily beat the no-cloning limit. In terms of fidelity, for small photon numbers, we found fidelities as high as 98%.

Figure 3: One of our gas cells illuminated with 300 mW of light at 795 nm.

The current system, using sauna-temperature gas cells (around 80 degrees C), is limited by atomic diffusion. The storage times are only a few microseconds. We plan to implement our scheme on a cold atomic ensemble in the near future to improve this aspect of our system.

References
[1] D Stucki, N Gisin, O Guinnard, G Ribordy, and H Zbinden, "Quantum key distribution over 67 km with a plug & play system", New Journal of Physics, 4, 41 (2002). Article.
[2] Nicolas Gisin and Rob Thew, "Quantum communications", Nature Photonics, 1, 165–171 (2007). Abstract.
[3] Alexander I. Lvovsky, Barry C. Sanders, and Wolfgang Tittel, Optical quantum memory, Nature Photonics, 3, 706–714 (2009). Abstract.
[4] Morgan P Hedges, Jevon J Longdell, Yongmin Li, and Matthew J Sellars, "Efficient quantum memory for light", Nature, 465, 1052–1056 (2010). Abstract.
[5] Mahdi Hosseini, Ben M Sparkes, Gabriel Hétet, Jevon J Longdell, Ping Koy Lam, and Ben C Buchler, "Coherent optical pulse sequencer for quantum applications", Nature 461, 241–245 (2009). Abstract.
[6] B. C. Buchler, M. Hosseini, G. Hétet, B. M. Sparkes and P. K. Lam, "Precision spectral manipulation of optical pulses using a coherent photon echo memory", Optics Letters, 35, 1091-1093 (2010). Abstract.
[7] M Hosseini, B M Sparkes, G Campbell, P K Lam, and B C Buchler, "High efficiency coherent optical memory with warm rubidium vapour", Nature communications, 2, 174 (2011). Abstract.
[8] M. Hosseini, G. Campbell, B. M. Sparkes, P. K. Lam, and B. C. Buchler, "Unconditional room-temperature quantum memory", Nature Physics, 7, 794–798 (2011). Abstract.



Sunday, October 23, 2011

Heisenberg’s Uncertainty Principle Revisited

Robert Prevedel

Author: Robert Prevedel
Affiliation: Institute for Quantum Computing, University of Waterloo, Canada

In 1927, Heisenberg [1] showed that, according to quantum theory, one cannot know both the position and the velocity of a particle with arbitrary precision; the more precisely the position is known, the less precisely the momentum can be inferred and vice versa. In other words, the uncertainty principle sets limits on our ultimate ability to predict the outcomes of certain pairs of measurements on quantum systems. Such pairs of quantities can also be energy and time or the spins and polarizations of particles in various directions. The uncertainty principle is a central consequence of quantum theory and a pillar of modern physics. It lies at the heart of quantum theory and has profound fundamental and practical consequences, setting absolute limits on precision technologies such as metrology and lithography, but at the same time also provides the basis for new technologies such as quantum cryptography [2].

Past 2Physics article by the author:
June 08, 2007: "Entanglement and One-Way Quantum Computing"
by Robert Prevedel and Anton Zeilinger


Over the years, the uncertainty principle has been reexamined and expressed in more general terms. To link uncertainty relations to classical and quantum information theory, they have been recast with the uncertainty quantified by the entropy [3,4] rather than the standard deviation. Until recently, the favored uncertainty relation of this type was that of Maassen and Uffink [5], who showed that it is impossible to reduce the so-called Shannon entropies (a measure for the amount of information that can be extracted from a system) associated with any pair of measurable quantum quantities to zero. This implies that the more you squeeze the entropy of one variable, the more the entropy of the other increases.
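
As a concrete example of such an entropic relation, the short sketch below numerically checks the Maassen-Uffink bound H(R) + H(S) >= log2(1/c) for a single qubit measured in the Z and X bases, where the overlap constant is c = 1/2 and the bound is therefore 1 bit. (The random states and the choice of bases are for illustration only.)

# Numerical check of the Maassen-Uffink entropic uncertainty relation for a qubit:
# H(Z) + H(X) >= log2(1/c) = 1 bit, with c = max_jk |<z_j|x_k>|^2 = 1/2.
import numpy as np

def shannon(p):
    p = p[p > 1e-12]
    return -(p * np.log2(p)).sum()

rng = np.random.default_rng(0)
Z = np.eye(2)                                  # columns: Z eigenbasis
X = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # columns: X eigenbasis

smallest = np.inf
for _ in range(1000):                          # random pure states
    psi = rng.normal(size=2) + 1j * rng.normal(size=2)
    psi /= np.linalg.norm(psi)
    pZ = np.abs(Z.conj().T @ psi)**2           # outcome probabilities in Z
    pX = np.abs(X.conj().T @ psi)**2           # outcome probabilities in X
    smallest = min(smallest, shannon(pZ) + shannon(pX))

print("smallest H(Z)+H(X) found:", smallest, " (never below the bound of 1 bit)")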

This intuition stood for some decades; however, recent work by Berta et al. [6] showed that the limitations imposed by Heisenberg’s principle can actually be overcome through clever use of entanglement, the counterintuitive property of quantum particles that leads to strong correlations between them. More precisely, the paper predicts that an observer holding quantum information about the particle can have a dramatically lower uncertainty than one holding only classical information. In the extreme case, an observer who has access to a particle that is maximally entangled with the quantum system he wishes to measure is able to correctly predict the outcome of whichever measurement is chosen. This dramatically illustrates the need for a new uncertainty relation that takes into account the potential entanglement between the system and another particle. A derivation of such a new uncertainty relation appeared in the work of Berta et al. [6] (also see the past 2Physics article dated August 29, 2010). The new relation proves a lower bound on the uncertainties of the measurement outcomes when one of two measurements is performed.

To illustrate the main idea of how an observer holding quantum information can outperform one without, the paper outlines an imaginary “uncertainty game”, which we briefly describe below. In this game, two players, Alice and Bob, begin by agreeing on two measurements, R and S, one of which will be performed on a quantum particle. Bob then prepares this particle in a quantum state of his choosing. Without telling Alice which state he has prepared, he sends the particle to Alice. Alice performs one of the two measurements, R or S (chosen at random), and tells Bob which observable she has measured, though not the outcome of the measurement. The aim of the game is for Bob to correctly guess the measurement outcome. If Bob had only a classical memory (e.g. a piece of paper), he would not be able to guess correctly all of the time — this is what Heisenberg’s uncertainty relation implies. However, if Bob is able to entangle the particle he sends with a quantum memory, then for any measurement Alice makes on the particle, there is a measurement on Bob’s memory that always gives him the same outcome. His uncertainty has thus vanished and he is capable of correctly guessing the outcome of whichever measurement Alice performs.
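
The quantitative version of this game can be sketched in a few lines (a pedagogical calculation, not a simulation of the experiment): for a maximally entangled pair and the complementary Z and X bases, Bob's conditional uncertainties H(Z|B) and H(X|B) both vanish, saturating the Berta et al. bound log2(1/c) + S(A|B) = 1 + (-1) = 0.

# Uncertainty game with a quantum memory: for a Bell state, Bob's uncertainty
# about Alice's Z or X outcome, H(R|B) = S(rho_RB) - S(rho_B), is zero.
import numpy as np

def vN_entropy(rho):
    """von Neumann entropy in bits."""
    w = np.linalg.eigvalsh(rho)
    w = w[w > 1e-12]
    return -(w * np.log2(w)).sum()

phi = np.array([1, 0, 0, 1]) / np.sqrt(2)      # |00> + |11> (maximally entangled)
rho_AB = np.outer(phi, phi.conj())

Z = np.eye(2)                                  # columns: Z eigenbasis
X = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # columns: X eigenbasis

def H_given_B(basis):
    """Conditional entropy of Alice's outcome given Bob's memory qubit B."""
    blocks, rho_B = [], np.zeros((2, 2), complex)
    for j in range(2):
        r = basis[:, j].reshape(2, 1)
        P = np.kron(r @ r.conj().T, np.eye(2))                    # project Alice onto outcome j
        sub = P @ rho_AB @ P
        rho_Bj = sub.reshape(2, 2, 2, 2).trace(axis1=0, axis2=2)  # trace out Alice
        blocks.append(rho_Bj)
        rho_B += rho_Bj
    rho_RB = np.block([[blocks[0], np.zeros((2, 2))],             # outcome register
                       [np.zeros((2, 2)), blocks[1]]])            # is classical
    return vN_entropy(rho_RB) - vN_entropy(rho_B)

print("H(Z|B) =", round(H_given_B(Z), 6))   # 0.0: Bob can always guess correctly
print("H(X|B) =", round(H_given_B(X), 6))   # 0.0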

Fig. 1: The uncertainty game. Bob sends a particle, which is entangled with one that is stored in his quantum memory, to Alice (1), who measures it using one of two pre-agreed observables (2). She then communicates the measurement choice, but not its result, to Bob who tries to correctly guess Alice’s measurement outcome. See text for more details. Illustration adapted from [6].

In our present work [7], we experimentally realize Berta et al.'s uncertainty game in the laboratory and rigorously test the new and modified uncertainty relation in an optical experiment. We generate polarization-entangled photon pairs and send one of the photons to Alice who randomly performs one of two polarization measurements. In the meantime, we delay the other photon using an optical fiber – this acts as a simple quantum memory. Dependent on Alice’s measurement choice (but not the result), we perform the appropriate measurement on the photon that was stored in the fiber. In this way, we show that Bob can infer Alice's measurement result with less uncertainty if the particles were entangled. Varying the amount of entanglement between the particles allows us to fully investigate the new uncertainty relation. The results closely follow the Berta et al. relation. By using entangled photons in this way, we observe lower uncertainties than previously known uncertainty relations would predict. We show that this fact can be used to form a simple, yet powerful entanglement witness. This more straightforward witnessing method is of great value to other experimentalists who strive to generate this precious resource. As future quantum technologies emerge, they will be expected to operate on increasingly large systems. The entanglement witness we have demonstrated offers a practical way to quantitatively assess the quality of such technologies, for example the performance of quantum memories.

Fig. 2: A photo of the actual experiment. In the center a down-conversion source of entangled photons can be seen. The ultraviolet pump laser is clearly visible as it propagates through a Sagnac-type interferometer with the down-conversion crystal at the center. On the lower left, a small spool of fiber is lit up by a HeNe laser. This fiber spool is a miniaturization of the actual spool that serves as the quantum memory in our experiment. [Copyright: R. Prevedel]

A similar experiment was performed independently in the group of G.-C. Guo, and its results were published in the same issue of Nature Physics [8] (Also see the past 2Physics article dated October 11, 2011).

References:
[1] Heisenberg, W. Über den anschaulichen Inhalt der quantentheoretischen Kinematik und Mechanik. Z. Phys. 43, 172-198 (1927).
[2] Bennett, C. H. & Brassard, G. Proceedings of IEEE International Conference on Computers, Systems and Signal Processing, Bangalore, India, 175-179 (1984).
[3] Bialynicki-Birula, I. & Mycielski, J. "Uncertainty relations for information entropy in wave mechanics". Communications in Mathematical Physics, 44, 129-132 (1975). Abstract.
[4] Deutsch, D. "Uncertainty in quantum measurements". Physical Review Letters, 50, 631-633 (1983). Abstract.
[5] Maassen, H. & Uffink, J. B. "Generalized entropic uncertainty relations". Physical Review Letters, 60, 1103-1106 (1988). Abstract.
[6] Berta, M., Christandl, M., Colbeck, R., Renes, J. M. & Renner, R. "The uncertainty principle in the presence of quantum memory". Nature Physics, 6, 659-662 (2010). Abstract. 2Physics Article.
[7] Prevedel, R., Hamel, D.R., Colbeck, R., Fisher, K., & Resch, K.J. "Experimental investigation of the uncertainty principle in the presence of quantum memory and its application to witnessing entanglement", Nature Physics, 7, 757-761 (2011). Abstract.
[8] Li, C-F., Xu, J-S., Xu, X-Y., Li, K. & Guo, G-C. "Experimental investigation of the entanglement-assisted entropic uncertainty principle". Nature Physics, 7, 752-756 (2011). Abstract. 2Physics Article.



Tuesday, October 11, 2011

Entanglement-Assisted Entropic Uncertainty Principle

[From left to right] Xiao-Ye Xu, Guang-Can Guo, Chuan-Feng Li and Jin-Shi Xu

Researchers at the University of Science and Technology of China (USTC) and National University of Singapore have verified the entanglement-assisted entropic uncertainty principle and demonstrated its practical usage to witness entanglement [1].

In quantum mechanics, the ability to predict the precise outcomes of two conjugate observables, such as position and momentum, for a particle is restricted by the uncertainty principle. For example, the more precisely the location of the particle is determined, the less accurate the momentum determination will be. Originally given by Heisenberg in terms of standard deviations, the uncertainty relation was later extended to an entropic form that more precisely reflects its physical meaning.

The uncertainty principle is an essential characteristic of quantum mechanics. However, the possibility of violating Heisenberg’s uncertainty relation was considered early on. In 1935, Einstein, Podolsky, and Rosen published their famous paper in which they considered using two particles entangled in the position and momentum degrees of freedom to violate Heisenberg’s uncertainty relation and thereby challenge the completeness of quantum mechanics (the EPR paradox) [2]. Popper also proposed a practical experiment using entangled pairs to demonstrate a violation of Heisenberg’s uncertainty relation [3]. After a long debate and many experimental works, it is now known that these violations do not contradict quantum theory; they are instead exploited as a signature of entanglement, which is a fundamental feature of quantum mechanics and an important resource for quantum information processing.

Recently, a stronger entropic uncertainty relation, which uses previously determined quantum information, was proved by Berta et al. [4], whose equivalent form was previously conjectured by Renes and Boileau [5]. By initially entangling the particle of interest to another particle that acts as a quantum memory, the uncertainty associated with the outcomes of two conjugate observables can be reduced to zero. The lower bound of the uncertainty is essentially dependent on the entanglement between the particle of interest and the quantum memory. This novel entropic uncertainty relation greatly extends the uncertainty principle.


Image: The practical setup to demonstrate the uncertainty game proposed by Berta et al. (also see the past 2Physics article dated August 29, 2010).

The experimental group led by Prof. Chuan-Feng Li at USTC prepared a special kind of entangled photon state, called the Bell diagonal state, in an all-optical setup. One of the photons is sent for measurement and the other acts as an ancillary particle carrying quantum information about the one of interest. The ancillary photon is stored in a spin-echo-based quantum memory consisting of two polarization-maintaining fibers, each 120 m long, and two half-wave plates. The storage time can be as long as 1.2 μs. The lower bound of the uncertainty related to the outcomes of two conjugate observables is measured, and it can be reduced to arbitrarily small values when the two particles share quasi-maximal entanglement. As a result, the entropic form of Heisenberg’s uncertainty relation is violated and the new relation is confirmed. By measuring observables on both particles, the group further used the new entropic uncertainty relation to witness entanglement and to compare it with other entanglement measures. The entanglement witness can be obtained with a few separate measurements on each of the entangled particles, which shows its ease of accessibility.

The verified entropic uncertainty principle implies that the uncertainty principle is not only observable-dependent, but is also observer-dependent, providing a particularly intriguing perspective [6]. The method used to estimate uncertainties by directly performing measurements on both photons has practical application in verifying the security of quantum key distribution. This novel uncertainty relation would also find practical use in the area of quantum engineering.

The experimental investigation of the new entropic uncertainty principle has attracted great interest. A related experiment was performed independently by Prevedel and colleagues [7], and both papers were published in the same issue of Nature Physics.

References:
[1] C.-F. Li, J.-S. Xu, X.-Y. Xu, K. Li and G.-C. Guo, "Experimental investigation of the entanglement-assisted entropic uncertainty principle". Nature Physics, 7, 752-756 (2011). Abstract.
[2] A. Einstein, B. Podolsky and N. Rosen, "Can quantum mechanical description of physical reality be considered complete?" Phys. Rev. 47, 777-780 (1935). Abstract.
[3] K. R. Popper, "Zur Kritik der Ungenauigkeitsrelationen", Naturwiss. 22, 807-808 (1934). Article.
[4] M. Berta, M. Christandl, R. Colbeck, J. M. Renes and R. Renner, "The uncertainty principle in the presence of quantum memory". Nature Physics, 6, 659-662 (2010). Abstract. 2Physics Article.
[5] J. M. Renes and J. C. Boileau, "Conjectured strong complementary information tradeoff". Physical Review Letters, 103, 020402 (2009). Abstract.
[6] A. Winter, "Coping with uncertainty", Nature Phys. 6, 640-641 (2010). Abstract.
[7] R. Prevedel, D. R. Hamel, R. Colbeck, K. Fisher and K. J. Resch, "Experimental investigation of the uncertainty principle in the presence of quantum memory and its application to witnessing entanglement", Nature Physics, 7, 757-761 (2011). Abstract.



Sunday, September 11, 2011

The Quantum von Neumann Architecture: A Programmable Quantum Machine Based on a Quantum Random Access Memory and a Quantum Central Processing Unit

Matteo Mariantoni

Author: Matteo Mariantoni

Affiliation: Department of Physics, University of California at Santa Barbara, USA

A classical computer is based on both hardware, i.e., a suitable set of micro-sized wires typically patterned on a silicon chip, and software, i.e., a sequence of operations, or code. Such operations are realized as electrical signals ‘running’ through the wires of the hardware. The combination of hardware and software is termed a computational architecture.

In the mid-1940s, John von Neumann, J. Presper Eckert, and John Mauchly revolutionized the abstract concept of a universal Turing machine by proposing the eponymous ‘von Neumann architecture.’ Their implementation, which involves a central processing unit and a memory to hold data and instructions, provided a practical approach to constructing a classical computer, a design that is at the heart of almost every computing device made today.

In the past decade, researchers in various fields, from nuclear magnetic resonance to quantum optics, from trapped ions to semiconductor- and superconductor-based quantum circuits, have been able to create and control in the lab many of the building blocks of what, in the near future, could constitute a quantum computer. Just as a classical computer is based on bits, a quantum computer is based on quantum bits (qubits), which also have the two states 0 and 1. One of the main differences between classical bits and qubits is that a qubit can be prepared in a so-called superposition state, where both states 0 and 1 are possible at the same time. In addition, in a quantum computer pairs of qubits can be prepared in ‘mixtures’ of two-qubit states, which are called entangled states. The immense power of a quantum computer resides in the combination of superposition states and entangled states. This combination will eventually allow us to perform calculations much faster than with any classical computer and to solve problems that would otherwise be impossible by classical means.
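
As a minimal, purely pedagogical illustration (not tied to the superconducting hardware discussed below), a superposition state and an entangled Bell state can be written out as vectors of amplitudes in the computational basis:

# A single-qubit superposition and a two-qubit entangled (Bell) state as vectors.
import numpy as np

ket0 = np.array([1.0, 0.0])
ket1 = np.array([0.0, 1.0])

superposition = (ket0 + ket1) / np.sqrt(2)   # (|0> + |1>)/sqrt(2): both values at once
bell = (np.kron(ket0, ket0) + np.kron(ket1, ket1)) / np.sqrt(2)   # (|00> + |11>)/sqrt(2)

print("superposition amplitudes:", superposition)
print("Bell state amplitudes   :", bell)   # cannot be factored into two single-qubit states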

One of the critical challenges of quantum computing is to assemble together in a single machine all the hardware components needed for a quantum computer and to program these components using quantum codes, thus allowing us to implement a quantum-mechanical computational architecture. In particular, such an architecture should be scalable and immune from computational errors. This would represent a so-called scalable fault-tolerant quantum-mechanical architecture.

At the University of California at Santa Barbara (UCSB), in the group headed by John Martinis and Andrew Cleland, we use superconducting quantum circuits as qubits. These are wires, typically made of aluminum, that, once cooled below a temperature of approximately -272 degrees Celsius, become superconducting, drastically reducing dissipation effects and noise. When cooled further down to almost absolute zero temperature, our superconducting wires start showing quantum mechanical behavior. In such a state, an immense number of electrons begins moving collectively, as part of a single entity. Two different ‘positions’ of this immense number of electrons moving together can then be used to create the two states 0 and 1 of a qubit.

In the past few years, at UCSB as well as in other labs worldwide, we have shown that it is possible to prepare and control systems with a few qubits (up to three). In particular, we have demonstrated superposition states and entangled states, and we have been able to perform simple quantum logic gates using one and two qubits.

However, qubits alone are insufficient to implement a quantum-mechanical analogue of the von Neumann architecture: a quantum memory is needed. In the experiment to be published in the journal Science [4], we were able to fabricate a fairly complex quantum circuit comprising all the elements of a quantum von Neumann machine, integrated on a single chip. Our quantum machine includes a set of two qubits that can exchange quantum information through a so-called quantum bus. We can address each qubit and prepare it in a superposition state, and we can entangle the two qubits via the bus. The two qubits and the quantum bus represent the quantum central processing unit (quCPU) of our machine.

The quantum von Neumann machine (image of the real device): Two superconducting qubits (enclosed in the two central squares) are coupled through a quantum bus (center meander line). Quantum information can be stored in two quantum memories (two lateral meander lines). A zeroing register is included in the two central squares. Photo credit: Erik Lucero

Most importantly, we were able to provide each qubit with a quantum memory. A key characteristic of our quantum memories, which are also based on superconducting wires, is that they can hold quantum information for a much longer time than the corresponding qubits. In this manner, as soon as quantum information has been processed by the quCPU, it can safely be stored into the memories. The quCPU can then be used again to process more quantum information, while storing the original quantum information in the memories. The memories can store the original quantum information for a time long enough that, if that information is needed later in the computation, it can be read out and re-used by the quCPU at any desired time. We also provided our machine with the quantum-mechanical equivalent of a delete button, a so-called zeroing register, where used-up quantum information can be dumped from the quCPU, freeing it up. Our memories and zeroing register thus realize a true quantum random access memory (quRAM). We term the combination of quCPU and quRAM the quantum von Neumann machine.

We tested our quantum von Neumann machine by running a proof-of-concept quantum code that makes use of both qubits, the coupling bus, the memories, and the zeroing register in a single long sequence. In addition, we ran two key algorithms for quantum information processing: The quantum Fourier transform and a Toffoli gate. The quantum Fourier transform is probably the most important block in Shor’s algorithm for the factorization of large numbers, while Toffoli gates are at the basis of measurement-free quantum error correction. The latter is necessary to show a fault-tolerant quantum-mechanical architecture.
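
For background, the quantum Fourier transform mentioned above is, as a matrix, simply the unitary discrete Fourier transform acting on the 2^n amplitudes of an n-qubit register; the sketch below writes down the textbook definition for two qubits and checks unitarity (this is not the UCSB pulse sequence).

# The n-qubit quantum Fourier transform as a matrix: QFT_jk = omega^(j*k)/sqrt(N),
# with omega = exp(2*pi*i/N) and N = 2^n.
import numpy as np

def qft_matrix(n_qubits):
    N = 2**n_qubits
    j, k = np.meshgrid(np.arange(N), np.arange(N), indexing="ij")
    return np.exp(2j * np.pi * j * k / N) / np.sqrt(N)

U = qft_matrix(2)   # two-qubit QFT, a 4x4 unitary
print("unitary:", np.allclose(U.conj().T @ U, np.eye(4)))   # -> True
print(np.round(U, 3))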

The UCSB team of researchers was led by myself, Matteo Mariantoni, Elings Prize Fellow and postdoctoral fellow in the Department of Physics, Andrew N. Cleland, professor of physics, and John M. Martinis, professor of physics. Our UCSB quantum computing team, composed of numerous students and postdocs, contributed substantially to the infrastructure used in the experiments and to the development of the concept of the quantum von Neumann architecture.

I believe that our quantum-mechanical implementation of the von Neumann architecture will serve as a guideline in the further development of quantum computing, not only with superconducting quantum circuits, but also with trapped ions and semiconductor devices.

Dr. Matteo Mariantoni was supported in this work by an Elings Prize Fellowship in Experimental Science from UCSB’s California NanoSystems Institute. The work was performed under funding from the Army Research Office and by the Intelligence Advanced Research Projects Activity (IARPA). Devices were made at the UCSB Nanofabrication Facility, a part of the NSF-funded National Nanotechnology Infrastructure Network.

References:
[1] M. Mariantoni, H. Wang, R. C. Bialczak, M. Lenander, E. Lucero, M. Neeley, A. D. O’Connell, D. Sank, M.Weides, J.Wenner, T. Yamamoto, Y. Yin, J. Zhao, J. M. Martinis & A. N. Cleland, "Photon shell game in three-resonator circuit quantum electrodynamics". Nature Physics, 7, 287-293 (2011). Abstract.
[2] M. Neeley, R. C. Bialczak, M. Lenander, E. Lucero, M. Mariantoni, A. D. O’Connell, D. Sank, H.Wang, M.Waides, J.Wenner, Y. Yin, T. Yamamoto, A. N. Cleland & J. M. Martinis, "Generation of three-qubit entangled states using superconducting phase qubits", Nature, 467, 570–573 (2010). Abstract.
[3] L. DiCarlo, M. D. Reed, L. Sun, B. R. Johnson, J. M. Chow, J. M. Gambetta, L. Frunzio, S. M. Girvin, M. H. Devoret & R. J. Schoelkopf, "Preparation and measurement of three-qubit entanglement in a superconducting circuit", Nature, 467, 574-578 (2010). Abstract.

[4] Matteo Mariantoni, H. Wang, T. Yamamoto, M. Neeley, Radoslaw C. Bialczak, Y. Chen, M. Lenander, Erik Lucero, A. D. O’Connell, D. Sank, M. Weides, J. Wenner, Y. Yin, J. Zhao, A. N. Korotkov, A. N. Cleland, John M. Martinis, "Implementing the Quantum von Neumann Architecture with Superconducting Circuits", Science, DOI: 10.1126/science.1208517 (published online September 1, 2011). Abstract.



Sunday, September 04, 2011

Black Hole Evaporation Rates without Spacetime

Samuel L. Braunstein

Author: Samuel L. Braunstein

Affiliation: Professor of Quantum Computation, University of York, UK


Why black holes are so important to physics

In the black hole information paradox, Hawking pointed out an apparent contradiction between quantum mechanics and general relativity so fundamental that some thought any resolution may lead to new physics. For example, it has been recently suggested that gravity, inertia and even spacetime itself may be emergent properties of a theory relying on the thermodynamic properties across black hole event horizons [1]. All these paradoxes and prospects for new physics ultimately rely on thought experiments to piece together more detailed calculations, each of which themselves only give a part of the full picture. Our work "Black hole evaporation rates without spacetime" adds another calculation [2] which may help focus future work.

The paradox, a simple view

In its simplest form, we may state the paradox as follows: In classical general relativity, the event horizon of a black hole represents a point of no return - a perfect semi-permeable membrane. Anything can pass the event horizon without even noticing it, yet nothing can escape, not even light. Hawking partly changed this view by using quantum theory to prove that black holes radiate their mass as ideal thermal radiation. Therefore, if matter collapsed to form a black hole which itself then radiated away entirely as formless radiation, then the original information content of the collapsing matter would have vanished. Now, information preservation is fundamental to unitary evolution, so its failure in black hole evaporation would signal a manifest failure of quantum theory itself. This "paradox" encapsulates a profound clash between quantum mechanics and general relativity.

To help provide intuition about his result, Hawking presented a heuristic picture of black hole evaporation in terms of pair creation outside a black hole's event horizon. The usual description of this process involves one of the pair carrying negative energy as it falls into the black hole past its event horizon. The second of the pair carries sufficient energy to allow it to escape to infinity, appearing as Hawking radiation. Overall there is energy conservation, and the black hole loses mass by absorbing negative energy. This heuristic mechanism actually strengthens the "classical causal" structure of the black hole's event horizon as a perfect semi-permeable (one-way) membrane. The paradox seems unassailable.

Scratching the surface of the paradox

This description of Hawking radiation as pair creation is seemingly ubiquitous (virtually any web page providing an explanation of Hawking radiation will invoke pair creation).

Nonetheless, there are good reasons to believe this heuristic description may be wrong [3]. Put simply, every created pair will be quantum mechanically entangled. If the members of each pair are then distributed to either side of the event horizon, the so-called rank of entanglement across the horizon will increase for each and every quantum of Hawking radiation produced. Thus, one would conclude that just as the black hole's mass was decreasing through Hawking radiation, its internal (Hilbert space) dimensionality would actually be increasing.

For black holes to be able to eventually vanish, the original Hawking picture of a perfectly semi-permeable membrane must fail at the quantum level. In other words, this "entanglement overload" implies a breakdown of the classical causal structure of a black hole. Whereas previously entanglement overload had been viewed as an absolute barrier to resolving the paradox [3], we argue [2,4] that the above statements already point to the likely solution.

Evaporation as tunneling

The most straightforward way to evade entanglement overload is for the Hilbert space within the black hole to "leak away". Quantum mechanically we would call such a mechanism tunneling. Indeed, for over a decade now, such tunneling, out and across the event horizon, has proved a useful way of computing black hole evaporation rates [5].

Spacetime free conjecture

In our paper [2] we suggest that the evaporation across event horizons operates by Hilbert space subsystems from the black hole interior moving to the exterior. This may be thought of as some unitary process which samples the interior Hilbert space; picks out some subsystem and ejects it as Hawking radiation. Our manuscript primarily investigates the consequences of this conjecture applied specifically to event horizons of black holes.

At this point a perceptive reader might ask how and to what extent our paper sheds light on the physics of black hole evaporation. First, the consensus appears to be that the physics of event horizons (cosmological, black hole, or those due to acceleration) is universal. In fact, it is precisely because of this generality that one should not expect this Hilbert space description of evaporation at event horizons to bear the signatures of the detailed physics of black holes. In fact, as explained in the next section we go on to impose the details of that physics onto this evaporative process. Second, sampling the Hilbert space at or near the event horizon may or may not represent fair sampling from the entire black hole interior. This issue is also discussed below (and in more detail in the paper [2]).

Imposing black hole physics

We rely on a few key pieces of physics about black holes: the no-hair theorem and the existence of Penrose processes. We are interested in a quantum mechanical representation of a black hole. At first sight this may seem preposterous in the absence of a theory of quantum gravity. Here, we propose a new approach that steers clear of gravitational considerations. In particular, we derive a quantum mechanical description of a black hole by ascribing various properties to it based on the properties of classical black holes. (This presumes that any quantum mechanical representation of a black hole has a direct correspondence to its classical counterpart.) In particular, like classical black holes our quantum black hole should be described by the classical no-hair properties of mass, charge and angular momentum. Furthermore, these quantum mechanical black holes should transform amongst each other just as their classical counterparts do when absorbing or scattering particles, i.e., when they undergo so-called Penrose processes. By imposing conditions consistent with these classical properties of a black hole we obtain a Hilbert space description of quantum tunneling across the event horizons of completely generic black holes. Crucially, this description of black hole evaporation does not involve the detailed curved spacetime geometry of a black hole. In fact, it does not require spacetime at all. Finally, in order to proceed to the next step of computing the actual dynamics of evaporation, we need to invoke one more property of a black hole: that of its enormous dimensionality.

Tunneling probabilities

The Hilbert space dimensionalities needed to describe a black hole are vast (at least 10^(10^77) for a stellar-mass black hole). For such dimensionalities, random matrix theory tells us that the statistical behavior of tunneling (as a sampling of Hilbert space subsystems) is extremely well approximated by treating tunneling as a completely random process. This immediately imposes a number of symmetries on our description of black hole evaporation. We can then completely determine the tunneling probabilities as a function of the classical no-hair quantities [2]. These tunneling probabilities are nothing but the black hole evaporation rates. In fact, they are precisely the quantities computed by standard field theoretic methods (all of which rely on the curved black hole geometry). Thus, the calculation of tunneling probabilities provides a way of validating our approach and making our results predictive.
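As a rough sanity check on that figure (a back-of-the-envelope estimate of ours, not a number taken from the paper), the Bekenstein-Hawking entropy of a solar-mass black hole gives

    \[
    \frac{S_{BH}}{k_B} = \frac{A}{4\,l_P^{2}} = \frac{4\pi (2GM/c^{2})^{2}}{4\,l_P^{2}} \approx 10^{77} \quad (M \approx M_{\odot}),
    \qquad
    \dim\mathcal{H} \sim e^{S_{BH}/k_B} \approx 10^{10^{77}} .
    \]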

The proof of the pudding: validation and predictions

Our results reproduce Hawking's thermal spectrum (in the appropriate limit) as well as his relation between the temperature of black hole radiation and the black hole's thermodynamic entropy.
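For reference, the standard results being recovered here are, for a Schwarzschild black hole of mass M,

    \[
    k_B T_H = \frac{\hbar c^{3}}{8\pi G M},
    \qquad
    S_{BH} = \frac{k_B c^{3} A}{4 G \hbar} = \frac{k_B A}{4\,l_P^{2}},
    \]

i.e., Hawking's temperature together with the Bekenstein-Hawking entropy proportional to the horizon area A.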

When Hawking's semi-classical analysis was extended by field theorists to include backreaction of the outgoing radiation on the geometry of the black hole, a modified, non-thermal spectrum was found [5]. The incorporation of backreaction comes naturally in our quantum description of black hole evaporation (in the form of conservation laws). Indeed, our results show that black holes satisfying these conservation laws are not ideal black bodies but "real black bodies" that exhibit a non-thermal spectrum and preserve thermodynamic entropy.
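The non-thermal spectrum of [5] has a compact form (in units G = c = ℏ = k_B = 1, for a Schwarzschild black hole of mass M emitting a quantum of energy ω):

    \[
    \Gamma \propto e^{-8\pi\omega\,(M-\omega/2)} = e^{\Delta S_{BH}},
    \]

so the emission probability is governed by the change in the hole's entropy rather than by the strictly thermal Boltzmann factor e^{-8\pi M \omega}; it is this entropy-governed structure that the conservation-law treatment recovers.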

These results support our conjecture of a spacetime-free description of evaporation across black hole horizons.

Our analysis not only reproduces these famous results [5] but extends them to all possible black hole and evaporated-particle types in any (even extended) gravity theory. Unlike field theoretic approaches, we do not need to rely on one-dimensional WKB methods, which are limited to the analysis of evaporation along radial trajectories and produce results only to lowest order in ℏ.

Finally, our work quite generally predicts that a direct functional relation exists between the irreducible mass associated with a Penrose process and a black hole's thermodynamic entropy. This in turn implies a breakdown of Hawking's area theorem in extended gravity theories.
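In standard general relativity that functional relation is already familiar: the horizon area, and with it the Bekenstein-Hawking entropy, is fixed by the irreducible mass,

    \[
    A = \frac{16\pi G^{2} M_{\mathrm{irr}}^{2}}{c^{4}},
    \qquad
    S_{BH} = \frac{k_B A}{4\,l_P^{2}} \propto M_{\mathrm{irr}}^{2}.
    \]

The prediction is that some functional relation of this kind, though generally no longer the area law itself, survives in extended gravity theories.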


And the paradox itself

The ability to focus on event horizons is key to the progress we have made in deriving a quantum mechanical description of evaporation. By contrast, the physics deep inside the black hole is more elusive. If unitarity holds globally, then our spacetime-free conjecture can be used to describe the entire time-course of evaporation of a black hole and to learn how the information is retrieved (see, e.g., [6]). Specifically, in a unitarily evaporating black hole there should exist some thermalization process such that, after what has been dubbed the black hole's global thermalization (or scrambling) time, information that was encoded deep within the black hole can reach or approach its surface, where it may be selected for evaporation as radiation. Alternatively, if the interior of the black hole is not unitary, some or all of this deeply encoded information may never reappear in the Hawking radiation. Unfortunately, any analysis relying primarily on physics at or across the horizon cannot shed any light on the question of unitarity (which lies at the heart of the black hole information paradox).

The bigger picture

At this stage we might take a step back and ask the obvious question: does quantum information theory really bear any connection to the subtle physics associated with black holes and their spacetime geometry? After all, we do not yet have a proper theory of quantum gravity. However, whatever form such a theory may take, it should still be possible to argue, either from the Hamiltonian constraint of describing an initially compact object with finite mass, or by appealing to holographic bounds, that the dynamics of a black hole must be effectively limited to a finite-dimensional Hilbert space. Moreover, one can identify tunneling as the most likely microscopic mechanism of black hole evaporation. Formally, these imply that evaporation should look very much like our sampling of Hilbert space subsystems from the black hole interior for ejection as radiation [2,4,6]. Although finite, the dimensionalities of the Hilbert space are immense, and from standard results in random unitary matrix theory together with global conservation laws we obtain a number of invariances. These invariances completely determine the tunneling probabilities without any need to know the detailed dynamics (i.e., the underlying Hamiltonian). This result establishes the Hilbert space description of black hole evaporation as a powerful tool. Put even more strongly, one might interpret the analysis presented here as a quantum gravity calculation without any detailed knowledge of a theory of quantum gravity beyond the presumption of unitarity [2].
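To make the "sampling of Hilbert space subsystems" picture concrete, here is a small toy computation of our own (purely illustrative, not code from the paper [2]): prepare a random pure state of a handful of qubits standing in for the black hole interior, evolve it by a Haar-random unitary standing in for the unknown interior dynamics, and "emit" one qubit by tracing out the rest. Already at modest sizes the emitted subsystem is almost exactly maximally mixed, which is the random-matrix behaviour invoked above.

    import numpy as np
    from scipy.stats import unitary_group

    def emit_one_qubit(n_qubits, seed=0):
        """Toy model of evaporation as random sampling of a subsystem.

        A Haar-random pure state of n_qubits is evolved by a Haar-random
        unitary (a stand-in for the unknown interior dynamics); the first
        qubit is then 'emitted' and its reduced density matrix computed.
        Returns that 2x2 density matrix and its von Neumann entropy in bits.
        """
        dim = 2 ** n_qubits
        rng = np.random.default_rng(seed)
        psi = rng.standard_normal(dim) + 1j * rng.standard_normal(dim)
        psi /= np.linalg.norm(psi)                 # random interior state
        U = unitary_group.rvs(dim, random_state=seed)
        psi = U @ psi                              # 'interior dynamics'
        psi = psi.reshape(2, dim // 2)             # split off the first qubit
        rho = psi @ psi.conj().T                   # reduced state of the emitted qubit
        p = np.linalg.eigvalsh(rho)
        p = p[p > 1e-12]
        return rho, float(-(p * np.log2(p)).sum())

    for n in (2, 4, 8):
        _, s = emit_one_qubit(n)
        print(f"{n} interior qubits: emitted-qubit entropy = {s:.4f} bits")

For a real black hole the analogous statement, applied at dimensionalities of order 10^(10^77) and constrained by the conservation laws, is what fixes the tunneling probabilities in [2].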

Hints of an emergent gravity

Verlinde recently suggested that gravity, inertia, and even spacetime itself may be emergent properties of an underlying thermodynamic theory [1]. This vision was motivated in part by Jacobson's surprising 1995 result that the Einstein equations of gravity follow from the thermodynamic properties of event horizons [7]. For Verlinde's suggestion not to collapse into some kind of circular reasoning, the physics across event horizons upon which his work relies should be derivable in a spacetime-free manner. This is exactly what we have demonstrated is possible in our manuscript [2]. Our work, however, provides a subtle twist: rather than emergence from a purely thermodynamic source, we should instead seek that source in quantum information.


In summary, this work [2,4]:
  • shows that the classical picture of black hole event horizons as perfectly semi-permeable membranes almost certainly fails quantum mechanically
  • provides a microscopic spacetime-free mechanism for Hawking radiation
  • reproduces known results about black hole evaporation rates
  • authenticates random matrix theory for the study of black hole evaporation
  • predicts the detailed black hole spectrum beyond WKB
  • predicts that black hole area must be replaced by some other property in any generalized area theorem for extended gravities
  • provides a quantum gravity calculation based on the presumption of unitarity, and
  • provides support for suggestions that gravity, inertia and even spacetime itself could come from spacetime-free physics across event horizons

References
[1] E. Verlinde, "On the origin of gravity and the laws of Newton", JHEP 04 (2011) 029. Abstract.
[2] S.L. Braunstein and M.K. Patra, "Black Hole Evaporation Rates without Spacetime", Phys. Rev. Lett. 107, 071302 (2011). Abstract. Article (pdf).
[3] H. Nikolic, "Black holes radiate but do not evaporate", Int. J. Mod. Phys. D 14, 2257 (2005). Abstract; S.D. Mathur, "The information paradox: a pedagogical introduction", Class. Quantum Grav. 26, 224001 (2009). Abstract.
[4] Supplementary Material to [2] at http://link.aps.org/supplemental/10.1103/PhysRevLett.107.071302.
[5] M.K. Parikh and F. Wilczek, "Hawking Radiation As Tunneling", Phys. Rev. Lett. 85, 5042 (2000). Abstract.
[6] S.L. Braunstein, S. Pirandola and K. Życzkowski, "Entangled black holes as ciphers of hidden information", arXiv:0907.1190.
[7] T. Jacobson, "Thermodynamics of Spacetime: The Einstein Equation of State", Phys. Rev. Lett. 75, 1260 (1995). Abstract.



Sunday, August 28, 2011

Quantum Spin Hall Effect for Light

Mohammad Hafezi, Eugene A. Demler, Mikhail D. Lukin, and Jacob M. Taylor [photos courtesy of Joint Quantum Institute (JQI) and Harvard University]


The advent of optical fibers a few decades ago made it possible for dozens of independent phone conversations to travel long distances along a single glass cable by, essentially, assigning each conversation to a different color of light, with each narrow strand of glass carrying vast amounts of information with little interference.

Surprisingly, transmitting information-rich photons thousands of miles through fiber-optic cable is far easier than reliably sending them just a few nanometers through a computer circuit, and this makes it difficult to employ photons as information carriers inside computer chips. However, it may soon be possible to steer these particles of light accurately through microchips thanks to research [1] performed at the Joint Quantum Institute of the National Institute of Standards and Technology (NIST) and the University of Maryland, together with Harvard University.

The scientists behind the effort say the work not only may lead to more efficient information processors on our desktops, but could also offer a way to explore a particularly strange effect of the quantum world known as the quantum Hall effect, in which electrons can interfere with themselves as they travel in a magnetic field. Manipulating photons so that they behave like their electrical counterparts, the electrons, is a rich area of research with applications extending into quantum information and condensed matter. The corresponding physics is rich enough that its investigation has already resulted in three Nobel Prizes, yet many intriguing theoretical predictions about it have yet to be observed.

Two researchers at the Joint Quantum Institute (JQI), Mohammad Hafezi and Jacob M. Taylor, and two researchers at Harvard, Eugene A. Demler and Mikhail D. Lukin, propose an optical delay line that could fit onto a computer chip. Delay lines, added to postpone a photon's arrival, are passive but critical components for processing signals. Kilometers of glass fiber are easily obtained, but fabricating optical elements small enough to fit on a single chip introduces defects that can reduce the transmission of information.

"We run into problems when trying to use photons in microcircuits because of slight defects in the materials chips are made from," says Jacob Taylor, a theoretical physicist at JQI. "Defects crop up a lot, and they deflect photons in ways that mess up the signal."

These defects are particularly problematic when they occur in photon delay devices, which slow the photons down to store them briefly until the chip needs the information they contain. Delay devices are usually constructed from a single row of tiny resonators, so a defect among them can ruin the information in the photon stream. But the research team perceived that using multiple rows of resonators would build alternate pathways into the delay devices, allowing the photons to find their way around defects easily.

Artist's rendering of the proposed JQI fault-tolerant photon delay device for a future photon-based microchip. The devices ordinarily have a single row of resonators; using multiple rows like this provides alternative pathways for the photons to travel around any physical defects. Transmission of light is protected from defects because the system exhibits a photonic version of the quantum spin Hall effect. [Image credit: E. Edwards/JQI]

As delay devices are a vital part of computer circuits, the alternate-pathway technique may help overcome obstacles blocking the development of photon-based chips, which are still a dream of computer manufacturers. While that application would be exciting, lead author Mohammad Hafezi says the prospect of investigating the quantum Hall effect with the same technology also has great scientific appeal.

"The photons in these devices exhibit the same type of interference as electrons subjected to the quantum Hall effect," says Hafezi. "We hope these devices will allow us to sidestep some of the problems with observing the physics directly, instead allowing us to explore them by analogy."

Quantum Hall physics is the remarkable phenomenon at the heart of this new approach. The quantum Hall effect occurs in a two-dimensional sea of electrons under the influence of a large magnetic field. The electrons are allowed to travel along the edges of the material but do not have enough energy to permeate throughout the bulk or central regions. It is as if there are conduction highways along the edge of the material. Even if there are defects in the material, like potholes in the road, electrons still make it to their destination.

These highways, called "edge states," are open for transit only at specific values of the externally applied magnetic field. Because these routes are so robust against disorder and reliably carry electron traffic, this effect provides a standard for electrical resistance.
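That standard is the quantized Hall resistance: on an edge-state plateau the Hall resistance is locked to fundamental constants,

    \[
    R_{xy} = \frac{h}{\nu e^{2}} \approx \frac{25\,812.8\ \Omega}{\nu},
    \qquad \nu = 1, 2, 3, \ldots,
    \]

independent of the particular material or sample geometry.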

In recent years, scientists have discovered that some materials can exhibit what is known as the quantum spin Hall effect (QSHE), which depends on the "spin" attributes of the electron. Electrons not only carry charge, but also "spin": they can be thought of as tiny spinning tops that rotate clockwise (in which case they are "spin-up") or counterclockwise ("spin-down"). Notably, the robust edge states are present in the QSHE even without externally applied magnetic fields, making them attractive for the development of new types of electronics.

In the Nature Physics article, the JQI-Harvard team proposes a device supporting the "edge states" that are a hallmark of the QSHE, with light playing the role of the electrons. The device can operate at room temperature and requires no external magnetic field, nor even any magnetic materials. The authors show that the resilience of the edge states can be used to engineer novel optical delay lines at the micrometer scale.

Hafezi explains that a key step is confining the photon pathways to two-dimensions: “In the QSHE, electrons move in a two-dimensional plane. Analogously, one can imagine a gas of photons moving in a two dimensional lattice of tiny glass racetracks called resonators.”

Resonators are circular light traps. Currently, one-dimensional chains of these micro-racetracks can be used as miniaturized delay lines. Light of particular colors (in other words, frequencies) can enter the array and become trapped in the racetracks. After a few trips around, the photons can hop to neighboring resonators. The researchers propose to extend this technology and construct a two-dimensional array of these resonators (see Figure).

Once light is in the array, how can it enter the quantum edge highway? The secret lies in the design of the lattice of resonators and waveguides, which determines whether light hops along the edge of the array rather than through the bulk. The photons pile into the edge state only when the light has a particular color.
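A rough numerical caricature of this design principle (our own toy model, not the authors' simulation; the lattice size, flux alpha, and hopping rate J below are arbitrary illustrative choices) treats the 2D resonator array as a tight-binding lattice in which a photon hopping between neighbouring racetracks picks up a direction-dependent phase, mimicking a magnetic flux. Diagonalizing a finite lattice then reveals bulk frequency bands separated by gaps that are bridged only by modes concentrated on the boundary.

    import numpy as np

    # Toy tight-binding model of a 2D coupled-resonator array threaded by a
    # synthetic magnetic flux (Harper-Hofstadter model). Illustrative only.
    Nx, Ny = 12, 12        # resonator lattice size (arbitrary)
    alpha = 0.25           # synthetic flux per plaquette (arbitrary)
    J = 1.0                # photon hopping rate between neighbouring resonators

    def idx(x, y):
        return x * Ny + y

    N = Nx * Ny
    H = np.zeros((N, N), dtype=complex)
    for x in range(Nx):
        for y in range(Ny):
            if x + 1 < Nx:                         # hop in x: no phase
                H[idx(x + 1, y), idx(x, y)] = -J
                H[idx(x, y), idx(x + 1, y)] = -J
            if y + 1 < Ny:                         # hop in y: column-dependent phase
                phase = np.exp(2j * np.pi * alpha * x)
                H[idx(x, y + 1), idx(x, y)] = -J * phase
                H[idx(x, y), idx(x, y + 1)] = -J * np.conj(phase)

    freqs, modes = np.linalg.eigh(H)               # mode frequencies (units of J)

    # Fraction of each mode's intensity on the outermost ring of resonators
    edge = [idx(x, y) for x in range(Nx) for y in range(Ny)
            if x in (0, Nx - 1) or y in (0, Ny - 1)]
    edge_weight = (np.abs(modes[edge, :]) ** 2).sum(axis=0)

    order = np.argsort(edge_weight)
    print("Most edge-localized modes (detuning in units of J, edge weight):")
    for i in order[-5:]:
        print(f"  {freqs[i]:+.3f}   {edge_weight[i]:.2f}")
    print("Typical bulk modes for comparison:")
    for i in order[:5]:
        print(f"  {freqs[i]:+.3f}   {edge_weight[i]:.2f}")

In this toy model the most edge-localized modes appear at frequencies inside the bulk band gaps, which is the sense in which only light of a particular color enters the protected edge pathway.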

The fabrication process for these micro-resonators is susceptible to defects. This is true for both one- and two-dimensional resonator arrays, but it is the presence of quantum edge states that reduces loss in signal transmission.

When photons are in an edge state created by the 2D structure, their transmission through the delay line is protected. Only along these highways will they skirt around defects, unimpeded. They cannot make a U-turn upon encountering a defect because they do not have the appropriate light frequency, which is the ticket required to enter the backwards-moving path.

Taylor explains an advantage of their proposal: “Right around the point where other [1D] technologies become operational, this same 2D technology also becomes operational. But thereafter, the transmitted signal will be much more robust for this approach to delay lines compared to the 1D approach.”

For example, the length of delay is given by the size of the array or the length of the photon’s path, whether 1D or 2D. However, as the number of resonators and optical features increases to accommodate longer delays, the inherent defects will eventually cause a roadblock for the photons, while the transmission using quantum pathways remains unobstructed.

The researchers hope that building these simple passive devices will lay the foundation for creating robust active circuit elements with photons, such as a transistor.

Reference:
[1] Mohammad Hafezi, Eugene A. Demler, Mikhail D. Lukin, and Jacob M. Taylor, "Robust optical delay lines via topological protection", Nature Physics, doi:10.1038/nphys2063 (Published online August 21, 2011). Abstract.


[We thank Joint Quantum Institute of the National Institute of Standards and Technology (NIST) and the University of Maryland for materials used in this report]
