
Sunday, April 15, 2012

How Quantum Physics Could Make 'The Matrix' More Efficient

Mile Gu of CQT, Singapore

Researchers have discovered a new way in which computers based on quantum physics could beat the performance of classical computers. The work -- by researchers from the Centre for Quantum Technologies (CQT), Singapore, and the University of Bristol, UK -- implies that a 'Matrix'-like simulation of reality [1] would require less memory on a quantum computer than on a classical computer. It also hints at a way to investigate whether a deeper theory lies beneath quantum theory. The finding was published in Nature Communications on March 27, 2012 [2].


Karoline Wiesner of the University of Bristol, UK


The finding emerges from fundamental consideration of how much information is needed to predict the future. Mile Gu, Elisabeth Rieper and Vlatko Vedral at CQT, with Karoline Wiesner from the University of Bristol, UK, considered the simulation of 'stochastic' processes (see Figure 1), where there are several possible outcomes to a given procedure, each occurring with a calculable probability. Many phenomena, from stock market movements to the diffusion of gases, can be modeled as stochastic processes.

Figure 1: A simulator for a stochastic process can be thought of as a physical system that stores selected information about past outputs and uses it to generate the required statistics for the future. Ideally, we want to construct a simulator that is as simple as possible, such that the amount of information it requires about the past is minimized.


The details of how to simulate such processes have long occupied researchers. The minimum amount of information required to simulate a given stochastic process is a significant topic of study in complexity theory, where it is referred to as the 'statistical complexity' [3].

Elisabeth Rieper of CQT, Singapore

Researchers know how to calculate the amount of information that any stochastic process inherently carries from its past into its future, a quantity known as the excess entropy. In theory, this sets the minimum amount of information needed to simulate the process. In practice, however, classical simulations of stochastic processes require more storage than this.

Mile, Karoline, Elisabeth and Vlatko showed that quantum simulators need to store less information than the optimal classical simulators. They did so by isolating the source of inefficiency within classical simulators. When we attempt to simulate any stochastic process that has a binary property ‘X’ affecting its future evolution, the value of ‘X’ must be stored. This is unavoidable even when no future observation of the process would allow us to deduce the value of ‘X’. The researchers showed that this implies classical simulators erase information: they contain a source of irreversibility that cannot be removed.

Quantum simulators, however, have greater freedom. Instead of allocating a full bit to store the value of ‘X’, one can store the conditions ‘X = 0’ and ‘X = 1’ in non-orthogonal quantum states. The simulator thereby saves memory, since it never fully commits to which value the property took. Nevertheless, Mile et al. showed that it is often possible to engineer the dynamics such that the simulator still replicates the statistics of the desired process.
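To make this concrete, here is a minimal numerical sketch (our own illustration, not the authors' code) for the 'perturbed coin' analysed in Figure 2 below: a coin that flips with probability p at each time-step. The optimal classical simulator must store the last outcome in full, costing one bit, whereas a quantum simulator can store the two causal states in the non-orthogonal encoding of [2]; its memory cost is the von Neumann entropy of the resulting mixture.

```python
import numpy as np

def quantum_memory_bits(p):
    # Non-orthogonal encoding of the two causal states (after Gu et al. [2]):
    # |s0> = sqrt(1-p)|0> + sqrt(p)|1>,  |s1> = sqrt(p)|0> + sqrt(1-p)|1>
    s0 = np.array([np.sqrt(1 - p), np.sqrt(p)])
    s1 = np.array([np.sqrt(p), np.sqrt(1 - p)])
    rho = 0.5 * (np.outer(s0, s0) + np.outer(s1, s1))
    ev = np.linalg.eigvalsh(rho)
    ev = ev[ev > 1e-12]
    return float(-np.sum(ev * np.log2(ev)))  # von Neumann entropy of the memory

for p in (0.1, 0.3, 0.49):
    # Classical cost: Shannon entropy of two equiprobable causal states = 1 bit.
    print(f"p = {p}: classical = 1.000 bit, quantum = {quantum_memory_bits(p):.3f} bits")
```

For every p other than 1/2 the quantum memory comes out strictly below one bit, reproducing the qualitative gap between the classical curve (a) and the quantum curve (b) of Figure 2.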

Vlatko Vedral of CQT, Singapore

This result has significant implications. Stochastic processes play a ubiquitous role in the modeling of dynamical systems that permeate quantitative science, from climate fluctuations to chemical reaction processes. Classically, the statistical complexity is employed as a measure of how much structure a given process exhibits. The rationale is that the optimal simulator of such a process requires at least this much memory. The fact that this memory can be reduced quantum mechanically implies the counter-intuitive conclusion that quantizing such simulators can reduce their complexity beyond this classical bound, even if the process they're simulating is purely classical. Many organisms and devices operate based on the ability to predict and thus react to the environment around them. The possibility of exploiting quantum dynamics to make identical predictions with less memory implies that such systems need not be as complex as one originally thought.

Figure 2: The researchers calculated the information-storage requirements for a set of stochastic models describing a perturbed coin - a coin that flips at each time-step with some probability p. The figure shows how a classical simulation (a) compares to a quantum simulation (b) and the expected ideal (c) for different probabilities p. The quantum model is closer to the ideal than classical, but there's still a gap.

What surprised the researchers is that the quantum simulations are still not as efficient as they could be: they still have to store more information than the process would seem to need (see figure 2). Could an even more general probability theory -- with even more bizarre correlations -- side-step this restriction?

"What's fascinating to us is that there is still a gap. It makes you think, maybe here's a way of thinking about a theory beyond quantum physics," says Vlatko.

References
[1] The Matrix (1999 American science fiction action film written and directed by Larry and Andy Wachowski). Wikipedia Page.
[2] Mile Gu, Karoline Wiesner, Elisabeth Rieper and Vlatko Vedral, "Quantum mechanics can reduce the complexity of classical models." Nature Communications 3, 762 (2012). Abstract. arXiv:1102.1994.
[3] Cosma Rohilla Shalizi and James P. Crutchfield, "Computational mechanics: pattern and prediction, structure and simplicity." Journal of Statistical Physics, 104, 817–879 (2001). Abstract.



Sunday, April 08, 2012

Strong Coupling and Its Dynamic Control of Distant Nanocavities

Susumu Noda (left) and Yoshiya Sato (right)



Authors: Susumu Noda and Yoshiya Sato

Affiliation: Department of Electronic Science and Engineering, Kyoto University, Japan

Noda's Quantum Optoelectronics Laboratory >>


The dynamic manipulation of photons in nanostructures is essential for various applications including advanced photonic circuits, the stopping of light, and quantum information processing. In particular, the formation and dynamic control of a coupled state among on-chip photonic nanoelements at arbitrary positions should have a great impact on these directions. However, to couple individual nanocavities, they must normally be placed in close proximity, since the light is confined so tightly in each cavity that its evanescent field extends only a few microns. With this limitation, a nanocavity can only couple to adjacent nanocavities, which restricts the architecture of the system and makes it difficult to achieve on-demand dynamic control of the coupled states.

In our recent work, we have obtained strong coupling between nanocavities separated by a large distance (> 80 μm) and achieved dynamic control over their coupling, freezing the photon state on demand. Photonic states can now be separated without being isolated, opening the door to the development of advanced functional photonic circuits for scalable classical and quantum information processing. These results have recently been published in the journal Nature Photonics [1].

Fig. 1. a, Schematic model of two photonic nanocavities coupled indirectly through a waveguide with reflecting boundary walls. b, The resonant spectrum of the isolated individual nanocavities A and B (red line) and their resonant spectrum when coupled to an open waveguide (dashed green line); the latter has a linewidth of δin, which corresponds to the coupling bandwidth between the nanocavities and the waveguide. c, The resonant spectrum of the bounded waveguide, discretized by the Fabry-Perot (FP) resonance effect.

First, we discuss how to realize strong coupling between distant nanocavities. We employ the system shown in Fig. 1. Two nanocavities (A and B) are connected by a waveguide, which is terminated on both sides by reflecting walls (C and D). The individual nanocavities each have a single resonant mode at the same frequency (see Fig. 1b, red line). The bounded waveguide has many standing-wave modes due to Fabry-Perot (FP) resonance (see Fig. 1c). To realize strong cavity-cavity coupling, we theoretically investigated the system in detail and found that all FP waveguide modes should be detuned far from the nanocavity modes (by more than the coupling bandwidth δin between the nanocavities and the waveguide), as shown in Fig. 1b and c. Even under such a condition, the nanocavities can still couple to each other indirectly through forced oscillation of the FP waveguide modes, while the photons remain concentrated in the nanocavities rather than in the waveguide.

Fig. 2. a, Overall view of the fabricated silicon-based photonic crystal observed by SEM. Two photonic crystal nanocavities (A & B) are placed 202a apart, with a line-defect waveguide nearby. b, Magnified image of a nanocavity, based on a multistep heterostructure with a1 = 415 nm and a2 = 420 nm. c, Magnified image of the waveguide and a partial reflector. d, e, Spectra of the vertically emitted light observed at cavities A (d) and B (e), respectively, obtained by introducing a tunable continuous-wave laser through partial reflector C. f, Time-resolved amplitude of vertically emitted light from cavities A & B. A pulse laser with a duration of 4 ps and a centre wavelength of ~1539.45 nm (width = 1 nm) was introduced through partial reflector C, and the vertically emitted light from each cavity was observed in the time domain by a cross-correlation method.

Following this scheme, we fabricated a silicon-based photonic crystal sample as shown in Fig. 2a-c. Two multistep-heterostructure nanocavities [2], A and B, with original Q factors of ~1 million, were placed 83 µm apart, with a line-defect waveguide nearby. Both ends of the waveguide were bounded by partial reflectors (C and D), formed by narrowing the waveguide’s width. Figures 2d and e show resonant spectra of the fabricated sample observed from nanocavities A and B, respectively. Two resonant peaks with similar intensities were observed from both cavities at exactly the same wavelengths (1539.39 and 1539.54 nm). These peaks correspond to the coupled nanocavity modes. The splitting of the peaks (150 pm) is 50 times larger than the resonant peaks’ width (~3 pm), indicating that the system is well within the strong-coupling regime.

Next, we carried out time-domain measurements. We excited the coupled cavity modes by introducing a short optical pulse through partial reflector C. The results are shown in Fig. 2f. A clear exchange of photons between the distant nanocavities was observed with a period of ~54 ps. This exchange continued for more than 400 ps, demonstrating the long coherence time of photons in this system. Note that the photon lifetime is much longer than that of the FP waveguide modes (~40 ps). This indicates that the photons are predominantly concentrated in the nanocavities rather than in the waveguide when the coupled nanocavity modes are excited.
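As a consistency check (our own back-of-envelope calculation, not taken from the paper), the photon-exchange period should equal the inverse of the frequency splitting corresponding to the 150 pm wavelength splitting quoted above:

```python
c = 2.998e8        # speed of light, m/s
lam = 1539.47e-9   # centre wavelength, m (midpoint of the two peaks)
dlam = 150e-12     # mode splitting, m (150 pm)

df = c * dlam / lam**2    # splitting converted to frequency, Hz
T = 1.0 / df              # expected photon-exchange period, s
print(f"splitting = {df / 1e9:.1f} GHz, exchange period = {T * 1e12:.0f} ps")
```

This gives ~19 GHz and ~53 ps, in good agreement with the ~54 ps oscillation observed in Fig. 2f.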

Fig. 3. Dynamic control of the coupling state between the nanocavities. a, Schematic of the experimental set-up. A control pulse is irradiated onto cavity B, causing a dynamic wavelength shift via the carrier-plasma effect. b, Time-resolved amplitude of emitted light from cavities A and B, where the control pulse was irradiated onto cavity B while the photons populated only cavity A.

The next important step was to demonstrate dynamic control over these coupled states. Because the nanocavities are sufficiently far apart, it is possible to induce a dynamic change in either cavity, completely independently, to control the coupling state. Here, we induced a wavelength shift of cavity B using a control pulse with a duration of 4 ps and a wavelength of 770 nm, applied at a specified time during the photon exchange (see Fig. 3a). The control pulse is absorbed by the cavity, generating free carriers, which lower the refractive index and induce a blueshift of the resonant wavelength [3]. This blueshift cuts off the coupling between cavities A and B, and the decoupled state persists for ~1 ns owing to the long carrier lifetime in silicon. Figure 3b shows the results: the control pulse was irradiated onto cavity B at a moment when the photons populated only cavity A. The figure clearly shows that the photon exchange was stopped, freezing the photon population in the state it occupied at the moment of irradiation. This finding suggests that the behaviour of photons can be controlled even in regions where they are not present, which enables remote control of photons. We have also observed similar phenomena even at single-photon power levels.
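The freezing mechanism can be illustrated with a two-mode coupled-mode model (a deliberate simplification of ours, not the authors' full simulation; the 40g detuning and the switch time are illustrative assumptions):

```python
import numpy as np
from scipy.linalg import expm

g = 2 * np.pi * 9.5e9                 # coupling rate: half the ~19 GHz splitting

def H(det):
    # Effective 2x2 coupled-mode matrix for cavities A and B (rad/s)
    return np.array([[0.0, g], [g, det]])

psi = np.array([1.0, 0.0], dtype=complex)     # photon initially in cavity A
psi = expm(-1j * H(0.0) * 53e-12) @ psi       # one full exchange period: back in A
psi = expm(-1j * H(40 * g) * 400e-12) @ psi   # blueshift B far off resonance

print(np.abs(psi) ** 2)   # stays ~[1, 0]: the exchange is frozen in cavity A
```

Once cavity B is detuned by much more than g, the residual population transfer scales as (g/detuning)^2, so the photon stays put in cavity A, mirroring Fig. 3b.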

The results obtained in this work are expected to be applicable to various nanophotonic circuits that require distant coupling of on-chip cavities and will become a fundamental building block for areas including the stopping (or slowing) of light and even photonic quantum information processing.

References
[1] Y. Sato, Y. Tanaka, J. Upham, Y. Takahashi, T. Asano, and S. Noda “Strong coupling between distant photonic nanocavities and its dynamic control”. Nature Photonics. 6, 56-61 (2012). Abstract.
[2] Yasushi Takahashi, Yoshinori Tanaka, Hiroyuki Hagino, Tomoyuki Sugiya, Yoshiya Sato, Takashi Asano, and Susumu Noda, "Design and demonstration of high-Q photonic heterostructure nanocavities suitable for integration". Optics Express. 17, 18093-18102 (2009). Abstract.
[3] Yoshinori Tanaka, Jeremy Upham, Takushi Nagashima, Tomoaki Sugiya, Takashi Asano & Susumu Noda, "Dynamic control of the Q factor in a photonic crystal nanocavity". Nature Materials, 6, 862-865 (2007). Abstract.



Sunday, March 18, 2012

Quantum Interface between an Electrical Circuit and a Single Atom

[From Left to Right] Jacob Taylor of the Joint Quantum Institute, USA; D. Kielpinski of the Centre for Quantum Dynamics at Griffith University, Australia; G. J. Milburn of the Centre for Engineered Quantum Systems, Australia.

If a practical quantum computer is ever to be realized, conventional electronic devices will have to interface with delicate quantum systems such as atoms or ions in traps, or wisps of magnetism near superconducting sensors. A new paper in the journal Physical Review Letters [1], written by experimenters at several Australian universities and theorists at the Joint Quantum Institute (JQI) in the USA, proposes one way to achieve this kind of quantum interface. It shows how an electrical circuit can be used to transfer quantum information encoded in a single ion to other quantum systems, such as another isolated ion.

The authors of the new paper include Jacob Taylor and D. Kafri of the Joint Quantum Institute, USA (operated jointly by the National Institute of Standards and Technology and the University of Maryland); D. Kielpinski of the Centre for Quantum Dynamics at Griffith University, Australia; and M. J. Woolley and G. J. Milburn of the Centre for Engineered Quantum Systems, Australia.

Electronic circuits are good at conveying information and, through the use of logic gates, at processing it. Quantum information, by contrast, is more delicate and needs to be handled with care. The reason for this delicacy is that quantum states are easily undone. For example, an atom’s spin orientation or an internal electron energy value might be altered if the atom interacts with other atoms. The atom bearing the quantum information has to be buffered from the surrounding environment in order for the full power of quantum information processing to be achieved.

Fig.1: Model for ion-circuit coupling. C stands for capacitance and L stands for inductance. [Image credit: Authors of the PRL paper]

Another obstacle in this case is the vastly different time scales. Here a circuit in which electric currents oscillate at gigahertz (10^9 Hz) frequencies must talk to an ion (confined by electrodes) oscillating at megahertz (10^6 Hz) frequencies. Thus the circuit and the ion lying just offshore inhabit very different worlds. The circuit represents the world of electronics, where information is typically stored in the form of accumulations of charge in a transistor or accumulations of magnetism on a sector of a hard drive, while the ion harbors quantum information, typically in the form of internal atomic attributes.

This is, in fact, no typical electric circuit. It would be kept in a bath of liquid helium at a temperature near absolute zero so as to reduce electrical noise as much as possible. The electrical and magnetic energy sloshing back and forth in the circuit can be thought of as a microwave photon. The atom and the microwave do not interact directly. Rather, they relate to each other through the action of a bulk acoustic wave (BAW): an acoustic wave that moves through a frail slip of matter at a frequency equal to the difference between the microwave and ion frequencies.
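Stated as a resonance condition (our restatement of the frequency matching described above, not a formula quoted from the paper), the BAW bridges the two time scales when it makes up the frequency mismatch:

\[
\omega_{\mathrm{BAW}} \;=\; \omega_{\mathrm{circuit}} - \omega_{\mathrm{ion}},
\qquad
\frac{\omega_{\mathrm{circuit}}}{2\pi} \sim 10^{9}\,\mathrm{Hz},
\quad
\frac{\omega_{\mathrm{ion}}}{2\pi} \sim 10^{6}\,\mathrm{Hz},
\]

so the acoustic mode itself must oscillate at nearly the circuit's gigahertz frequency.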

What this setup allows observers to do is this: the oscillation of the microwave can be used instead of the ion’s motion as a proxy for the ion’s internal quantum information. The microwave in the circuit can, in turn, provide a handy linkup with other quantum systems, such as ions held in other atom traps. The next figure illustrates how this works.

Fig.2: Quantum communication fiber link enabled by circuit-ion couplings in two independent cryogenic environments [Image Credit: Authors of the PRL paper]

The circuit doesn’t read out or encode the quantum information in an ion (that’s usually done with lasers) but acts as a go-between linking the ion to other ions or quantum bits (qubits) stored in other devices. The information transfer between the electrical system and the ion enables the transfer of quantum information between two distant ions. Such linkups, in the form of entanglement or teleportation, are the heart of future quantum computers. This is illustrated in the second figure, where quantum information flows from one isolated ion to another via a fiber. This transfer is enabled, however, by the presence of the electronic circuits.

For the circuit-ion linkup to work, the transfer has to happen before the ion’s quantum information leaks away, a time referred to as its decoherence time. “We expect that the coupling between ion and circuit can occur much faster than the ion’s decoherence time, which is about a millisecond,” said JQI scientist Jacob Taylor, one of the authors of the new paper. “We were pleasantly surprised given how simpler approaches give a much weaker coupling.”

Reference
[1] D. Kielpinski, D. Kafri, M. J. Woolley, G. J. Milburn, and J. M. Taylor, "Quantum Interface between an Electrical Circuit and a Single Atom" (To be published in Physical Review Letters). arXiv:1111.5999.



Sunday, March 04, 2012

A Breakthrough for Scalable Quantum Computing: First Experimental Demonstration of Topological Error Correction

Yu-Ao Chen (left) and Jian-Wei Pan (right)

Scientists at the Division of Quantum Physics and Quantum Information (QPQI, Shanghai Branch, National Laboratory for Physical Sciences at Microscale, University of Science and Technology of China) -- in close collaboration with theoretical physicists from the University of Melbourne (Australia) and the University of British Columbia (Canada) -- have for the first time demonstrated topological error correction with an eight-photon cluster state [1]. This represents an essential first step towards building large-scale quantum computers.

Quantum computers have the potential to solve numerical problems that would be impossible on a classical computer. However, quantum computing is very fragile: the imperfection of realistic physical devices inevitably introduces errors and destroys information. Fortunately, quantum error correction can be implemented, which enables quantum computers to tolerate an error rate per quantum bit (qubit) up to a certain threshold. For error rates below this threshold, quantum computing is efficiently possible and the results remain unaffected.

Topological error correction [2,3] is a quantum-error-correction scheme that makes use of topological properties in three dimensions [4] as well as active error correction. The topological feature makes the quantum computing fault-robust: many errors on physical devices are not “seen” by the logical qubits and do not affect the computing results. Moreover, while performing quantum computing, one can actively analyze the syndromes obtained from measurements and monitor whether, and which, errors occur; if errors are detected, correction is implemented immediately.

In comparison with conventional error correction in quantum computing, topological error correction has the highest known tolerable error rate of 0.7-1.1% (for traditional error-correction codes the highest threshold is 2.2×10^-5), and it is more forgiving of realistic physical devices, which are imperfect and unavoidably suffer from errors. Moreover, the architecture used in topological error correction is rather simple: it is sufficient to create interactions between qubits that are nearest neighbors. Thus, topological error correction can raise the tolerance for experimental errors to a point consistent with current experimental capabilities. This greatly improves the prospects for building large-scale quantum computers. The experiment [1] provides a proof of principle that topological error correction may be one of the most practical approaches for designing quantum computers.

Figure 1: (a) The structure of the prepared eight-photon cluster state and its topological feature. (b) The experimental setup for the eight-photon cluster state.

In the experiment, the physicists designed an eight-photon cluster state (see Fig. 1a) to realize topological error correction. Using a newly developed ultrabright entangled-photon source together with an interferometric Bell-type synthesizer and a noise-reduction interferometer [5], based on spontaneous parametric down-conversion and linear optics, the desired eight-photon cluster state was created with high fidelity (see Fig. 1b). Each qubit was then measured locally. Error syndromes were constructed from the measurement outcomes, and it was shown that a correlation can be protected against a single error on any qubit. If only one physical qubit suffers an error, the faulty qubit can be located and corrected; if all qubits are simultaneously subjected to errors with equal probability, the effective error rate is significantly reduced by the error correction. This constitutes a proof-of-principle experiment demonstrating the viability of topological error correction, a central ingredient in topological cluster-state computing.
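The flavour of this error suppression can be captured with a toy calculation (ours, not the paper's analysis; it treats the eight qubits as independently erring and assumes any single error is perfectly corrected):

```python
def logical_error_rate(p, n=8):
    # Failure requires two or more errors when any single error is correctable:
    # 1 - P(no errors) - P(exactly one error)
    return 1 - (1 - p)**n - n * p * (1 - p)**(n - 1)

for p in (0.01, 0.05, 0.10):
    raw = 1 - (1 - p)**8   # chance that anything at all went wrong
    print(f"p = {p:.2f}: uncorrected {raw:.3f} -> corrected {logical_error_rate(p):.4f}")
```

At p = 0.01 the corrected failure rate drops from about 8% to below 0.3%, showing how locating and fixing single errors suppresses the effective error rate.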

Figure 2: Artist’s view showing the working principle of topological cluster-state quantum computing. Each lantern represents the topological structure of the eight-photon cluster state, which realizes one topologically protected qubit.

Figure 3: Set-up for topological error correction.

The demonstration of topological error correction is a breakthrough step towards scalable fault-tolerant quantum computation. The high threshold error rate is especially remarkable given that only nearest-neighbor interactions are required. Owing to these advantages, topological error correction is especially well suited to physical systems that are geometrically constrained to nearest-neighbor interactions, such as quantum dots, Josephson junctions, ion traps, cold atoms in optical lattices and photonic modules.

References
[1] X.-C. Yao, T.-X. Wang, H.-Z. Chen, W.-B. Gao, A. G. Fowler, R. Raussendorf, Z.-B. Chen, N.-L. Liu, C.-Y. Lu, Y.-J. Deng, Y.-A. Chen & J.-W. Pan, “Experimental demonstration of topological error correction” , Nature 482, 489 (2012). Abstract.
[2] R. Raussendorf, & J. Harrington, “Fault-tolerant quantum computation with high threshold in two dimensions”, Phys. Rev. Lett. 98, 190504 (2007). Abstract.
[3] D. S. Wang, A. G. Austin, & L. C. L. Hollenberg, “Quantum computing with nearest neighbor interactions and error rates over 1%”, Phys. Rev. A 83, 020302(R) (2011). Abstract.
[4] C. Nayak, S. H. Simon, A. Stern, M. Freedman, & S. Das Sarma, “Non-Abelian anyons and topological quantum computation”, Rev. Mod. Phys. 80, 1083–1159 (2008). Abstract.
[5] X.-C. Yao, T.-X. Wang, P. Xu, H. Lu, G.-S. Pan, X.-H. Bao, C.-Z. Peng, C.-Y. Lu, Y.-A. Chen, J.-W. Pan, “Observation of eight-photon entanglement”, Nature Photonics, doi:10.1038/nphoton.2011.354 (published online February 12, 2012). Abstract.



Sunday, February 05, 2012

Experimental Demonstration of a Generalized Uncertainty Relation

Experimental team (from left to right): Jacqueline Erhart, Stephan Sponar, Yuji Hasegawa, and Georg Sulyok

Authors: Jacqueline Erhart, Stephan Sponar, Georg Sulyok, and Yuji Hasegawa

Affiliation: Atominstitut, Vienna University of Technology, Austria

Heisenberg’s uncertainty principle is certainly one of the most famous foundations of quantum physics. According to this principle, not all properties of a quantum particle can be determined with arbitrary accuracy. In the early days of quantum theory, this was often justified by the notion that every measurement inevitably imparts a recoil to the quantum particle, disturbing the results of any further measurements. This, however, turns out to be an oversimplification. In neutron experiments carried out by Professor Yuji Hasegawa and his team at the Vienna University of Technology, different sources of quantum uncertainty could now be distinguished, validating theoretical results by collaborators from Japan. The influence of the measurement on the quantum system is only one reason for the uncertainty observed in the experiment. Heisenberg’s arguments for the uncertainty principle have to be revisited – the uncertainty principle itself, however, remains valid. The results have now been published in the journal Nature Physics [1].

Yuji Hasegawa of Atominstitut, Austria (Left) and Masanao Ozawa of Nagoya University, Japan (Right)

Position or Momentum – But Never Both

It is well established that some physical quantities cannot be measured at the same time. The question is how this fact should be interpreted. “Heisenberg’s famous thought experiment about using light (γ-rays) to measure the position of an electron is still quoted today”, says Jacqueline Erhart from the Institute for Atomic and Subatomic Physics at the Vienna University of Technology. To measure the position of a particle with high precision, light with a very short wavelength (and therefore high energy) has to be used. This results in momentum being transferred to the particle – the particle is kicked by the light. Therefore, Heisenberg argued, it is impossible to measure both position and momentum accurately [2]. The same is true for other pairs of physical quantities. Heisenberg believed that in these cases, an error in one measurement leads to an inevitable disturbance of the other measurement. The product of error and disturbance, Heisenberg claimed, cannot be smaller than a certain threshold.

Nature is Uncertain – Even Without Measurements

However, the effect of the measurement on the quantum system and the resulting disturbance of the second measurement is not the core of the problem. “Such disturbances are also present in classical physics – they are not necessarily linked to quantum physics”, Stephan Sponar (Vienna UT) explains. The uncertainty is rooted in the quantum nature of the particle. Quantum particles cannot be described like point-like objects with a well-defined velocity. Instead, quantum particles behave as waves – and for a wave, position and momentum cannot be defined accurately at the same time. One could say: the particle itself does not even “know” where exactly it is and how fast it travels – regardless of whether the particle is measured or not.

A Generalized Uncertainty Relation – Taking the Measurement into Account

“In order to describe the fundamental uncertainty and the additional disturbance due to the measuring process, both the particle and the measurement device have to be treated in the framework of quantum theory”, says Georg Sulyok (Vienna UT). This was done by the Japanese physicist Masanao Ozawa in 2003 [3], leading to a generalized uncertainty principle. His equations contain different “kinds of uncertainty”: on the one hand, the uncertainty that comes from the measurement as it disturbs the particle (this is the uncertainty described in Heisenberg’s thought experiment of the position-momentum measurement); on the other hand, the fundamental quantum uncertainty that is present in any quantum system, regardless of the measurement.
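For observables A and B, Ozawa’s relation [3] reads (in our notation: ε is the measurement error, η the disturbance, and σ the intrinsic standard deviation in the prepared state):

\[
\varepsilon(A)\,\eta(B) \;+\; \varepsilon(A)\,\sigma(B) \;+\; \sigma(A)\,\eta(B)
\;\geq\; \frac{1}{2}\bigl|\langle[\hat{A},\hat{B}]\rangle\bigr|.
\]

Heisenberg’s original error-disturbance formulation kept only the first product on the left, which is why ε(A)η(B) alone can dip below the bound while the full three-term sum never does.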

Neutrons and their Spin

A sophisticated experimental design has now made it possible to study these contributions to uncertainty at the Vienna University of Technology. Instead of a particle’s position and momentum, the spin of neutrons was measured. Neutrons are an ideal candidate for basic research in quantum mechanics [4]: owing to their weak interaction with the environment, they behave as a robust quantum system, and they can be controlled experimentally with high precision. For the manipulation of the neutron’s spin in particular, a variety of elaborate techniques and devices has been developed over the last decades. The behavior of the neutron’s spin is exploited not only in the wide field of materials science but also for studying fundamental quantum-mechanical characteristics, for example superposition or entanglement.

The spin components in the x- and y-directions cannot be measured simultaneously: they fulfill an uncertainty relation, in much the same way as position and momentum. To demonstrate the generalized uncertainty relation, an apparatus consisting of three stages was designed; a schematic illustration of our setup is depicted in Fig. 1. In the first stage, the neutron’s spin state is prepared. Then, measurements of the x- and y-components of the spin are performed successively on every incoming neutron. The first measurement therefore unavoidably disturbs the second. This effect adds to the fundamental uncertainty stemming from the quantum nature of the neutron. Preparation and measurement of the neutron’s spin are achieved by exploiting appropriately oriented static magnetic fields together with polarizing supermirrors. By carrying out measurements with small, well-defined adjustments of the apparatus, we could study the interplay between the different sources of uncertainty.

Fig 1. Experimental apparatus for demonstrating the generalized uncertainty principle for error and disturbance in neutron spin measurements. Our setup consists of three stages: preparation (blue region), measurement of the x-component (red region) and measurement of the y-component (yellow region) of the neutron’s spin. The neutron beam is polarized by passing through a supermirror spin polarizer; the static magnetic fields produced by the coils C1–C4, in combination with the analyzing supermirrors (spin filters), are used to measure the x- and y-components of the neutron’s spin. Error and disturbance are determined from the detected count rates of the successive measurements.


Arbitrarily Small Disturbance

“The smaller the error in one measurement, the larger the disturbance of the other – this rule still holds. But the product of error and disturbance can be made arbitrarily small – even smaller than Heisenberg’s original formulation of the uncertainty principle would allow”, says Professor Yuji Hasegawa. Only when the other uncertainty terms in Ozawa’s generalized relation are included is the lower bound respected in all experimental realizations.

But even if two measurements hardly influence each other, quantum physics remains “uncertain”. “The uncertainty principle is of course still true”, the researchers assure. “But the uncertainty does not always come from the disturbing influence of the measurement, but from the quantum nature of the particle itself.”

References:
[1] J. Erhart, S. Sponar, G. Sulyok, G. Badurek, M. Ozawa, Y. Hasegawa, "Experimental demonstration of a universally valid error–disturbance uncertainty relation in spin measurements", Nature Physics (2012), Published online, doi:10.1038/nphys2194. Abstract.
[2] W. Heisenberg, "Über den anschaulichen Inhalt der quantentheoretischen Kinematik und Mechanik". Z. Phys. 43, 172–198 (1927).
[3] M. Ozawa, "Universally valid reformulation of the Heisenberg uncertainty principle on noise and disturbance in measurements". Physical Review A 67, 042105 (2003). Abstract.
[4] H. Rauch and S. A. Werner, "Neutron Interferometry", Clarendon Press, Oxford (2000).



Sunday, January 15, 2012

Quantum Complementarity Meets Gravitational Redshift

(From left to right) Magdalena Zych, Fabio Costa, Igor Pikovski, Časlav Brukner


Authors: Magdalena Zych, Fabio Costa, Igor Pikovski, Časlav Brukner

Affiliations: Faculty of Physics, University of Vienna, Austria

Link to "Quantum Foundations and Quantum Information Theory" Group >>

The unification of quantum mechanics and Einstein's general relativity is one of the most exciting and still open questions in modern physics. In general relativity, space and time are combined into a unified underlying geometry, which explains the gravitational attraction of massive bodies. Typical predictions of this theory become clearly evident on a cosmic scale of stars and galaxies. Quantum mechanics, on the other hand, was developed to describe phenomena at small scales, such as single particles and atoms. Both theories have been confirmed by many experiments independently. However, it is still very hard to test the interplay between quantum mechanics and general relativity. When considering very small systems, gravity is typically too weak to be of any significance. The most precise experiments so far have only been able to probe the non-relativistic, Newtonian limit of gravity in conjunction with quantum mechanics. Conversely, quantum effects are generally not visible in large objects.

According to general relativity, time flows differently at different positions due to the distortion of space-time by a nearby massive object. A single clock being in a superposition of two locations allows probing quantum interference effects in combination with general relativity. [Image credits: Quantum Optics, Quantum Nanophysics, Quantum Information; University of Vienna]

There is, however, a possibility to measure predictions of Einstein’s theory of general relativity without using extremely massive probe particles: one of the counterintuitive predictions of general relativity is that gravity distorts the flow of time. The theory predicts that clocks tick slower near a massive body and faster the further away from the mass they are. The Earth’s gravitational field produces a sufficient distortion of space-time that the different flow of time at different altitudes can be measured with very precise clocks. This has been confirmed experimentally with classical clocks, and the results were in full agreement with Einstein’s theory.

Two initially synchronized clocks placed at different gravitational potentials will eventually show different times. According to general relativity a clock near a massive body ticks slower than the clock further away from the mass. This effect is known as gravitational time dilation or gravitational redshift.
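The size of the effect (our summary of the standard weak-field result, not a number from the paper) follows from the gravitational potential difference between the two heights:

\[
\frac{\Delta\tau}{\tau} \;\approx\; \frac{g\,\Delta h}{c^{2}}
\;\approx\; 1.1\times10^{-16}\ \text{per metre of height near the Earth's surface},
\]

which is why only exceptionally stable clocks, or long integration times, can resolve it.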

Scientists at the University of Vienna now proposed that the effect described above, which is also commonly known as the “gravitational redshift”, can also be used to probe the overlap of general relativity with quantum mechanics. In the scheme published in October in Nature Communications, the classical version of the experiment is modified such that it becomes necessary to take quantum mechanics into account. The idea is to exploit the extraordinary possibility that a single particle can be without a well-defined position, or as phrased in quantum mechanical terms: it can be in a “superposition” of two different locations. This allows single particles to produce typical wave-like detection patterns, i.e. interference.

Superpositions of particles are, however, very fragile: if the position of the particle is measured, or even if it can in principle be known, the superposition is lost. In other words, it is not possible to know the position of the particle and to simultaneously observe interference. Such a connection between information and interference is an example of quantum complementarity - a principle originally proposed by Niels Bohr. Because of the above-mentioned fragility, it is very challenging to observe and to maintain superpositions of particles. Even a very weak interaction of the particle with its surrounding leads to the demise of quantum interference. But even though the loss of superpositions is a nuisance in many quantum experiments, the newly proposed experimental scheme to probe general relativity in conjunction with quantum mechanics actually builds upon this complementarity principle.

The novel idea developed in the group of Prof. Č. Brukner is to use a single clock (which can be any particle with an evolving internal degree of freedom, such as spin) that is brought into a superposition of two locations – one closer to and one further away from the surface of the Earth. Afterwards, the two parts of the superposition are brought back together, and it is observed whether or not an interference pattern is produced. According to general relativity, the clock ticks at a different rate depending on its location. But since the time measured by the clock reveals information about where the clock was located, the interference and the wave-nature of the clock should be lost. The amount by which the quantum mechanical interference is lost thus becomes a measure of the general relativistic redshift. To describe this effect, both general relativity and quantum mechanics are required. Such an interplay between the two theories has never been probed in experiments. This is therefore the first proposal for an experiment that allows testing the genuine general relativistic notion of time in conjunction with quantum complementarity.
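For a clock realized as a two-level system with energy gap ΔE, the paper [1] quantifies this trade-off through the interferometric visibility as a function of the proper-time difference Δτ between the two arms (our transcription of the result):

\[
V \;=\; \left|\cos\!\left(\frac{\Delta E\,\Delta\tau}{2\hbar}\right)\right|,
\]

so the interference disappears completely when the two paths’ proper times differ by half a clock period, i.e. exactly when the which-way information carried by the clock becomes complete.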

A single clock is brought into a quantum superposition of two locations: closer to and further away from the surface of the Earth. Because of the gravitational redshift, the time shown by the clock reveals information about the clock’s location. Thus, according to the quantum complementarity principle, the interference and the quantum wave-nature of the clock will be lost.

In the setup described above, the loss of quantum interference becomes a tool to measure the general relativistic time dilation. It is not even necessary to read out the clock itself: The sheer existence of the clock is sufficient to destroy the interference. But since quantum interference effects are very fragile, it is important to verify that their demise is really caused by the distortion of the flow of time. This can be done by performing the same experiment in two different ways: one where the clock is running, as described above, and one where the clock is “switched off”. In the latter case the quantum interference should become visible, as opposed to the former case.

A further application of the proposed experiment is that it can also test new physical theories. For example, in the context of theories that aim to combine general relativity and quantum mechanics into a single framework, it has been proposed that every particle carries a clock with it, which measures time along its path. Such a possibility can be probed by the proposed experiment without the need to directly measure this hypothetical internal clock: if quantum interference is lost even when the clock controlled by the experimentalist (for example, the aforementioned precession of the particle’s spin) is switched off, one can infer that there is an intrinsic mechanism that keeps track of time by itself. On the other hand, if interference is observed, the existence of such an internal clock can be ruled out.

Another interesting possibility is that the quantum interference persists even with the experimentally controlled clock turned on. This would mean that quantum mechanics or general relativity breaks down when phenomena inherent to both theories become relevant. Such a scale has never been accessible for experimental tests so far.

To experimentally observe the predicted interplay of quantum interference and the gravitational redshift, three parameters are of importance: The height difference of the two locations at which the particle is held in a superposition, the time that the particle is kept in the superposition and the ticking rate of the clock. The larger any of those values, the easier it is to observe the effect. Currently, the most promising systems for such an experiment are single atoms. They can be brought into superpositions in atomic fountains and their internal states can be used as atomic clocks. There are also other systems that can be used to successfully perform the experiment: neutrons, electrons and even large molecules. There has been rapid experimental progress in the precision of clocks and in the size of the superpositions that can be created and maintained in the laboratory. It is therefore possible that within the next few years the proposed experiment with quantum clocks can be realized.

Both quantum mechanics and general relativity seem to be universal theories, though we still don’t know how to properly combine them in a universal framework. New phenomena are expected at some scale at the interplay between the two theories. Only experimentally probing this interplay may give a hint as to how to proceed in constructing a unifying description of nature.

Reference
[1] Magdalena Zych, Fabio Costa, Igor Pikovski & Časlav Brukner. "Quantum interferometric visibility as a witness of general relativistic proper time". Nature Communications, 2:505 doi: 10.1038/ncomms1498 (2011). Full Article: PDF, HTML.



Sunday, December 25, 2011

Entang-bling

Ian Walmsley (left) and Joshua Nunn (right) contemplating the universe.

Authors: Ian Walmsley and Joshua Nunn

Affiliation:
Clarendon Laboratory, Department of Physics, University of Oxford, UK.


More than 70 years ago, Erwin Schrödinger pointed out one (of many) striking features of the quantum mechanics that he’d recently invented: the possibility it allowed for stuff to do things that no one had ever actually seen in real life -- like cats being both dead and alive at the same time. This is one of what could politely be called the ‘interpretational difficulties’ of quantum physics. Familiar everyday objects behave in familiar everyday ways: they don’t engage in the sorts of nonsensical behaviour that Schrödinger’s equation predicts. But, as any physicist will tell you, quantum mechanics is basically not that complicated. It’s just that it takes familiar concepts, like position and direction, and makes us think about them in totally radical ways, so that in the end the results don’t make any sense at all!

Past 2Physics article by Joshua Nunn:
August 07, 2011: "Building a Quantum Internet"


Michael Sprague (top), KC Lee (left) and XianMin Jin (right), in the lab with the diamonds.

Of course, in the early 20th century, people were used to the idea that science was coming up with crazy new notions that dramatically altered our conception of things -- like the notion of time in Einstein’s special theory of relativity. But Einstein’s theories deal with objects that can be seen, on size and time scales that are familiar. The light from stars can be observed with a simple telescope; the timing of GPS satellites is a tangible technical problem. For this reason relativity has entered the scientific orthodoxy, which is why the recent neutrino speeding anomaly caused such a stir [1].

That’s what worried Schrödinger. In principle his theory also dealt with tangible objects — or at least there was no element in it that indicated otherwise. Yet it seemed at first as if quantum mechanics only gave good predictions for objects that are too small to be seen directly. It therefore took on the flavour of a story, in which the actors — electrons, atoms and photons — are convenient fictions we can use to explain what we see, but which are no more real than the characters in a novel.

Quantum theory tells us that in fact these characters can be in two places at once, that they are impossible to pin down exactly, and that they therefore don’t really give well-defined answers to questions like “are you red or blue?”. We’re used to the idea in the ordinary world that an object, say a ball, will have a definite property like color. We may not know whether a specific ball is red or blue, but we regard it as having one of these two colors, independent of whether we know which particular color it is. Quantum mechanics says, well, no, it’s not possible to be sure of that. In the quantum world, balls can be both red and blue at the same time.

But it is not the particles acting on their own that give rise to the deepest mysteries -- it is when they get together that the fun really starts. For instance, it becomes possible to say that the color of one ball is well defined only in relation to a second ball: if one is red, then the other is certainly blue, but neither is definitely red or blue on its own. And you can actually test this proposition with microscopic particles -- like photons (“particles” of light). This is the murky world of “entanglement”, in which pairs of particles are apparently connected across the universe as if by invisible filaments.

You can think about this in terms of what you know about light. Consider a beamsplitter. This is a common optical device: essentially a half-silvered mirror which passes half the incident light and reflects the other half, as shown in Fig. 1. When a single photon encounters a beamsplitter it cannot split itself in two, so it must go one way or the other. Or does it? According to quantum physics it can go both one way and the other. In fact, the beamsplitter transforms the single photon into an entangled state [2]. If we measure whether the photon is transmitted or reflected, we find that each option occurs 50% of the time. So this measurement alone does not help us distinguish between the entangled state and a state in which the input photon definitely goes one way or the other at random.
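In occupation-number notation (our rendering of the standard textbook result, cf. [2]), a single photon entering input mode a of a 50:50 beamsplitter, with nothing in mode b, emerges shared between the output modes c and d:

\[
|1\rangle_{a}|0\rangle_{b} \;\longrightarrow\;
\frac{1}{\sqrt{2}}\Bigl(|1\rangle_{c}|0\rangle_{d} + i\,|0\rangle_{c}|1\rangle_{d}\Bigr),
\]

a single-particle entangled state of the two beams (the relative phase depends on the beamsplitter convention).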

But consider the time reversal of this situation, in which we send a single photon into the back side of the beamsplitter. It, too, would be transmitted or reflected with 50% probability -- unless the photon is in an entangled state of the two input ports. Then, by reversing the argument above, we can see that it is definitely passed through the beamsplitter. Thus, by looking at how often a single input photon is passed by the beamsplitter, we can tell whether or not it was in an entangled state at the input. This kind of magic realism makes physicists (or at least, philosophers of physics) uncomfortable, but the edifice of science survives with such strangeness at its core because quantum effects are confined to the abstract domain of the microscopic, where human experience has no purchase and there can be no direct conflict with our intuitions.

Figure 1: A single photon (filled circle) cannot divide into two when it hits a beam splitter. It must either pass through, or be reflected. According to quantum mechanics, both of these possibilities occur, producing an entangled state, in which a single photon is shared between the two beams after the beam splitter. Running this process in reverse (i.e. from right to left) provides a way to detect entanglement, since only an entangled state will always produce a single photon in the same place on the left.

There is now a strong tradition of research which seeks to bring us face-to-face with our Frankenstein theory by confirming the predictions of quantum mechanics on human scales. The aim is to demonstrate quantum effects such as entanglement with increasingly large objects, containing more and more particles. Although many areas of physics have matured sufficiently that the underlying components of the theory are ‘accepted’, it is known that quantum mechanics as is cannot be the final word, since its predictions conflict with our experience of the world. Either some new physics is needed, or better arguments to explain how to reconcile quantum theory with the rest of the world.

While the philosophical debates smoulder on [3,4], experimentalists have set themselves the task of identifying the conditions under which quantum effects survive into the human realm, leading to behaviour we are not used to seeing in the familiar world of cats and elephants. Considerable progress has been made: large numbers of atoms have been entangled [5,6], and small pieces of solid material — but big enough to see with the naked eye — have been put into quantum superposition states where they were both vibrating and not vibrating at the same time [7].

These breakthroughs showed that quantum effects don’t need to be confined to small numbers of particles, or to particles without mass like photons. But so far, extremely specialised laboratory conditions have been required to observe these effects: very low temperatures just above absolute zero and high vacuum, with no air and no extraneous electric or magnetic fields. The objects were highly delicate composite devices which would not be found in nature, and careful preparation was required in order to keep them isolated from the deleterious effects of the environment.

We recognised that some materials have properties, like vibrations, that naturally lend themselves to realising these conditions in an everyday laboratory setting. These vibrations require a lot of energy to get going, so that ordinary environments at regular temperatures do not excite them. They are by nature in relatively pure quantum states, with no vibrational excitation at all. They may, however, be strongly coupled to their environment in the sense that once excited they quickly decay, so that quantum effects might be present only if we could be quick enough to observe them before they became overwhelmed.

We therefore decided to carry out an ‘easy experiment’ (though only Oxford physicists could be silly enough to think that any experiment is easy); to set one of these vibrations going using a very short duration light pulse from a laser, then to ”watch” it by means of a second short laser pulse acting as a probe of the vibrational motion. We realised that diamond was a naturally-occurring transparent material that was so hard that it could vibrate at a particular, very high-pitched frequency, which could be easily identified in a measurement. The vibrations in diamond last for just 7 picoseconds (1 ps is one thousandth of a nanosecond), so we had to use an ultrafast laser system producing laser pulses shorter than 100 femtoseconds (1 fs is one thousandth of a picosecond!).
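To see why such a vibration starts out in its quantum ground state even at room temperature, a quick Bose-Einstein estimate helps (our own numbers; the ~40 THz optical-phonon frequency of diamond is an assumed textbook value, not quoted in this post):

```python
# Thermal occupation of diamond's optical phonon at room temperature:
# n = 1 / (exp(h*f / (kB*T)) - 1). If n << 1, the mode sits in its
# quantum ground state without any cooling.
import math

h = 6.626e-34    # Planck constant, J*s
kB = 1.381e-23   # Boltzmann constant, J/K
f = 40e12        # diamond optical phonon, ~40 THz (assumed value)
T = 295.0        # room temperature, K

n = 1.0 / math.expm1(h * f / (kB * T))
print(f"mean thermal occupation n ~ {n:.1e}")   # ~ 1e-3: effectively ground state
```

With fewer than one thermal phonon per thousand tries, any phonon we detect is overwhelmingly one we created ourselves with the laser.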

We took an ordinary, common-or-garden diamond and set it vibrating using a laser pulse. When the laser pulse hits the diamond, there is a small probability that just one photon from the pulse gives up some energy to the diamond to set it vibrating. By conservation of energy, the photon must then leave the diamond with reduced energy, and thus a longer wavelength than the original laser photon. By detecting this “red” photon, we could know that a single vibrational quantum (known technically as an optical phonon) had been created in the diamond crystal. We found that even at room temperature and pressure, in a lab with air and other vibrations and cups of tea, we could create this high-frequency vibration.

We could prove this by detecting the vibration with a second laser pulse, arriving after the first but not so long after that the vibration had decayed away. The probe pulse detected the vibration by picking up energy from it, emerging with a shorter wavelength, i.e. blue-shifted in color. So we detected a “red” photon to signal that a phonon had been generated, and a “blue” photon to prove that it was still there. Using this approach, we showed that we could catch a glimpse of the phonon before it vanished [8]. This type of “create-detect” experiment is precisely what has been done with cold clouds of atoms to entangle them, so we thought we would try to do the same!

Figure 2: The happy couple. We took data with the lights off but otherwise the diamonds were in a totally ordinary environment. The lenses are there to focus the laser pulses and collect the photons emitted by the crystals.

In a second experiment [9], we set up two diamonds in ordinary little holders sitting near each other on a lab table (see Fig. 2). By hitting both diamonds with a laser pulse at the same time, we created a vibration in one of the crystals, but it was impossible, even in principle, to tell which crystal was vibrating. We arranged this by combining the beams from the two diamonds on a beamsplitter before the “red” photon detector, as shown in Fig. 3. When this detector fired, we knew that a single phonon had been generated, but we could not tell from which beam the red photon had arrived, and therefore in which diamond the phonon resided.

Quantum mechanics predicts that, if you don’t know this information, the right way to describe the diamonds is as an entangled quantum state, with one vibration shared between them. We then verified that the diamonds were entangled by combining the “blue” light from the diamonds at a beamsplitter (see Fig.3). We could detect first that each pulse only contained a single “blue” photon, and second that it was always passed by the beamsplitter, rather than reflected. This is only possible for a single photon if it is entering the beamsplitter in an entangled state, as argued previously, and thus was emitted from both diamonds! This means that the diamonds themselves were entangled, with a single vibration shared between both of them.

These results show, for the first time, that large, easily visible, solid objects (indeed, diamonds are naturally occurring minerals: pieces of rock), sitting in ambient conditions at room temperature and pressure, clamped to a table-top, can be put in a quantum-entangled state. Furthermore, the entanglement was created with vibrations — the motion of the crystals as a whole.

Figure 3: Generating and detecting entanglement between diamonds using ultrashort laser pulses (green lines). (a) The first set of pulses produces a single red-shifted photon from one of the diamonds. After the beams are mixed on a beam splitter, it is impossible to tell which diamond the photon came from. This means there is one vibration shared between the two diamonds — they are entangled. (b) After a small delay, we verify the entanglement by sending in a second pair of pulses, producing a blue-shifted photon in an entangled state. When this entangled state hits the beam splitter, the blue photon always emerges from just one side of the experiment (thick blue line). As shown in Fig.1, this can only happen if the diamonds are entangled.

So the positions of the atoms were entangled. This is particularly unsettling because we have an intuitive sense for position that we would not have if we had entangled magnetic fields or photons. Our measurements are, we feel, one of the most visceral demonstrations to date that the rules of quantum mechanics apply to us all: electrons and elephants alike.

References
[1] OPERA Collaboration, "Measurement of the neutrino velocity with the OPERA detector in the CNGS beam". arXiv:1109.4897 (2011). Link.
[2] S.J. van Enk. "Single-particle entanglement". Physical Review A, 72(6):064306 (2005). Abstract.
[3] D. Wallace. "Decoherence and Ontology: or, How I Learned to Stop Worrying and Love FAPP", in "Many Worlds? Everett, Quantum Theory, and Reality", eds. S. Saunders, J. Barrett, A. Kent and D. Wallace (OUP, 2010). Link.
[4] R. Penrose. "Wavefunction collapse as a real gravitational effect". Mathematical Physics 2000 (edited by A Fokas, A Grigoryan, T Kibble, B Zegarlinski), pages 266–282, (World Scientific eBooks, 2000). Link.
[5] K. S. Choi, H. Deng, J. Laurat, and H. J. Kimble. "Mapping photonic entanglement into and out of a quantum memory". Nature, 452:67–71 (2008). Abstract.
[6] H. Krauter, C.A. Muschik, K. Jensen, W. Wasilewski, J.M. Petersen, J.I. Cirac, and E.S. Polzik. "Entanglement generated by dissipation and steady state entanglement of two macroscopic objects". Physical Review Letters, 107(8):80503 (2011). Abstract.
[7] Adrian Cho, "Faintest thrum heralds quantum machines". Science, 327(5965):516–518 (2010). Abstract.
[8] K. C. Lee, B. J. Sussman, M. R. Sprague, P. Michelberger, K. F. Reim, J. Nunn, N. K. Langford, P. J. Bustard, D. Jaksch, and I. A. Walmsley. "Macroscopic nonclassical states and terahertz quantum processing in room- temperature diamond", Nature Photonics, 6, 41-44 (2011). Abstract.
[9] K. C. Lee, M. R. Sprague, B. J. Sussman, J. Nunn, N. K. Langford, X.-M. Jin, T. Champion, P. Michelberger, K. F. Reim, D. England, D. Jaksch, I. A. Walmsley. "Entangling macroscopic diamonds at room temperature". Science, 334(6060):1253–1256 (2011). Abstract.
