

Sunday, November 13, 2011

A New Scheme for Photonic Quantum Computing

[From Left to Right] Nathan K. Langford, Sven Ramelow and Robert Prevedel


Authors: Nathan K. Langford, Sven Ramelow and Robert Prevedel

Affiliation: Institute for Quantum Optics and Quantum Information (IQOQI), Austria;
Vienna Center for Quantum Science and Technology, Faculty of Physics, University of Vienna, Austria

Quantum computing is a fascinating and exciting example of how future technologies might exploit the laws of quantum physics [1]. Unlike a normal (“classical”) computer, which stores information in 0s and 1s (called “bits”), a quantum computer stores information in quantum bits (“qubits”), the states of quantum systems such as atoms or photons. In principle, a quantum computer can solve exactly the same problems as a classical computer, so why do we think they could be so fantastic? It all comes down to speed – that is, in the context of computing, how many elementary computational steps are required to find an answer.
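To make this concrete: whereas a bit is always either 0 or 1, a qubit can exist in a superposition of both possibilities,

|ψ⟩ = α|0⟩ + β|1⟩, with |α|² + |β|² = 1,

and a register of n qubits can be in a superposition of all 2ⁿ bit strings at once. This, together with entanglement between qubits, is the resource behind the speed-ups discussed below.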

Past 2Physics articles by Robert Prevedel:
October 23, 2011: "Heisenberg’s Uncertainty Principle Revisited"
by Robert Prevedel
June 08, 2007: "Entanglement and One-Way Quantum Computing"
by Robert Prevedel and Anton Zeilinger


For many different types of problems, classical computers are already fast – meaning that reasonable problems can be solved in a reasonable time, and that the time required for a “larger” problem increases only slowly with the size of the problem (known as “scaling”). For example, once you know how to add 17 and 34, it’s not that much more difficult to add 1476 and 4238. For such problems, quantum computers can’t really do any better. Some types of problems, however, can be solved much faster on a quantum computer than on a classical one. In fact, a quantum computer could perform some tasks that are effectively impossible for any conceivable classical computer. The most famous example is Shor’s algorithm for finding the prime factors of a large integer [2], a problem which lies at the heart of many important computing tasks. It’s straightforward to work out that the factors of 21 are 3 and 7, but it’s already much harder to work out that the factors of 4897 are 59 and 83, and modern data protection (RSA encryption) relies on this problem becoming effectively impossible on a classical computer for really big numbers (several hundred decimal digits long). But that would not be true for a quantum computer: it turns out that quantum computers could achieve an enormous speed-up, thanks to the uniquely quantum features of superposition and entanglement.
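To get a feel for the scaling, here is a minimal sketch (our illustration, not from the paper) of the simplest classical factoring method, trial division. Its running time grows like the square root of N – that is, exponentially in the number of digits. More sophisticated classical algorithms do better, but all known ones still scale super-polynomially, whereas Shor’s algorithm runs in polynomial time.

def smallest_factor(n):
    # Brute-force search for the smallest prime factor of n.
    f = 2
    while f * f <= n:
        if n % f == 0:
            return f
        f += 1
    return n  # n itself is prime

for n in (21, 4897):
    p = smallest_factor(n)
    print(f"{n} = {p} x {n // p}")
# Output: 21 = 3 x 7, then 4897 = 59 x 83.
# For a 100-digit n, this loop would need around 10^50 iterations, which is hopeless.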

Shor’s algorithm is a great example of the revolutionary potential of technologies built on quantum physics. The problem with such technologies, however, is that quantum systems are incredibly hard to control reliably. Classical computers are an astonishingly advanced technology: classical information can be stored almost indefinitely, and the elementary computational gates which manipulate the information work every time. “It just works.” [3] By contrast, quantum information is incredibly fragile – you can destroy it literally by looking at it the wrong way! This places extremely stringent demands on what is required to control it and make it useable. In 1998, David DiVincenzo outlined a minimal set of criteria for building a scaleable quantum computer [4] – well-characterised qubits that can be reliably initialised and measured, long coherence times, and a universal set of quantum gates – and since then experimentalists from all corners of physics have been working to fulfil them.

One of the most promising architectures for quantum information processing (QIP), and in particular quantum computing, is to encode information in single photons. Because photons generally interact very weakly with their environment, they can store and transmit quantum information without it being degraded, provided they are not accidentally absorbed. But this strength also creates its own problems, which arise when you want to create, manipulate or measure that information: because a single photon also interacts only weakly with atoms or other photons, it is very hard to do these things efficiently. And efficiency is the key to the whole idea of quantum computing, because the enormous quantum speed-up can only be achieved if the basic building blocks work efficiently. This is the biggest challenge for photonic QIP: current schemes for preparing single photons are inefficient, and linear-optics gates are inherently probabilistic [5]. Pioneering work by Knill, Laflamme and Milburn showed how to overcome these problems in principle [6], but at an enormous cost in physical resources (gates, photons, etc.), which makes their approach almost completely infeasible in practice. The main goal of our approach is to make photons talk to each other efficiently.
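To see why this matters so much, note that probabilities multiply: a circuit built from n probabilistic gates succeeds only when every gate fires, so its overall success probability is p^n. A back-of-the-envelope sketch (our illustration; p = 1/9 is the success probability commonly quoted for a postselected linear-optics controlled-phase gate, and the circuit sizes are arbitrary):

def circuit_success(p, n):
    # Overall success probability of n independent probabilistic gates.
    return p ** n

p = 1 / 9  # typical postselected linear-optics two-qubit gate
for n in (1, 10, 50):
    print(f"{n:2d} gates: success probability {circuit_success(p, n):.2e}")
# 1 gate:   1.11e-01
# 10 gates: 2.87e-10
# 50 gates: 1.96e-48
# With deterministic gates (p = 1) the product stays at 1 for any circuit size.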

In a recent paper [7], we introduce a new approach to photonic QIP – coherent photon conversion (CPC) – which is based on an enhanced form of nonlinear four-wave mixing and fulfils all of the DiVincenzo criteria. In photonic QIP experiments, nonlinear materials are commonly used to provide probabilistic sources of single-photon states. When a strong laser beam is shone into such a material, the nonlinear interaction very occasionally causes a laser photon (typically around one in a billion) to split into two photons, making a very inefficient source of heralded photons. We looked instead at what would happen if we used a single photon in place of the strong laser beam. Surprisingly, we found that, if the interaction can be made strong enough, the efficiency of photon splitting should rise to 100% – something that is impossible with a laser input. In fact, we found that the same type of interaction can be used to provide a whole range of “deterministic” tools (tools that work with 100% efficiency), including entangling multiphoton gates, heralded multiphoton sources and efficient detectors – the basic building blocks required for scaleable quantum computing. Some of these are shown and briefly described in Fig. 1.
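A toy model of why a single-photon input can be converted with unit efficiency (our simplified illustration, not a calculation from the paper): with exactly one excitation present, the nonlinear coupling connects just two quantum states – the unconverted photon and the converted photon pair – and the system Rabi-oscillates between them, with conversion probability sin²(gt). Choosing the interaction time (or length) so that gt = π/2 therefore gives 100% conversion, whereas a coherent laser input can never be fully converted. The coupling g below is in arbitrary units, an assumption for illustration only.

import math

g = 1.0  # effective nonlinear coupling strength (arbitrary units, assumed)
# In the single-excitation subspace the two amplitudes obey
#   dc1/dt = -i*g*c2,   dc2/dt = -i*g*c1,
# whose solution gives a conversion probability |c2(t)|^2 = sin^2(g*t).
for step in range(5):
    t = step * math.pi / (8 * g)
    print(f"g*t = {g*t:.3f}  conversion probability = {math.sin(g*t)**2:.3f}")
# Rises monotonically from 0 to exactly 1 at g*t = pi/2.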

Figure 1: Fulfilling the DiVincenzo criteria with CPC. (a) A deterministic photon-photon interaction (controlled-phase gate) based on a novel geometric phase effect. (b) A scaleable element for deterministic photon doubling, which can be used in a photon-doubling cascade (c) to create arbitrarily large multiphoton states.

Perhaps the most remarkable thing about CPC is that it should be possible to build a single all-optical device, with four I/O ports, which can provide all of these building blocks just by varying what type of light is sent into each port. And because of the underlying four-wave mixing nonlinearity, it could be compatible with current telecommunications technology and perhaps even be built entirely on a single photonic chip. This could make it much easier to build more complex networks.

Figure 2: Nonlinear photonic crystal fibre pumped by strong laser pulses (7.5 ps at 532 nm) to create the nonlinearity required for CPC.

To demonstrate the feasibility of our proposed approach, we performed a first series of experiments using off-the-shelf photonic crystal fibres (see Fig. 2) to observe the nonlinear process underlying the CPC scheme. The next step is to optimise the nonlinear coupling, both by improving the materials and by engineering a better design. While deterministic operation has yet to be achieved, our results suggest that it should be feasible with sophisticated current technology, such as chalcogenide glasses, which are highly nonlinear and can be used to make both optical fibres and chip-based integrated waveguides [8].

Finally, we hope that CPC will be a useful technique for implementing coherent, deterministic multiphoton operations both for applications in quantum-enhanced technologies and for fundamental tests involving entanglement and large-scale quantum systems. Interestingly, the general idea of “coherent photon conversion” can also be implemented in physical systems other than photons, such as in optomechanical, electromechanical and superconducting systems where the intrinsic nonlinearities available are even stronger.

References
[1] R.P. Feynman, "Simulating Physics with Computers". International Journal of Theoretical Physics, 21, 467–488 (1982). Article (PDF).
[2] P.W. Shor, "Algorithms for quantum computation: Discrete logarithms and factoring". In Proceedings of the 35th Annual Symposium on Foundations of Computer Science, page 124 (IEEE Computer Society Press, Los Alamitos, 1994). Abstract.
[3] Steve Jobs (2011). YouTube Video.
[4] D.P. DiVincenzo, D. Loss, "Quantum information is physical". Superlattices and Microstructures, 23, 419–432 (1998). Abstract. arXiv:cond-mat/9710259.
[5] P. Kok, W.J. Munro, K. Nemoto, T.C. Ralph, J.P. Dowling, G.J. Milburn, "Linear optical quantum computing with photonic qubits". Reviews of Modern Physics, 79, 135–174 (2007). Abstract.
[6] E. Knill, R. Laflamme, G.J. Milburn, "A scheme for efficient quantum computation with linear optics". Nature, 409, 46–52 (2001). Abstract.
[7] N.K. Langford, S. Ramelow, R. Prevedel, W.J. Munro, G.J. Milburn, A. Zeilinger, "Efficient quantum computing using coherent photon conversion". Nature, 478, 360–363 (2011). Abstract.
[8] B.J. Eggleton, B. Luther-Davies, K. Richardson, "Chalcogenide photonics". Nature Photonics, 5, 141–148 (2011). Abstract.
