
Sunday, May 27, 2012

Free Randomness Can Be Amplified

Author: Roger Colbeck

Affiliation: Institute for Theoretical Physics, ETH Zurich, Switzerland

Are there fundamentally random processes in Nature? 150 years ago, scientists with a classical world-view would likely have answered in the negative: the laws of classical mechanics state that if one knew all the physical properties of every particle at some particular instant in time, then, in principle, the future evolution could be calculated. From the beginning of the 20th century, however, this world-view was challenged by the birth of quantum theory. This profoundly different theory asserts that the outcomes of measurements are fundamentally random. So, when a single photon is emitted from a source and sent through a half-silvered mirror, for example, all quantum theory tells us is that with probability 1/2 the photon is reflected, and with probability 1/2 it passes through; this distribution can be confirmed statistically.
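
To see what "confirmed statistically" means in practice, here is a minimal Python sketch of the beamsplitter experiment; the function name and photon counts are purely illustrative, and the pseudorandom generator merely mimics the quantum statistics.

```python
import random

def beamsplitter(n_photons, seed=0):
    """Monte Carlo stand-in for photons hitting a half-silvered mirror:
    each photon is reflected with probability 1/2, transmitted otherwise."""
    rng = random.Random(seed)
    reflected = sum(rng.random() < 0.5 for _ in range(n_photons))
    return reflected, n_photons - reflected

r, t = beamsplitter(100_000)
print(f"reflected fraction:   {r / 100_000:.3f}")   # ~0.500
print(f"transmitted fraction: {t / 100_000:.3f}")   # ~0.500
```

Note that this simulation is itself deterministic once the seed is fixed: it reproduces the right statistics without any genuine randomness, which is precisely the kind of hidden determinism the next question raises.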

But is that the whole story? How can we be sure that the destiny of the photon (whether it will pass through the mirror or be reflected) wasn't determined in advance, in such a way that observations on many photons nevertheless reproduce the statistics of a genuinely random process?

In 1964, the work of John Bell [1] shed some light on the question of whether there could be a deeper explanation underlying the quantum statistics. By studying an extended experiment, involving two entangled photons sent towards two half-silvered mirrors, he showed that if the source determined the behaviour of the photons, then the resulting correlations could not be those predicted by quantum mechanics. Experiments later confirmed the quantum predictions, e.g. [2].
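
To make Bell's argument concrete, the sketch below uses the CHSH form of the inequality (a standard two-setting variant, not Bell's original formulation): over all local deterministic strategies the CHSH expression is bounded by 2, while the quantum correlations of an entangled pair reach 2√2.

```python
from itertools import product
from math import cos, pi, sqrt

# A local deterministic strategy pre-assigns an outcome (+1 or -1) to each
# of the two measurement settings on each side; there are 16 such strategies.
# CHSH expression: S = E(a,b) - E(a,b') + E(a',b) + E(a',b').
classical_max = max(
    abs(A1 * B1 - A1 * B2 + A2 * B1 + A2 * B2)
    for A1, A2, B1, B2 in product((+1, -1), repeat=4)
)
print("local deterministic bound:", classical_max)   # 2

# Quantum correlation for the singlet state, with measurements along
# coplanar directions at angles x and y: E(x, y) = -cos(x - y).
def E(x, y):
    return -cos(x - y)

a, a2, b, b2 = 0.0, pi / 2, pi / 4, 3 * pi / 4
S = E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)
print("quantum value:", abs(S), "vs", 2 * sqrt(2))   # both ~2.828
```

Shared classical randomness cannot help either: S is linear in the strategy, so mixtures of the 16 deterministic strategies never exceed the bound of 2.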

It is tempting to use the above to argue for the existence of fundamentally random processes, but there is a catch. Bell's argument relies on different configurations of the half-silvered mirrors, and he assumes that these are chosen at random. Thus, for the purpose of arguing that there are truly random processes (note that this wasn't Bell's aim), the argument is circular. If we can randomly choose the configurations, then the outcomes are random, as was previously stressed by Conway and Kochen [3].

In our paper [4], we show that if we have access only to some weak randomness to choose the configurations, then the outcomes of certain quantum experiments are nevertheless completely random. To capture the idea of weak randomness, imagine that you write down a string of 0s and 1s, choosing them as randomly as you can. However, before you write each bit, a sophisticated machine is asked to guess your next choice. If your choices are only weakly random, then the machine can guess the next one with some probability greater than 1/2. What we show in our paper is that, provided the machine's guessing probability is not too high, it is possible to generate random bits about which the machine knows nothing: this is randomness amplification.
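
This notion of weak randomness matches what is known as a Santha-Vazirani source [5]. The sketch below is a hypothetical model of such a source, in which every bit agrees with the machine's guess with probability 1/2 + ε; the names and the particular guessing strategy are purely illustrative.

```python
import random

def weak_source(n_bits, epsilon, predictor, seed=0):
    """Model of a Santha-Vazirani-style weak source: conditioned on the
    history, each bit matches the machine's guess with probability
    1/2 + epsilon, so the machine is right more often than chance."""
    rng = random.Random(seed)
    bits, correct = [], 0
    for _ in range(n_bits):
        guess = predictor(bits)                    # machine guesses first
        bit = guess if rng.random() < 0.5 + epsilon else 1 - guess
        correct += (bit == guess)
        bits.append(bit)
    return bits, correct / n_bits

# A (hypothetical) machine strategy: guess that each bit repeats the last.
bits, rate = weak_source(100_000, epsilon=0.1,
                         predictor=lambda h: h[-1] if h else 0)
print(f"machine's guessing rate: {rate:.3f}")      # ~0.600 = 1/2 + epsilon
```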

[Image credit: T. Neupert]: Illustration of randomness amplification: a moving die about to strike an assembled tower. The tower falls, yielding random numbers on several dice, thus amplifying the randomness of the original die. In classical physics, the apparently random way the dice fall is in principle predictable, while, in quantum theory, there are ways to make this amplification fundamental.

It is interesting to note that this is not possible within classical mechanics: there, given a source of weak randomness, no protocol can improve the quality of the randomness [5]. Thus, the task we present gives a new example of the advantage quantum systems offer over classical ones for information processing.
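
At toy scale, this classical impossibility can even be checked by brute force. The sketch below (an illustration under the Santha-Vazirani model, not the argument used in [5]) enumerates every function f: {0,1}^3 -> {0,1} and computes, by a small recursion, the bias an adaptive adversary can force on f's output when each source bit may deviate from uniform by at most ε.

```python
from itertools import product

EPS, N = 0.1, 3
INPUTS = list(product((0, 1), repeat=N))

def extreme_prob(f, prefix, maximize):
    # Best achievable P(f(X) = 1) over adaptive weak sources: each bit's
    # conditional probability of being 1 lies in [1/2 - EPS, 1/2 + EPS].
    # The objective is linear in that probability, so the optimum sits
    # at an endpoint of the interval.
    if len(prefix) == N:
        return float(f[prefix])
    hi = extreme_prob(f, prefix + (1,), maximize)
    lo = extreme_prob(f, prefix + (0,), maximize)
    options = ((0.5 + EPS) * hi + (0.5 - EPS) * lo,
               (0.5 - EPS) * hi + (0.5 + EPS) * lo)
    return max(options) if maximize else min(options)

biases = []
for table in product((0, 1), repeat=2 ** N):   # all 2^(2^N) = 256 functions
    f = dict(zip(INPUTS, table))
    bias = max(extreme_prob(f, (), True) - 0.5,
               0.5 - extreme_prob(f, (), False))
    biases.append(bias)

print(min(biases))   # 0.1: even the best f leaves a bias of at least EPS
```

Roughly speaking, quantum protocols escape this barrier because the measurement outcomes are not a fixed function of the weak source's bits.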

We conjecture that our result can be extended so that, provided the machine cannot guess the choices perfectly, it is possible to generate perfectly random bits. This would provide the strongest possible evidence for the existence of random processes in Nature: it would show that either the world is completely deterministic, or there are perfectly random processes.

This work also has applications in virtually any scenario that relies on randomness. For example, a casino that doesn't completely trust its random number generators could in principle use a protocol of the type we suggest to improve the quality of the randomness.

References:
[1] Bell, J. "On the Einstein Podolsky Rosen Paradox". Physics, 1, 195--200 (1964). Full Article.
[2] Aspect, A., Grangier, P. & Roger, G. "Experimental Realization of Einstein-Podolsky-Rosen-Bohm Gedankenexperiment: A New Violation of Bell's Inequalities". Physical Review Letters, 49, 91--94 (1982). Abstract.
[3] Conway, J. & Kochen, S. "The Free Will Theorem". Foundations of Physics, 36, 1441--1473 (2006). Abstract.
[4] Colbeck, R. & Renner, R. "Free Randomness Can Be Amplified". Nature Physics, doi:10.1038/nphys2300 (published online May 6, 2012). Abstract.
[5] Santha, M. & Vazirani, U. V. "Generating Quasi-Random Sequences from Slightly-Random Sources". In Proceedings of the 25th IEEE Symposium on Foundations of Computer Science (FOCS-84), 434--440 (1984). Abstract.
