
2Physics


Saturday, May 30, 2009

Transmission of Entangled Photons
over a High-Loss Free-Space Channel



Image 1: Rupert Ursin

[This is an invited article based on a recently published work of the authors and their collaborators -- 2Physics.com]

Authors: Alessandro Fedrizzi¹, Rupert Ursin¹ and Anton Zeilinger¹,²
Affiliations:
¹Institute for Quantum Optics and Quantum Information (IQOQI), Austria
²Faculty of Physics, University of Vienna, Austria

Entanglement is an essential phenomenon of quantum mechanics. Two entangled particles, photons for example, will individually yield random results upon being measured, but these results will always be perfectly correlated, no matter how far the two particles are separated from each other. Entanglement has been proven to be at the heart of a wide range of fundamental quantum effects and it drives exciting practical applications, such as quantum cryptography, quantum teleportation and quantum computing [1], which would be impossible in a world limited to classical physics. A team of researchers from the Institute for Quantum Optics and Quantum Information (IQOQI) in Vienna and the University of Vienna, led by Anton Zeilinger, has now reported the successful transmission of entangled photon pairs between two Canary Islands, bridging a distance of 144 km and a two-photon attenuation of almost ten million to one. The result, published in Nature Physics [2], is so far the most convincing demonstration of the feasibility of performing experiments with entangled photons in space.

More information on this endeavor can be found at http://www.quantum.at.

Past 2Physics articles by members of this collaboration:
"The Frontier of Quantum Communication is the Space" -- Paolo Villoresi,
"Entanglement and One-Way Quantum Computing" -- Robert Prevedel and Anton Zeilinger


Image 2: Canary Islands (Google Maps). Entangled photons were sent from La Palma to Tenerife (distance: 144 km).

In their experiment, the researchers exploited a new design of a high-intensity source of entangled photons, which allowed photon-pair production rates of about one million per second [3]. This photon source was located on the island of La Palma, one of the Canary Islands (see Image 2). From there, both photons of each entangled pair were transmitted through a home-built twin telescope (Image 3) and sent to the island of Tenerife, 144 km away, where they were collected by the Optical Ground Station (OGS, Image 4), a research observatory operated by the European Space Agency. This unique location was chosen because of the excellent conditions for free-space experiments: clean air, an unobstructed view over a very long distance and the availability of a high-tech, large-aperture telescope that could be used as a receiver.

Once the primary mirror of the OGS had collected the photon pairs, the photons were guided to an experimental chamber by a series of mirrors, then split and individually analyzed and detected. The trickiest part in free-space experiments with single photons is certainly to find the individual photons amidst the background light. Fortunately, entangled photons are produced at exactly the same time. If, therefore, two detectors click at exactly the same time (in practice within a time window of 1 nanosecond, one billionth of a second), one can be quite sure that the clicks were really produced by the two photons of an entangled pair. A series of polarization correlation measurements made it possible to show that the received photon pairs were still just as highly entangled as when they were produced by the source, with the entanglement quality limited only by background noise. This is astounding, as the photons experienced a rough ride during their flight time of about half a millisecond through the turbulent atmosphere, the longest recorded lifetime for entangled photons so far.
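
To illustrate the idea of coincidence filtering, here is a minimal sketch of how one might count detector clicks that fall within a 1 ns window of each other. The function name, the timestamp format (nanoseconds) and the example values are illustrative assumptions, not the actual analysis software used in the experiment.

```python
# Hypothetical sketch of coincidence counting between two detector channels.
# Timestamps are assumed to be in nanoseconds; real time-tagging hardware and
# the group's actual analysis pipeline will differ.

def count_coincidences(channel_a, channel_b, window_ns=1.0):
    """Count timestamp pairs from two channels that agree to within window_ns."""
    a, b = sorted(channel_a), sorted(channel_b)
    i = j = coincidences = 0
    while i < len(a) and j < len(b):
        dt = a[i] - b[j]
        if abs(dt) <= window_ns:   # clicks close enough in time: count as a photon pair
            coincidences += 1
            i += 1
            j += 1
        elif dt > 0:               # b's click is earlier: advance b
            j += 1
        else:                      # a's click is earlier: advance a
            i += 1
    return coincidences

# Example: two genuine pairs plus uncorrelated background clicks
print(count_coincidences([10.0, 55.3, 120.7], [10.4, 80.0, 120.2]))  # -> 2
```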

Image 3: Thomas Herbst aligning the home-built transmitter telescope at La Palma.

Previous proof-of-principle experiments for quantum information in space by the same group and their international collaborators* include the transmission of only one photon of an entangled pair over the same free-space link [4], quantum key distribution with weak coherent pulses [5] and an experiment in which single photons were bounced off a retro-reflecting mirror mounted on a satellite orbiting at an altitude of about 6000 km [6].

The conditions in the new experiment were very close to those expected for a downlink from a satellite to two separate receiver stations on the ground. In particular, the high attenuation (10⁷:1 for photon pairs) the photons were exposed to was similar to that expected in a space scenario. The atmospheric turbulence along the 144 km flight path was in fact much stronger than it would be along a satellite downlink, because the atmosphere thins out rapidly at higher altitude and the optical density of the atmosphere along a vertical trajectory into space is equivalent to that of a mere 7 km long horizontal path through the atmosphere. Moreover, the researchers have shown that observatories like the OGS, which was originally built for classical laser communication and is perfectly suited to track a fast-moving object in orbit, can be adapted for quantum optics experiments.
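
As a rough back-of-the-envelope check (my own arithmetic, not a figure quoted in the paper), the reported two-photon attenuation translates into a link loss of about 70 dB for the pair, or roughly 35 dB per photon if the loss is shared evenly between the two channels:

\[ L_{\text{pair}} = 10\,\log_{10}\!\big(10^{7}\big) = 70\ \text{dB}, \qquad L_{\text{photon}} \approx \tfrac{1}{2}\,L_{\text{pair}} \approx 35\ \text{dB}. \]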

Image 4: Receiver telescope in the optical ground station, Tenerife. Incoming photons are collected by this 1-meter mirror telescope and then guided to the analysis and detection apparatus.

Moving entanglement from ground-based laboratories into space will eventually enable experiments on a much larger distance scale than is currently possible on the ground. A low-flying spacecraft such as the International Space Station (ISS) would be able to transmit photons to ground observers separated by more than 1000 km. This would benefit tests of the foundations of quantum mechanics as well as practical applications of entangled photons in space, such as long-distance quantum cryptography, the secure distribution of keys which can be used to encrypt messages [7]. Once this initial step into space has been mastered, experiments between two or more moving satellites will allow relativistic tests of quantum mechanics as well as experimental tests of entanglement in gravitational fields [8].
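
A rough geometric estimate (my own, not a figure from the article) shows why separations above 1000 km are plausible: a spacecraft at an ISS-like altitude of about 400 km is simultaneously above the horizon for two ground stations separated by up to roughly twice the horizon distance sqrt(2·R_E·h), although practical elevation-angle constraints reduce this considerably.

```python
import math

R_E = 6371e3   # mean Earth radius in metres
h = 400e3      # assumed ISS-like orbital altitude in metres

# Line-of-sight horizon distance from altitude h (spherical Earth, no obstructions)
d_horizon = math.sqrt(2 * R_E * h)

# Two stations placed on opposite horizons give the maximum geometric separation
max_separation_km = 2 * d_horizon / 1e3
print(f"maximum geometric separation: {max_separation_km:.0f} km")  # roughly 4500 km
```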

To eventually make this vision a reality, the group in Vienna and their international partners, which include universities, space industry and the European Space Agency, have already started working on a first demonstration prototype of an entangled-photon source that could be integrated into a space-borne terminal. The schedule is compatible with a launch into space within the next decade.

This work was supported by the Austrian Research Promotion Agency (FFG) and the European Space Agency (ESA).

* In collaboration with LMU Munich and MPQ Garching, Germany, the University of Bristol, UK, the University of Padova, Italy and the European Space Agency.

References:
[1] “The Physics of Quantum Information: Quantum Cryptography, Quantum Teleportation, Quantum Computation”, D. Bouwmeester, A. Ekert and A. Zeilinger (Springer, Berlin, 2001).
[2] “High-fidelity transmission of entangled photon pairs over a high-loss free-space channel”, A. Fedrizzi, R. Ursin, T. Herbst, M. Nespoli, R. Prevedel, T. Scheidl, F. Tiefenbacher, T. Jennewein and A. Zeilinger, Nature Physics, doi:10.1038/nphys1255 (2009).
Abstract.
[3] “A wavelength-tunable fibre-coupled source of narrowband entangled photons”, A. Fedrizzi, T. Herbst, A. Poppe, T. Jennewein and A. Zeilinger, Optics Express, 15, 15377–15386 (2007).
Abstract.
[4] “Entanglement-based quantum communication over 144 km”, R. Ursin, F. Tiefenbacher, T. Schmitt-Manderbach, H. Weier, T. Scheidl, M. Lindenthal, B. Blauensteiner, T. Jennewein, J. Perdigues, P. Trojek, B. Ömer, M. Fürst, M. Meyenburg, J. Rarity, Z. Sodnik, C. Barbieri, H. Weinfurter, A. Zeilinger, Nature Physics, 3, 481–486 (2007).
Abstract
[5] “Experimental demonstration of free-space decoy-state quantum key distribution over 144 km”, T. Schmitt-Manderbach, H. Weier, M. Fürst, R. Ursin, F. Tiefenbacher, T. Scheidl, J. Perdigues, Z. Sodnik, C. Kurtsiefer, J. G. Rarity, A. Zeilinger, H. Weinfurter, Phys. Rev. Lett. 98, 010504 (2007).
Abstract.
[6] "Experimental verification of the feasibility of a quantum channel between space and Earth", P Villoresi, T Jennewein, F Tamburini, M Aspelmeyer, C Bonato, R Ursin, C Pernechele, V Luceri, G Bianco, A Zeilinger and C Barbieri, New J. Phys., v10, 033038 (2008).
Abstract. 2Physics Article.
[7] “Space-QUEST: Experiments with quantum entanglement in space”, R. Ursin, T. Jennewein, J. Kofler, J. M. Perdigues, L. Cacciapuoti, C. J. de Matos, M. Aspelmeyer, A. Valencia, T. Scheidl, A. Fedrizzi, A. Acin, C. Barbieri, G. Bianco, C. Brukner, J. Capmany, S. Cova, D. Giggenbach, W. Leeb, R. H. Hadfield, R. Laflamme, N. Lutkenhaus, G. Milburn, M. Peev, T. Ralph, J. Rarity, R. Renner, E. Samain, N. Solomos, W. Tittel, J. P. Torres, M. Toyoshima, A. Ortigosa-Blanch, V. Pruneri, P. Villoresi, I. Walmsley, G. Weihs, H. Weinfurter, M. Zukowski, A. Zeilinger, arXiv:0806.0945.

[8] “Quantum connectivity of space-time and gravitationally induced de-correlation of entanglement”, T.C. Ralph, G. J. Milburn and T. Downes, Phys. Rev. A, 79, 022121 (2009). Abstract.



Saturday, May 23, 2009

The Shadows of Gravity

Jose A. R. Cembranos

[This is an invited article based on the author's recently published work -- 2Physics.com]

Author: Jose A. R. Cembranos
Affiliation:
William I. Fine Theoretical Physics Institute, University of Minnesota in Minneapolis, USA

Many authors have tried to explain the dark sectors of the cosmological model as modifications of Einstein’s gravity (EG). Dark Matter (DM) and Dark Energy (DE) are the main sources of the cosmological evolution at late times. They dominate the dynamics of the Universe at low densities or low curvatures. Therefore it is reasonable to expect that an infrared (IR) modification of EG could lead to a possible solution of these puzzles. However, it is in the opposite limit, at high energies (HE), where EG needs corrections from a quantum approach. These natural ultraviolet (UV) modifications of gravity are usually thought to be related to inflation or to the Big Bang singularity. In a recent work, I have shown that DM can be explained by HE modifications of EG. I have used an explicit model, R² gravity, and studied its possible experimental signatures [1].

Einstein’s General Relativity describes the classical gravitational interaction in a very successful way, in terms of the metric tensor of space-time, through the Einstein-Hilbert action (EHA). This theory is particularly beautiful and the action particularly simple, since it contains only one term, proportional to the scalar curvature. The proportionality parameter that multiplies this term defines Newton’s constant of gravitation and the typical scale of gravity. This magnitude is known as the Planck scale; its approximate energy value is 10¹⁹ giga-electronvolts (GeV), which is equivalent to a distance of about 10⁻³⁵ meters.
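
The quoted energy and length are two expressions of the same scale. Converting with the standard relation ħc ≈ 197 MeV·fm (a textbook value, not a number taken from the article):

\[ \ell_{P} \sim \frac{\hbar c}{E_{P}} \approx \frac{197\ \mathrm{MeV\cdot fm}}{10^{19}\ \mathrm{GeV}} \approx 2 \times 10^{-35}\ \mathrm{m}. \]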

However, the inconsistency of quantum computations within the gravitational theory described by the EHA demands its modification at HE. Quantum radiative corrections produced by standard matter provide divergent terms that are constant, linear, and quadratic in the Riemann curvature tensor of space-time. The constant divergence can be regularized by the renormalization of the cosmological constant, which may explain Dark Energy. The linear term is absorbed in the renormalization of the Planck scale itself. By contrast, the quadratic terms are not included in the standard gravitational action. If these quantum corrections are not cancelled by invoking new symmetries, these terms need to be taken into account for the study of gravity at HE [2]. Indeed, such terms are also produced by radiative corrections coming from EG itself. Unfortunately, unlike the corrections associated with the matter content, the gravitational corrections do not stop at this order: there are cubic terms, quartic terms, and so on. All these local quantum corrections are divergent, and the fact that there is an infinite number of them implies that the theory is non-renormalizable. We know how to deal with gravity as an effective field theory, working order by order, but we cannot access energies higher than the Planck scale using this effective approach [2]. In any case, the Planck scale is very high and so far experimentally unreachable.

Inspired by this effective field theory point of view, which identifies higher-energy corrections with higher-curvature terms, I have studied the viability of a solution to the missing matter problem coming from the UV completion of gravity. As explained above, the first HE modification to EG is provided by the inclusion of terms quadratic in the curvature of the space-time geometry. The most general quadratic action supports, in addition to the usual massless spin-two graviton, a massive spin-two and a massive scalar mode, with a total of eight degrees of freedom (in the physical gauge [3]). In fact, this gravitational theory is renormalizable [3]. However, the massive spin-two gravitons are ghost-like particles that generate new unitarity violations, breaking of causality, and serious instabilities.

In any case, there is a non-trivial quadratic extension of EG that is free of ghosts and phenomenologically viable. It is the so-called R² gravity, since it is defined by adding to the EHA only a term proportional to the square of the scalar curvature. This term by itself does not improve the UV behaviour of EG, but it illustrates the idea in a minimal way. This particular HE modification of EG introduces a new scalar graviton that can provide the solution to the DM problem.
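
For reference, in one common convention (written here only for illustration; normalizations and sign conventions differ between papers), the R² action adds a single quadratic term to the Einstein-Hilbert piece, and the coefficient of that term sets the mass m₀ of the extra scalar mode:

\[ S = \frac{M_{P}^{2}}{2} \int d^{4}x \, \sqrt{-g}\, \left( R + \frac{R^{2}}{6\, m_{0}^{2}} \right). \]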

In this model, the new scalar graviton has a well-defined coupling to the standard matter content, and it is possible to study its phenomenology and experimental signatures [1,3,4]. Indeed, this DM candidate could be considered a superweakly interacting massive particle (superWIMP [5]), since its interactions are gravitational, i.e. it couples universally to the energy-momentum tensor with Planck-suppressed couplings. This means that the new scalar graviton mediates an attractive Yukawa force between two non-relativistic particles, with a strength similar to that of Newtonian gravity. Among other differences, this new component of the gravitational force has a finite range, shorter than 0.1 millimeters, since the new scalar graviton is massive.
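
The finite-range statement can be summarized with the Yukawa-type parametrization commonly used in fifth-force searches (a generic form, not reproduced from the paper), where α is the relative strength of the new component and the range λ is set by the scalar graviton mass m₀:

\[ V(r) = -\,\frac{G\, m_{1} m_{2}}{r}\, \left( 1 + \alpha\, e^{-r/\lambda} \right), \qquad \lambda = \frac{\hbar}{m_{0} c} \lesssim 0.1\ \mathrm{mm}. \]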

This is the most constraining lower bound on the mass of the scalar mode, and it is independent of any assumption about its abundance. Conversely, depending on its contribution to the total amount of DM, its mass is constrained from above. I have shown that it cannot be much heavier than twice the mass of the electron: above that threshold, the graviton decays into electron-positron pairs, and the resulting positrons annihilate, producing a flux of gamma rays that we should have observed. In fact, the SPI spectrometer on the INTEGRAL (International Gamma-ray Astrophysics Laboratory) satellite has observed a flux of gamma rays coming from the galactic centre (GC) whose characteristics are fully consistent with electron-positron annihilation [6].

If the mass of the new graviton is tuned close to the electron-positron production threshold, this line could be the first observation of R² gravity. The same gravitational DM can explain this observation with a less tuned mass and a lower abundance. For heavier masses, the gamma-ray spectrum generated by in-flight annihilation of the positrons with interstellar electrons is even more constraining than the 511 keV photons [7].

For lighter masses, on the other hand, the only decay channel that may be observable is into two photons. It is difficult to detect these gravitational decays in the isotropic diffuse photon background (iDPB) [8]. A more promising analysis is associated with the search for gamma-ray lines from localized sources, such as the GC. The iDPB is a continuum, since it is smeared by the cosmological redshift, but the mono-energetic photons produced by local sources may give a clear signal of R² gravity [1].

In conclusion, I have analyzed the possibility that the origin of DM resides in UV modifications of gravity [1]. Although, strictly speaking, my results are specific to R² gravity, I think they are qualitatively general under a minimal set of assumptions about the gravitational sector. In any case, different approaches can be taken to try to link our ignorance about gravitation with the dark sectors of standard cosmology [9], and it is a very interesting subject which surely deserves further investigation.

This work is supported in part by DOE Grant No. DOE/DE-FG02-94ER40823, FPA 2005-02327 project (DGICYT, Spain), and CAM/UCM 910309 project.

References

[1] J. A. R. Cembranos, ‘Dark Matter from R² Gravity’ Phys. Rev. Lett. 102, 141301 (2009).
Abstract

[2] N. D. Birrell and P. C. W. Davies, ‘Quantum Fields In Curved Space’ (Cambridge Univ. Press, 1982); J. F. Donoghue, ‘General Relativity As An Effective Field Theory: The Leading Quantum Corrections’ Phys. Rev. D 50, 3874 (1994)
Abstract; A. Dobado, et al., ‘Effective Lagrangians for the Standard Model’ (Springer-Verlag, 1997).

[3] K. S. Stelle, ‘Renormalization Of Higher Derivative Quantum Gravity’ Phys. Rev. D 16, 953 (1977)
Abstract; K. S. Stelle, ‘Classical Gravity With Higher Derivatives’ Gen. Rel. Grav. 9, 353 (1978) Abstract.

[4] A. A. Starobinsky, ‘A New Type of Isotropic Cosmological Models Without Singularity’ Phys. Lett. B 91, 99 (1980)
Abstract; S. Kalara, N. Kaloper and K. A. Olive, ‘Theories of Inflation and Conformal Transformations’ Nucl. Phys. B 341, 252 (1990) Abstract; J. A. R. Cembranos, ‘The Newtonian Limit at Intermediate Energies’ Phys. Rev. D 73, 064029 (2006) Abstract.

[5] J. L. Feng, A. Rajaraman and F. Takayama, ‘Superweakly-Interacting Massive Particles’ Phys. Rev. Lett. 91, 011302 (2003)
Abstract; J. A. R. Cembranos, J. L. Feng, A. Rajaraman and F. Takayama, ‘SuperWIMP Solutions to Small Scale Structure Problems’ Phys. Rev. Lett. 95, 181301 (2005) Abstract.

[6] B. J. Teegarden et al., 'INTEGRAL/SPI Limits on Electron-Positron Annihilation Radiation from the Galactic Plane’ Astrophys. J. 621, 296 (2005)
Article.

[7] J. F. Beacom and H. Yuksel, ‘Stringent Constraint on Galactic Positron Production’ Phys. Rev. Lett. 97, 071102 (2006)
Abstract.

[8] J. A. R. Cembranos, J. L. Feng and L. E. Strigari, ‘Resolving Cosmic Gamma Ray Anomalies with Dark Matter Decaying Now’ Phys. Rev. Lett. 99, 191301 (2007)
Abstract; J. A. R. Cembranos and L. E. Strigari, ‘Diffuse MeV Gamma-rays and Galactic 511 keV Line from Decaying WIMP Dark Matter’ Phys. Rev. D 77, 123519 (2008) Abstract.

[9] J. A. R. Cembranos, A. Dobado and A. L. Maroto, ‘Brane-World Dark Matter’ Phys. Rev. Lett. 90, 241301 (2003)
Abstract; ‘Dark Geometry’ Int. J. Mod. Phys. D 13, 2275 (2004) arXiv:hep-ph/0405165.



Saturday, May 16, 2009

Using the Uncertainty Principle to Detect Entanglement of One Photon Shared Among Four Locations

Members of the Caltech team, from left to right: undergraduate Garrett Drayna, postdoctoral scholar Hui Deng, graduate student Kyung Soo Choi, and postdoctoral scholar Scott Papp.

A team of physicists at the California Institute of Technology has demonstrated a new method to detect entanglement in the form of one photon shared among four optical paths. Their work is reported in the May 8 issue of the journal Science [1]. In their experiments, led by H. Jeff Kimble, entanglement is detected using quantum uncertainty relations for the regime of discrete variables, in which photons are taken one by one. Their approach builds on the famous Heisenberg uncertainty principle that places a limit on the precision with which the momentum and position of a particle can be known simultaneously.

Link to Professor Jeff Kimble's Quantum Optics group at Caltech >>
Past 2Physics Articles on the work of this group>>

Entanglement, which lies at the heart of quantum physics, is a state in which the parts of a composite system are more strongly correlated than is possible for any classical counterparts, regardless of the distances separating them. Entanglement in a system with more than two parts, or multipartite entanglement, is a critical tool for diverse applications in quantum information science, such as for quantum metrology, computation, and communication. In the future, a ‘quantum internet’ will rely on entanglement for the teleportation of quantum states from place to place [2].

For some time physicists have studied bipartite entanglement, and techniques for classifying and detecting the entanglement between two parts of a composite system are well known. But that isn’t the case for multipartite states. Their classification is much richer, and detecting their entanglement is extremely challenging.

In the Caltech experiment, a pulse of light was generated containing a single photon—a massless bundle, with both wave-like and particle-like properties, that is the basic unit of electromagnetic radiation. The team split the single photon to generate an entangled state of light in which the quantum amplitudes of the photon propagate among four distinct paths, all at once. This so-called W state plays an important role in quantum information science.
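
Concretely, for one photon shared among four optical paths, the W state referred to here can be written (up to phases) in the path number-state basis as

\[ |W\rangle = \tfrac{1}{2}\left( |1000\rangle + |0100\rangle + |0010\rangle + |0001\rangle \right), \]

where, for example, |1000⟩ denotes the photon occupying the first path and vacuum in the other three.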

To enable future applications of multipartite W states, the entanglement contained in them must be detected and characterized. This task is complicated by the fact that entanglement in W states can be found not only among all the parts, but also among a subset of them. To distinguish between these two cases in real-world experiments, collaborators Steven van Enk and Pavel Lougovski from the University of Oregon developed a novel approach to entanglement detection based on the uncertainty principle. (See also the recent theoretical article by van Enk, Lougovski, and the Caltech group [3].)

The new approach to entanglement detection used in the Caltech experiments makes use of non-local measurements of a photon propagating through all four paths. The measurements indicate whether a photon is present, but not which path it takes. From this information the scientists can estimate the level of correlation in the photon’s paths. Correlations above a certain level signify entanglement among all the paths – even partially entangled W states do not attain a similar level of correlation. A key feature of this approach is that only a relatively small number of measurements must be performed.

Due to their fundamental structure, the entanglement of W states persists even in the presence of some sources of noise. This is an important feature of W states for real-world applications conducted in noisy environments. The Caltech experiments have directly tested this property by disturbing the underlying correlations of the entangled state. When the correlations are purposely weakened, the Caltech team detects a reduction in the number of paths of the optical system that are entangled. Yet, as predicted by the structure of W states, the entanglement amongst a subset of the paths still remains.

The work was funded by the Intelligence Advanced Research Projects Activity, the National Science Foundation, and Northrop Grumman Space Technology.

References
[1] “Characterization of Multipartite Entanglement for One Photon Shared Among Four Optical Modes”

S. B. Papp, K. S. Choi, H. Deng, P. Lougovski, S. J. van Enk, and H. J. Kimble, Science 324, 764 (2009). Abstract.
[2] “The Quantum Internet” H. J. Kimble, Nature 453, 1023 (2008). Abstract.
[3] “Verifying multi-partite mode entanglement of W states”

P. Lougovski, S. J. van Enk, K. S. Choi, S. B. Papp, H. Deng, and H. J. Kimble at http://xxx.lanl.gov/abs/0903.0851.



Saturday, May 09, 2009

Invisibility Cloak for Near-Infrared Light

Xiang Zhang [Photo courtesy: Roy Kaltschmidt/ Lawrence Berkeley National Laboratory]

In a paper published in 'Nature Materials' [1], a team led by Xiang Zhang, a principal investigator with Lawrence Berkeley National Laboratory’s Materials Sciences Division and director of UC Berkeley’s Nano-scale Science and Engineering Center, reported the creation of a “carpet cloak” from nanostructured silicon that conceals the presence of objects placed under it from optical detection in the near-infrared region, at a wavelength range of 1,400–1,800 nm.

Previous demonstrations of cloaking, where objects are rendered invisible at certain frequencies, have been limited to the microwave regime. This new development takes a significant step closer to achieving invisibility at wavelengths approaching those that the human eye can see.

In recent years, several theoretical 'invisibility' schemes have been proposed for cloaking devices using transformation optics and conformal mapping. The necessary medium for enabling precise control over the flow of electromagnetic waves is provided by metamaterials, which gain their properties from their spatially tailored structure rather than directly from their composition. The first microwave cloaking with the use of metamaterials was demonstrated in 2006 by Duke University's Pratt School of Engineering [2], but the realization of cloaking at optical frequencies, a key step towards achieving actual invisibility, has eluded scientists, mainly because the metal elements absorb too much light.

Previous work by Zhang and his group with invisibility devices involved complex metamaterials - composites of metals and dielectrics whose extraordinary optical properties arise from their unique structure rather than their composition. They constructed one material out of an elaborate fishnet of alternating layers of silver and magnesium fluoride, and another out of silver nanowires grown inside porous aluminum oxide. With these metallic metamaterials, Zhang and his group demonstrated that light can be bent backwards, a property unprecedented in nature.

“We have come up with a new solution to the problem of invisibility based on the use of dielectric (nonconducting) materials,” says Zhang. “Our optical cloak not only suggests that true invisibility materials are within reach, it also represents a major step towards transformation optics, opening the door to manipulating light at will for the creation of powerful new microscopes and faster computers.”

[Image courtesy: Thomas Zentgraf] These three images depict how light striking an object covered with the carpet cloak acts as if there were no object being concealed on the flat surface. In essence, the object has become invisible.

The optical 'carpet' cloak designed by the Berkeley team used quasi-conformal mapping to conceal an object that is placed under a curved reflecting surface by imitating the reflection of a flat surface. The cloak consists only of isotropic dielectric materials, which enables broadband and low-loss invisibility at a wavelength range of 1,400–1,800 nm. While the carpet itself can still be seen, the bulge of the object underneath it disappears from view. Shining a beam of light on the bulge shows a reflection identical to that of a beam reflected from a flat surface, meaning the object itself has essentially been rendered invisible.

[Video by Jensen Li] This video shows how a beam of light is obstructed by an object in a flat surface and casts a shadow until the object is cloaked, at which point the light is reflected as if the surface were still perfectly flat.

The new cloak created by Zhang and his team is made exclusively from dielectric materials, which are often transparent at optical frequencies. The cloak was demonstrated in a rectangular slab of silicon (250 nanometers thick) that serves as an optical waveguide, in which light is confined in the vertical dimension but free to propagate in the other two dimensions. A carefully designed pattern of holes - each 110 nanometers in diameter - perforates the silicon, transforming the slab into a metamaterial that forces light to bend like water flowing around a rock. In the experiments reported in Nature Materials, the cloak was used to cover an area that measured about 3.8 microns by 400 nanometers. It demonstrated invisibility at variable angles of light incidence.

[Image courtesy: Xiang Zhang] Image (a) is a schematic diagram showing the cloaked region (marked in green), which resides below the reflecting bump (carpet) and can conceal any arbitrary object by transforming the shape of the bump back into a virtually flat object. Image (b) is a scanning electron microscope image of the carpet-cloaked bump.

Right now the cloak operates for light between 1,400 and 1,800 nanometers in wavelength, which is the near-infrared portion of the electromagnetic spectrum, just slightly longer in wavelength than light that can be seen with the human eye. However, because of its all-dielectric composition and design, Zhang says the cloak is relatively easy to fabricate and should be upwardly scalable. He is also optimistic that, with more precise fabrication, this all-dielectric approach to cloaking should yield a material that operates for visible light - in other words, true invisibility to the naked eye.

“Even with the advances that have been made in optical metamaterials, scaling sub-wavelength metallic elements and placing them in an arbitrarily designed spatial manner remains a challenge at optical frequencies,” says Zhang. “In this experiment, we have demonstrated a proof of concept for optical cloaking that works well in two dimensions. Our next goal is to realize a cloak for all three dimensions, extending transformation optics into potential applications.”

Reference
[1] "An optical cloak made of dielectrics"
Jason Valentine, Jensen Li, Thomas Zentgraf, Guy Bartal & Xiang Zhang,
Nature Materials, Published online: 29 April 2009 doi:10.1038/nmat2461.
Abstract
[2] "Metamaterial Electromagnetic Cloak at Microwave Frequencies"
D. Schurig, J. J. Mock, B. J. Justice, S. A. Cummer, J. B. Pendry, A. F. Starr, D. R. Smith,

Science, Vol. 314, pp. 977 - 980 (2006). Abstract

[We thank Lawrence Berkeley National Laboratory media relations for materials used in this posting]



Saturday, May 02, 2009

Measurement and Control of ‘Forbidden’ Collisions between Fermions could improve Atomic Clock Accuracy

Jun Ye adjusts the laser setup for a strontium atomic clock in his laboratory at Joint Institute for Laboratory Astrophysics (JILA).
[Image credit: J. Burrus/NIST]

In a paper published in the journal Science, a team of researchers led by Jun Ye of JILA, a joint institute of the National Institute of Standards and Technology (NIST) and the University of Colorado (CU) at Boulder, reported measurement and control of seemingly 'forbidden' collisions between neutral strontium atoms — a class of fermions, which are not supposed to collide when in identical energy states. This breakthrough has made possible a significant boost in the accuracy of atomic clocks based on hundreds or thousands of neutral atoms.

Link to Jun Ye Group "AMO Physics & Precision Measurement" >>

JILA's strontium clock is one of several next-generation atomic clocks under development around the world. These experimental clocks are based on a variety of different atoms and designs, from single ions (electrically charged atoms) to thousands of neutral atoms; it is not yet clear which design will emerge as the best and be chosen as the future international time standard. The latest JILA work helps eliminate a significant drawback to clock designs based on ensembles of neutral atoms. The presence of many atoms increases both the precision and signal of a clock based on the oscillations between energy levels, or "ticks," in those atoms. However, uncontrolled interactions between atoms can perturb their internal energy states and shift the number of clock ticks per second, reducing overall accuracy.
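
The advantage of many atoms can be made quantitative with the standard quantum-projection-noise limit for the fractional frequency instability of an atomic clock (a textbook scaling quoted here for context, not a number from the paper):

\[ \sigma_{y}(\tau) \approx \frac{1}{2\pi \nu_{0} T} \sqrt{\frac{T_{c}}{N\, \tau}}, \]

where ν₀ is the clock transition frequency, T the interrogation time, T_c the cycle time, N the number of atoms and τ the averaging time. The instability improves as 1/√N, which is why interrogating thousands of neutral atoms at once is attractive, provided collisions between them can be controlled.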

For the past two years, the Jun Ye group has been developing an optical lattice atomic clock based on fermions, namely a collection of identical strontium atoms (⁸⁷Sr). The overall systematic uncertainty has dropped below that of NIST-F1, the cesium (Cs) fountain clock operated by NIST, Boulder, as the U.S. civilian time and frequency standard.

Fermions, according to the rules of quantum physics, cannot occupy the same energy state and location in space at the same time. Therefore, these identical strontium atoms are not supposed to collide. However, as the group improved the performance of their strontium clock over the past two years, they began to observe small shifts in the frequencies of the clock ticks due to atomic collisions. The extreme precision of their clock unveiled in 2008 (read past 2Physics report, April 17, 2008) enabled the group to measure these minute interactions systematically, including the dynamic effect of the measurement process itself, and to significantly reduce the resulting uncertainties in clock operation.

The JILA clock used in the latest experiments contains about 2,000 strontium atoms cooled to temperatures of a few microKelvin and trapped in multiple levels of a crisscrossed pattern of light, known as an optical lattice. The lattice is shaped like a tall stack of pancakes, or wells. About 30 atoms are grouped together in each well, and these neighboring atoms sometimes collide.

Ye's group discovered that two atoms located some distance apart in the same well are subjected to slight variations in the direction of the laser pulses used to boost the atoms from one energy level to another. The non-uniform interaction with light excites the atoms unevenly. Strontium atoms in different internal states are no longer completely identical, and become distinguishable enough to collide, if given a sufficient amount of time. This differential effect can be suppressed by making the atoms even colder or increasing the trap depth.

The probability of atomic collisions depends on the extent of the variation in the excitation of the ensemble of atoms. Significantly for clock operations, the JILA scientists determined that when the atoms are excited to about halfway between the ground state and the more energetic excited state, the collision-related shifts in the clock frequency go to zero. This knowledge enables scientists to reduce or even eliminate the need for a significant correction in the clock output, thereby increasing accuracy.

The discoveries described in Science also would apply to clocks using atoms known as bosons, which, unlike fermions, can exist in the same place and energy state at the same time. This category of clocks includes NIST-F1, the U.S. civilian time and frequency standard. In the case of bosons, variations in light-matter interactions would reduce (rather than increase) the probability of collisions.

Beyond atomic clocks, the high precision of JILA's strontium lattice experimental setup is expected to be useful in other applications requiring exquisite control of atoms, such as quantum computing—potentially ultra-powerful computers based on quantum physics—and simulations to improve understanding of other quantum phenomena such as superconductivity.

"There's a fundamental question here: Why do fermions actually collide?" Ye asks. "Now we understand why there is a frequency shift, and we can zero the shift ... [This result] does not change theory. The value is from the practical possibilities: We can control multi-particle interactions."

Reference
"Probing Interactions between Ultracold Fermions"
G.K. Campbell, M.M. Boyd, J.W. Thomsen, M.J. Martin, S. Blatt, M.D. Swallows, T.L. Nicholson, T. Fortier, C.W. Oates, S.A. Diddams, N.D. Lemke, P. Naidon, P. Julienne, J. Ye, and A. D. Ludlow
Science, Vol. 324, pp. 360-363 (2009)
Abstract

[We thank Media Relations, NIST, Boulder for materials used in this posting]
