

Sunday, April 25, 2010

Searching for Dark Energy with a Supernova Dataset

Saul Perlmutter [Photo courtesy: Lawrence Berkeley National Laboratory]

The international Supernova Cosmology Project (SCP), based at the U.S. Department of Energy’s Lawrence Berkeley National Laboratory, has announced the Union2 compilation of hundreds of Type Ia supernovae, the largest collection ever of high-quality data from numerous surveys. Analysis of the new compilation significantly narrows the possible values that dark energy might take—but not enough to decide among fundamentally different theories of its nature.

“We’ve used the world’s best-yet dataset of Type Ia supernovae to determine the world’s best-yet constraints on dark energy,” says Saul Perlmutter, leader of the SCP. “We’ve tightened in on dark energy out to redshifts of one”—when the universe was only about six billion years old, less than half its present age—“but while at lower redshifts the values are perfectly consistent with a cosmological constant, the most important questions remain.”

Two views of one of the six new distant supernovae in the Supernova Cosmology Project's just-released Union2 survey, which among other refinements compares ground-based infrared observations (in this case by Japan's Subaru Telescope on Mauna Kea) with follow-up observations by the Hubble Space Telescope [Image courtesy: Supernova Cosmology Project]

That’s because possible values of dark energy from supernova data become increasingly uncertain at redshifts greater than one-half, the range where dark energy’s effects on the expansion of the universe are most apparent as we look farther back in time. Says Perlmutter of the widening error bars at higher redshifts, “Right now, you could drive a truck through them.”

As its name implies, the cosmological constant fills space with constant pressure, counteracting the mutual gravitational attraction of all the matter in the universe; it is often identified with the energy of the vacuum. If indeed dark energy turns out to be the cosmological constant, however, even more questions will arise.

“There is a huge discrepancy between the theoretical prediction for vacuum energy and what we measure as dark energy,” says Rahman Amanullah, who led SCP’s Union2 analysis; Amanullah is presently with the Oskar Klein Center at Stockholm University and was a postdoctoral fellow in Berkeley Lab’s Physics Division from 2006 to 2008. “If it turns out in the future that dark energy is consistent with a cosmological constant also at early times of the universe, it will be an enormous challenge to explain this at a fundamental theoretical level.”

Rahman Amanullah [Photo courtesy:Stockholm University]

A major group of competing theories posits a dynamical form of dark energy that varies in time. Choosing among theories means comparing what they predict about the dark energy equation of state, a value written w. While the new analysis has detected no change in w, there is still much room for significant differences in w with increasing redshift (written z).

“Most dark-energy theories are not far from the cosmological constant at z less than one,” Perlmutter says. “We’re looking for deviations in w at high z, but there the values are very poorly constrained.”
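To see concretely how w enters the supernova measurement, here is a minimal numerical sketch in Python. This is purely illustrative, not the SCP analysis pipeline, and every parameter value below (the Hubble constant, the matter density, the choice of w) is an assumption chosen for the example. It shows how small a shift in the distance modulus a change in constant w produces, which is why precise, well-calibrated supernova data are needed.

```python
import math

C_KM_S = 299792.458   # speed of light, km/s
H0 = 70.0             # Hubble constant, km/s/Mpc (assumed value)
OMEGA_M = 0.3         # matter density fraction (assumed value)

def hubble(z, w):
    """H(z) for a flat universe: matter plus dark energy with constant w."""
    return H0 * math.sqrt(OMEGA_M * (1 + z) ** 3
                          + (1 - OMEGA_M) * (1 + z) ** (3 * (1 + w)))

def luminosity_distance(z, w, steps=2000):
    """d_L in Mpc: (1+z) times the comoving distance, by the trapezoid rule."""
    dz = z / steps
    integral = sum(0.5 * dz * (C_KM_S / hubble(i * dz, w)
                               + C_KM_S / hubble((i + 1) * dz, w))
                   for i in range(steps))
    return (1 + z) * integral

def distance_modulus(z, w):
    """mu = 5 log10(d_L / 10 pc), with d_L converted from Mpc to parsecs."""
    return 5 * math.log10(luminosity_distance(z, w) * 1e6 / 10)

for z in (0.5, 1.0):
    dmu = distance_modulus(z, -0.9) - distance_modulus(z, -1.0)
    print(f"z = {z}: mu(w = -0.9) - mu(w = -1) = {dmu:+.3f} mag")
```

The printed differences are a few hundredths of a magnitude, which illustrates why distinguishing a cosmological constant (w = -1) from nearby alternatives demands both large samples and tight control of systematic errors.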

In a new analysis to be published in the Astrophysical Journal [1], the Supernova Cosmology Project reports the addition of several well-measured, very distant supernovae to the Union2 compilation.

Dark energy fills the universe, but what is it?

Dark energy was discovered in the late 1990s by the Supernova Cosmology Project and the competing High-Z Supernova Search Team, both using distant Type Ia supernovae as “standard candles” to measure the expansion history of the universe. To their surprise, both teams found that expansion is not slowing due to gravity but accelerating.

Other methods for measuring the history of cosmic expansion have been developed, including baryon acoustic oscillation and weak gravitational lensing, but supernovae remain the most advanced technique. Indeed, in the years since dark energy was discovered using only a few dozen Type Ia supernovae, many new searches have been mounted with ground-based telescopes and the Hubble Space Telescope; many hundreds of Type Ia’s have been discovered; techniques for measuring and comparing them have continually improved.

In 2008 the SCP, led by the work of team member Marek Kowalski of the Humboldt University of Berlin, created a way to cross-correlate and analyze datasets from different surveys made with different instruments, resulting in the SCP’s first Union compilation. In 2009 a number of new surveys were added.

The inclusion of six new high-redshift supernovae found by the SCP in 2001, including two with z greater than one, is the first in a series of very high-redshift additions to the Union2 compilation now being announced, and brings the current number of supernovae in the whole compilation to 557.

“Even with the world’s premier astronomical observatories, obtaining good quality, time-critical data of supernovae that are beyond a redshift of one is a difficult task,” says SCP member Chris Lidman of the Anglo-Australian Observatory near Sydney, a major contributor to the analysis. “It requires close collaboration between astronomers who are spread over several continents and several time zones. Good team work is essential.”

Union2 has not only added many new supernovae to the Union compilation but has refined the methods of analysis and in some cases improved the observations. The latest high-z supernovae in Union2 include the most distant supernovae for which ground-based near-infrared observations are available, a valuable opportunity to compare ground-based and Hubble Space Telescope observations of very distant supernovae.

Type Ia supernovae are the best standard candles ever found for measuring cosmic distances because the great majority are so bright and so similar in brightness. Light-curve fitting is the basic method for standardizing what variations in brightness remain: supernova light curves (their rising and falling brightness over time) are compared and uniformly adjusted to yield comparative intrinsic brightness. The light curves of all the hundreds of supernovae in the Union2 collection have been consistently reanalyzed.
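As an illustration of the standardization step, here is a minimal sketch in the spirit of modern light-curve fitters. This is not the Union2 pipeline; the coefficients and supernova entries are invented for the example. Each light curve is summarized by a peak magnitude, a shape (stretch) parameter, and a color, and a linear correction yields the standardized distance modulus.

```python
# Assumed nuisance parameters and absolute magnitude (illustrative only).
alpha, beta, M_abs = 0.12, 2.5, -19.3

supernovae = [
    # (name, peak magnitude m_B, stretch x1, color c) -- hypothetical entries
    ("SN-A", 22.80, 0.5, 0.02),
    ("SN-B", 23.10, -1.2, 0.10),
]

for name, m_B, x1, color in supernovae:
    # Linear standardization: brighter-slower and brighter-bluer corrections.
    mu = m_B - M_abs + alpha * x1 - beta * color
    print(f"{name}: standardized distance modulus mu = {mu:.2f}")
```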

The upshot of these efforts is improved handling of systematic errors and improved constraints on the value of the dark energy equation of state with increasing redshift, although with greater uncertainty at very high redshifts. When combined with data from cosmic microwave background and baryon oscillation surveys, the “best fit cosmology” remains the so-called Lambda Cold Dark Matter model, or ΛCDM.

ΛCDM has become the standard model of our universe, which began with a big bang, underwent a brief period of inflation, and has continued to expand, although at first retarded by the mutual gravitational attraction of matter. As matter spread and grew less dense, dark energy overcame gravity, and expansion has been accelerating ever since.

To learn just what dark energy is, however, will first require scientists to capture many more supernovae at high redshifts and thoroughly study their light curves and spectra. This can’t be done with telescopes on the ground or even by heavily subscribed space telescopes. Learning the nature of what makes up three-quarters of the density of our universe will require a dedicated observatory in space.

Reference
[1] "Spectra and Light Curves of Six Type Ia Supernovae at 0.511 < z < 1.12 and the Union2 Compilation"
Authors: R. Amanullah, C. Lidman, D. Rubin, G. Aldering, P. Astier, K. Barbary, M. S. Burns, A. Conley, K. S. Dawson, S. E. Deustua, M. Doi, S. Fabbro, L. Faccioli, H. K. Fakhouri, G. Folatelli, A. S. Fruchter, H. Furusawa, G. Garavini, G. Goldhaber, A. Goobar, D. E. Groom, I. Hook, D. A. Howell, N. Kashikawa, A. G. Kim, R. A. Knop, M. Kowalski, E. Linder, J. Meyers, T. Morokuma, S. Nobili, J. Nordin, P. E. Nugent, L. Ostman, R. Pain, N. Panagia, S. Perlmutter, J. Raux, P. Ruiz-Lapuente, A. L. Spadafora, M. Strovink, N. Suzuki, L. Wang, W. M. Wood-Vasey, N. Yasuda,
Accepted for publication in Astrophysical Journal, available at
arXiv:1004.1711v1.

[The text of this report is written by Paul Preuss of Lawrence Berkeley National Laboratory]



Sunday, March 28, 2010

General Relativity Is Valid On Cosmic Scale

Uros Seljak [photo courtesy: University of California, Berkeley]

An analysis of more than 70,000 galaxies by University of California, Berkeley, University of Zurich and Princeton University physicists demonstrates that the universe – at least up to a distance of 3.5 billion light years from Earth – plays by the rules set out 95 years ago by Albert Einstein in his General Theory of Relativity.

By calculating the clustering of these galaxies, which stretch nearly one-third of the way to the edge of the universe, and analyzing their velocities and distortion from intervening material, the researchers have shown that Einstein's theory explains the nearby universe better than alternative theories of gravity.

One major implication of the new study is that the existence of dark matter is the most likely explanation for the observation that galaxies and galaxy clusters move as if under the influence of some unseen mass, in addition to the stars astronomers observe.

A partial map of the distribution of galaxies in the Sloan Digital Sky Survey, going out to a distance of 7 billion light years. The amount of galaxy clustering that we observe today is a signature of how gravity acted over cosmic time, and allows us to test whether general relativity holds over these scales. (M. Blanton, Sloan Digital Sky Survey)

"The nice thing about going to the cosmological scale is that we can test any full, alternative theory of gravity, because it should predict the things we observe," said co-author Uros Seljak, a professor of physics and of astronomy at UC Berkeley and a faculty scientist at Lawrence Berkeley National Laboratory who is currently on leave at the Institute of Theoretical Physics at the University of Zurich. "Those alternative theories that do not require dark matter fail these tests."

In particular, the tensor-vector-scalar gravity (TeVeS) theory, which tweaks general relativity to avoid resorting to the existence of dark matter, fails the test.

The result conflicts with a report late last year that the universe at earlier times, between 8 and 11 billion years ago, deviated from the general relativistic description of gravity.

Seljak and his current and former students, including first authors Reinabelle Reyes, a Princeton University graduate student, and Rachel Mandelbaum, a recent Princeton Ph.D. recipient, report their findings in the March 11 issue of the journal Nature [1]. The other co-authors are Tobias Baldauf, Lucas Lombriser and Robert E. Smith of the University of Zurich, and James E. Gunn, professor of physics at Princeton and father of the Sloan Digital Sky Survey.

Einstein's General Theory of Relativity holds that gravity warps space and time, which means that light bends as it passes near a massive object, such as the core of a galaxy. The theory has been validated numerous times on the scale of the solar system, but tests on a galactic or cosmic scale have been inconclusive.

"There are some crude and imprecise tests of general relativity at galaxy scales, but we don't have good predictions for those tests from competing theories," Seljak said.

An image of a galaxy cluster in the Sloan Digital Sky Survey, showing some of the 70,000 bright elliptical galaxies that were analyzed to test general relativity on cosmic scales. (Sloan Digital Sky Survey)

Such tests have become important in recent decades because the idea that some unseen mass permeates the universe disturbs some theorists and has spurred them to tweak general relativity to get rid of dark matter. TeVeS, for example, says that the acceleration caused by the gravitational force from a body depends not only on the mass of that body, but also on the magnitude of that acceleration itself.

The discovery of dark energy, an enigmatic force that is causing the expansion of the universe to accelerate, has led to other theories, such as one dubbed f(R), to explain the expansion without resorting to dark energy.

Tests to distinguish between competing theories are not easy, Seljak said. A theoretical cosmologist, he noted that cosmological experiments, such as detections of the cosmic microwave background, typically involve measurements of fluctuations in space, while gravity theories predict relationships between density and velocity, or between density and gravitational potential.

"The problem is that the size of the fluctuation, by itself, is not telling us anything about underlying cosmological theories. It is essentially a nuisance we would like to get rid of," Seljak said. "The novelty of this technique is that it looks at a particular combination of observations that does not depend on the magnitude of the fluctuations. The quantity is a smoking gun for deviations from general relativity."

Three years ago, a team of astrophysicists led by Pengjie Zhang of Shanghai Observatory suggested using a quantity dubbed EG to test cosmological models. EG reflects the amount of clustering in observed galaxies and the amount of distortion of galaxies caused by light bending as it passes through intervening matter, a process known as weak lensing. Weak lensing can make a round galaxy look elliptical, for example.

"Put simply, EG is proportional to the mean density of the universe and inversely proportional to the rate of growth of structure in the universe," he said. "This particular combination gets rid of the amplitude fluctuations and therefore focuses directly on the particular combination that is sensitive to modifications of general relativity."

Using data on more than 70,000 bright, and therefore distant, red galaxies from the Sloan Digital Sky Survey, Seljak and his colleagues calculated EG and compared it to the predictions of TeVeS, f(R) and the cold dark matter model of general relativity enhanced with a cosmological constant to account for dark energy.

The predictions of TeVeS were outside the observational error limits, while general relativity fit nicely within the experimental error. The EG predicted by f(R) was somewhat lower than that observed, but within the margin of error.

In an effort to reduce the error and thus test theories that obviate dark energy, Seljak hopes to expand his analysis to perhaps a million galaxies when SDSS-III's Baryon Oscillation Spectroscopic Survey (BOSS), led by a team at LBNL and UC Berkeley, is completed in about five years. To reduce the error even further, by perhaps as much as a factor of 10, requires an even more ambitious survey called BigBOSS, which has been proposed by physicists at LBNL and UC Berkeley, among other places.

Future space missions, such as NASA's Joint Dark Energy Mission (JDEM) and the European Space Agency's Euclid mission, will also provide data for a better analysis, though perhaps 10-15 years from now.

Seljak noted that these tests do not tell astronomers the actual identity of dark matter or dark energy. That can only be determined by other types of observations, such as direct detection experiments.

Reference
[1] Reinabelle Reyes, Rachel Mandelbaum, Uros Seljak, Tobias Baldauf, James E. Gunn, Lucas Lombriser, Robert E. Smith, "Confirmation of general relativity on large scales from weak lensing and galaxy velocities", Nature, 464, 256-258 (2010). Abstract.

[This report is written by Robert Sanders of the University of California, Berkeley]



Sunday, March 14, 2010

Gravitational Lenses Measure the Age and Size of the Universe



Phil Marshall (KIPAC, SLAC/Stanford) demonstrates lensing using a wine glass. [Video courtesy of Brad Plummer/Julie Karceski (SLAC)].


Using entire galaxies as lenses to look at other galaxies, researchers have a newly precise way to measure the size and age of the universe and how rapidly it is expanding, on a par with other techniques. The measurement determines a value for the Hubble constant, which indicates the size of the universe, and confirms the age of the universe as 13.75 billion years old, within 170 million years. The results also confirm the strength of dark energy, responsible for accelerating the expansion of the universe.

These results, by researchers at the Kavli Institute for Particle Astrophysics and Cosmology (KIPAC) at the US Department of Energy’s SLAC National Accelerator Laboratory and Stanford University, the University of Bonn, and other institutions in the United States and Germany, are published in the March 1 issue of The Astrophysical Journal [1]. This research was supported in part by the Department of Energy Office of Science. The authors of the paper are S. Suyu of the University of Bonn, P. Marshall of KIPAC, M. W. Auger (University of California, Santa Barbara), S. Hilbert (Argelander Institut für Astronomie and Max-Planck-Institut für Astrophysik), R. D. Blandford (KIPAC), L. V. E. Koopmans (Kapteyn Astronomical Institute), C. D. Fassnacht (University of California, Davis), and T. Treu (University of California, Santa Barbara).


Sherry Suyu describes the recent measurements of the age of the universe [Video Courtesy: uni-bonn.tv /University of Bonn, Germany]


The researchers used data collected by the NASA/ESA Hubble Space Telescope and showed the improved precision these data provide in combination with the Wilkinson Microwave Anisotropy Probe (WMAP).

The team used a technique called gravitational lensing to measure the distances light traveled from a bright, active galaxy to the earth along different paths. By understanding the time it took to travel along each path and the effective speeds involved, researchers could infer not just how far away the galaxy lies but also the overall scale of the universe and some details of its expansion.

Oftentimes it is difficult for scientists to distinguish between a very bright light far away and a dimmer source lying much closer. A gravitational lens circumvents this problem by providing multiple clues as to the distance light travels. That extra information allows them to determine the size of the universe, often expressed by astrophysicists in terms of a quantity called Hubble's constant.

When a large nearby object, such as a galaxy, blocks a distant object, such as another galaxy, the light can detour around the blockage. But instead of taking a single path, light can bend around the object in one of two, or four different routes, thus doubling or quadrupling the amount of information scientists receive. As the brightness of the background galaxy nucleus fluctuates, physicists can measure the ebb and flow of light from the four distinct paths, such as in the B1608+656 system imaged above. [Image courtesy Sherry Suyu of the Argelander Institut für Astronomie in Bonn, Germany]


"We've known for a long time that lensing is capable of making a physical measurement of Hubble's constant," KIPAC's Phil Marshall said. However, gravitational lensing had never before been used in such a precise way. This measurement provides an equally precise measurement of Hubble's constant as long-established tools such as observation of supernovae and the cosmic microwave background. "Gravitational lensing has come of age as a competitive tool in the astrophysicist's toolkit," Marshall said.

In B1608+656, the system that was the subject of this study, the fluctuating brightness of the background galaxy's nucleus can be tracked along each of the four distinct light paths. Lead author Sherry Suyu, from the University of Bonn, said, "In our case, there were four copies of the source, which appear as a ring of light around the gravitational lens."

Though researchers do not know when light left its source, they can still compare arrival times. Marshall likens it to four cars taking four different routes between places on opposite sides of a large city, such as Stanford University to Lick Observatory, through or around San Jose. And like automobiles facing traffic snarls, light can encounter delays, too.

"The traffic density in a big city is like the mass density in a lens galaxy," Marshall said. "If you take a longer route, it need not lead to a longer delay time. Sometimes the shorter distance is actually slower."

The gravitational lens equations account for all the variables such as distance and density, and provide a better idea of when light left the background galaxy and how far it traveled.
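The arithmetic behind the method can be sketched in a few lines: the delay between two images equals the "time-delay distance" times the difference in Fermat potential between the two paths, divided by the speed of light, and that distance scales inversely with Hubble's constant. The Python sketch below is purely illustrative; the delay, the potential difference, and the geometry factor are invented numbers, not values from the B1608+656 analysis.

```python
C_KM_S = 299792.458          # speed of light, km/s
C_MPC_PER_DAY = 8.4e-10      # speed of light in Mpc per day (approx.)

DPHI = 5.0e-12               # Fermat-potential difference (dimensionless; invented)
DT_DAYS = 31.5               # measured delay between two images, days (invented)

# delay = (time-delay distance) * DPHI / c, so the measurement fixes D_dt:
D_dt = C_MPC_PER_DAY * DT_DAYS / DPHI   # in Mpc

# D_dt = g * c / H0, where g is a dimensionless factor set by the lens and
# source redshifts and the assumed background cosmology (invented here).
g = 1.24
H0 = g * C_KM_S / D_dt
print(f"time-delay distance D_dt ~ {D_dt:.0f} Mpc")
print(f"implied Hubble constant H0 ~ {H0:.1f} km/s/Mpc")
```

The point of the sketch is the scaling: for a fixed lens model, a longer measured delay implies a larger time-delay distance and hence a smaller Hubble constant.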

In the past, this method of distance estimation was plagued by errors, but physicists now believe it is comparable with other measurement methods. With this technique, the researchers have come up with a more accurate lensing-based value for Hubble's constant, and a better estimation of the uncertainty in that constant. By both reducing and understanding the size of error in calculations, they can achieve better estimates of the structure of the lens and the size of the universe.

There are several factors scientists still need to account for when determining distances with lenses. For example, dust in the lens can skew the results. The Hubble Space Telescope has infrared filters useful for eliminating dust effects. The images also contain information about the number of galaxies lying along the line of sight; these contribute to the lensing effect at a level that needs to be taken into account.

Marshall says several groups are working on extending this research, both by finding new systems and further examining known lenses. Researchers are already aware of more than twenty other astronomical systems suitable for analysis with gravitational lensing.

Reference
S. H. Suyu, P. J. Marshall, M. W. Auger, S. Hilbert, R. D. Blandford, L. V. E. Koopmans, C. D. Fassnacht and T. Treu, "Dissecting the Gravitational Lens B1608+656. II. Precision Measurements Of The Hubble Constant, Spatial Curvature, and the Dark Energy Equation Of State", The Astrophysical Journal, v711, p201 (2010). Abstract.


[This report is written by Julie Karceski, SLAC National Accelerator Laboratory]



Saturday, October 03, 2009

BOSS – A New Kind of Search for Dark Energy

David Schlegel, principal investigator of BOSS, shows one of the numerous “plug plates” used to map and select hundreds of galaxies for each exposure. Light from each galaxy enters a hole in the plate and is carried to the CCD camera by its own optical fiber [Image Courtesy: Lawrence Berkeley National Laboratory]

On the night of September 14, the largest program in the Sloan Digital Sky Survey-III, the Baryon Oscillation Spectroscopic Survey (BOSS), achieved 'first light' with an upgraded spectrographic system across the entire focal plane of the Sloan Foundation 2.5-meter telescope at Apache Point Observatory in New Mexico.

BOSS is the most ambitious attempt yet to map the expansion history of the Universe using the technique known as baryon acoustic oscillation (BAO). It is the largest of four surveys in SDSS-III, with 160 participants from among SDSS-III’s 350 scientists and 42 institutions.

“Baryon oscillation is a fast-maturing method for measuring dark energy in a way that’s complementary to the proven techniques of supernova cosmology,” says David Schlegel, Principal Investigator of BOSS. “The data from BOSS will be some of the best ever obtained on the large-scale structure of the Universe.”

The distribution of visible mass in the universe

“Baryon” (meaning protons and neutrons and other relatively massive particles) is shorthand for ordinary matter. For almost the first 400,000 years, the universe was so dense that particles of matter were thoroughly entangled with particles of light (photons), the whole a vast, quivering, liquid-like blob where density variations caused sound waves (pressure waves) to move spherically outward at over half the speed of light.

Suddenly the expanding universe cooled enough for light and matter to “decouple.” Photons shot through transparent space unimpeded; the speed of sound plummeted. What had been variations in the density of the liquid universe left two marks in the now-transparent sky.

Variations in the temperature of the radiation that filled the early universe have descended to us as anisotropies in the cosmic microwave background (CMB). Variations in the density of matter persist in the clustering of galaxies, as baryon acoustic oscillations (BAO). The two scales, the roughly one-degree anisotropy of the CMB and the 500-million-light-year clustering of BAO, are closely related; the standard ruler of the universe measured from BAO can be calculated from the CMB for any epoch since decoupling.
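As a rough illustration of the standard-ruler idea, the sketch below computes the angle that a roughly 150-megaparsec (about 500-million-light-year) comoving ruler subtends at a few redshifts, assuming a flat universe with a cosmological constant. All parameter values are illustrative assumptions, not survey results.

```python
import math

C_KM_S = 299792.458
H0 = 70.0          # km/s/Mpc (assumed)
OMEGA_M = 0.3      # assumed matter density; flat universe with Lambda
R_BAO = 150.0      # comoving BAO scale in Mpc, roughly 500 million light-years

def hubble(z):
    return H0 * math.sqrt(OMEGA_M * (1 + z) ** 3 + 1 - OMEGA_M)

def comoving_distance(z, steps=2000):
    """Comoving distance in Mpc by the trapezoid rule."""
    dz = z / steps
    return sum(0.5 * dz * (C_KM_S / hubble(i * dz)
                           + C_KM_S / hubble((i + 1) * dz))
               for i in range(steps))

for z in (0.35, 0.7, 2.5):  # SDSS-II LRGs; BOSS galaxies; BOSS quasars
    theta = math.degrees(R_BAO / comoving_distance(z))
    print(f"z = {z}: BAO ruler subtends roughly {theta:.1f} degrees")
```

Because the ruler's true size is known from the CMB, measuring its apparent angular size at each epoch pins down the distance to that epoch, and hence the expansion history.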

Anisotropies in the cosmic microwave background, originating when the universe was less than 400,000 years old, are directly related to variations in the density of galaxies as observed today [Image courtesy: Lawrence Berkeley National Laboratory]

Schlegel and his colleague Nikhil Padmanabhan, who came to Berkeley Lab from Princeton in late 2006, first used the SDSS telescope to complete the largest three-dimensional map of the universe made up to that time: 8,000 square degrees of sky out to a distance of 5.6 billion light years, determining the clustering of 60,000 luminous red galaxies. This program, part of SDSS-II, measured galactic distances out to a redshift of z = 0.35 and detected the 500-million-light-year scale of BAO.

Measuring baryon oscillations

Baryon oscillations began as pressure waves propagated through the hot plasma of the early universe, creating regions of varying density that can be read today as temperature variations in the cosmic microwave background. The same density variations left their mark as the Universe evolved, in the periodic clustering of visible matter in galaxies, quasars, and intergalactic gas, as well as in the clumping of invisible dark matter.

Comparing these scales at different eras makes it possible to trace the details of how the Universe has expanded throughout its history – information that can be used to distinguish among competing theories of dark energy.

BOSS will measure 1.4 million luminous red galaxies at redshifts up to 0.7 (when the Universe was roughly seven billion years old) and 160,000 quasars at redshifts between 2.0 and 3.0 (when the Universe was only about three billion years old). BOSS will also measure variations in the density of hydrogen gas between the galaxies. The observation program will take five years.

“BOSS will survey the immense volume required to obtain percent-level measurements of the BAO scale and transform the BAO technique into a precision cosmological probe,” says survey scientist Martin White. “The high precision, enormous dynamic range, and wide redshift span of the BOSS clustering measurements translate into a revolutionary data set, which will provide rich insights into the origin of cosmic structure and the contents of the Universe.”

The spectrum of one of the quasars captured in the BOSS "first light" exposure [Image: Vaishali Bhardwaj, David Hogg, Nic Ross]

Existing SDSS spectrographs were upgraded to include new red cameras more sensitive to the red portion of the spectrum, featuring CCDs designed and fabricated at Berkeley Lab, with much higher efficiency than standard astronomical CCDs in the near infrared.

“Visible light emitted by distant galaxies arrives at Earth redshifted into the near-infrared, so the improved sensitivity of these CCDs allows us to look much further back in time,” says BOSS instrument scientist Natalie Roe.

To make these measurements BOSS will craft two thousand metal plates to fit the telescope’s focal plane, plotting the precise locations of two million objects across the northern sky. Each morning astronomers begin plugging optical fibers into a thousand tiny holes in each of the “plug plates” to carry the light from each specific target object to an array of spectrographs.

Steering each optical fiber to the right CCD was no trivial task, says Schlegel. “The new BOSS fiber cartridges are snake pits of a thousand fibers each. It would be a disaster if you didn’t know which one went where.”

One of the BOSS cartridges containing 1,000 optical fibers, which guide light from specific target galaxies and quasars to the spectrograph; Sloan Foundation telescope in background [Photo: Dan Long, Apache Point Observatory]

With a thousand holes in each plug plate, stopping to seek out specific holes to plug a fiber into, or tracing where each fiber ends up, would take an impossibly long time. Instead a computer assigns the correct target identity to each fiber as a fiber-mapping laser beam moves over the plugged-in fibers and records where the light from each emerges.

Fast and simple – but not quite foolproof. “In our first test images it looked like we’d just taken random spectra from all over,” Schlegel says. After some hair-pulling, the problem turned out to be simple. “After we flipped the plus and minus signs in the program, everything worked perfectly.”
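In code, the bookkeeping behind that laser mapping is straightforward; the toy sketch below illustrates the idea of recovering a hole-to-channel map by scanning rather than by hand-tracing fibers. It is purely illustrative, with hypothetical names and a simulated plate, and is not the observatory's actual software.

```python
import random

N_HOLES = 10                        # stand-in for the ~1,000 holes per plate
holes = list(range(N_HOLES))

# Simulate an unknown plugging: each hole feeds some scrambled channel.
channels = list(range(N_HOLES))
random.shuffle(channels)
plugging = dict(zip(holes, channels))   # ground truth, unknown to the plugger

def scan(hole):
    """Stand-in for the laser scan: report which channel lights up."""
    return plugging[hole]

# Build the map by scanning each hole in a known order.
recovered = {hole: scan(hole) for hole in holes}
assert recovered == plugging
print("hole -> channel map:", recovered)
```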

Now BOSS is on its way to generating data of unprecedented precision on two million galaxies and quasars, and density variations in the intergalactic gas. The SDSS tradition of releasing data to the public will continue, with the first release from SDSS-III planned for December 2010.

More about BOSS:
[1] BOSS homepage
[2] "BOSS: The Baryon Oscillation Spectroscopic Survey", Nikhil Padmanabhan, David Schlegel, Natalie Roe, Martin White, Daniel Eisenstein and David Weinberg (a white paper describing the BOSS experiment).
[3] "Baryon Acoustic Oscillations (BAO) at LBNL", David Schlegel (a presentation at Lawrence Berkeley National Laboratory Physics Division Review, Nov 2006).
[4] The Sloan Digital Sky Survey III homepage.

[We thank Lawrence Berkeley National Laboratory for materials used in this posting]



Saturday, May 23, 2009

The Shadows of Gravity

Jose A. R. Cembranos

[This is an invited article based on the author's recently published work -- 2Physics.com]

Author: Jose A. R. Cembranos
Affiliation: William I. Fine Theoretical Physics Institute, University of Minnesota in Minneapolis, USA

Many authors have tried to explain the dark sectors of the cosmological model as modifications of Einstein’s gravity (EG). Dark Matter (DM) and Dark Energy (DE) are the main sources of the cosmological evolution at late times. They dominate the dynamics of the Universe at low densities or low curvatures. Therefore, it is reasonable to expect that an infrared (IR) modification of EG could lead to a possible solution of these puzzles. However, it is in the opposite limit, at high energies (HE), where EG needs corrections from a quantum approach. These natural ultraviolet (UV) modifications of gravity are usually thought to be related to inflation or to the Big Bang singularity. In a recent work, I have shown that DM can be explained with HE modifications of EG, using an explicit model, R2 gravity, and I have studied its possible experimental signatures [1].

Einstein’s General Relativity describes the classical gravitational interaction very successfully in terms of the metric tensor of space-time, through the Einstein-Hilbert action (EHA). This theory is particularly beautiful and the action particularly simple, since it contains only one term, proportional to the scalar curvature. The proportionality parameter that multiplies this term defines Newton’s constant of gravitation and the typical scale of gravity. This scale is known as the Planck scale; its approximate energy value is 10^19 gigaelectronvolts, which is equivalent to a distance of 10^-35 meters.
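As a quick check of those numbers, an energy scale E corresponds to a length scale hbar*c/E. The snippet below uses the standard value of hbar*c to recover the quoted equivalence; the Planck energy value is the usual rounded figure.

```python
HBARC_MEV_FM = 197.327     # hbar*c in MeV * femtometers (standard value)
E_PLANCK_GEV = 1.2e19      # Planck energy scale, GeV (approx.)

lambda_fm = HBARC_MEV_FM / (E_PLANCK_GEV * 1e3)   # length in femtometers
print(f"Planck length ~ {lambda_fm * 1e-15:.1e} m")  # about 1.6e-35 m
```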

However, the inconsistency of quantum computations within the gravitational theory described by the EHA demands its modification at HE. Quantum radiative corrections produced by standard matter provide divergent terms that are constant, linear, and quadratic in the Riemann curvature tensor of space-time. The constant divergence can be regularized by the renormalization of the cosmological constant, which may explain the Dark Energy. The linear term is absorbed in the renormalization of the Planck scale itself. On the contrary, the quadratic terms are not included in the standard gravitational action. If these quantum corrections are not cancelled by invoking new symmetries, they need to be taken into account in the study of gravity at HE [2]. Indeed, such terms are also produced by radiative corrections coming from EG itself. Unfortunately, unlike the corrections associated with the matter content, the gravitational corrections do not stop at this order: there are cubic terms, quartic terms, and so on. All of these local quantum corrections are divergent, and the fact that there are infinitely many of them implies that the theory is non-renormalizable. We know how to deal with gravity as an effective field theory, working order by order, but we cannot access energies higher than the Planck scale with this effective approach [2]. In any case, the Planck scale is very high, and so far unreachable experimentally.

Inspired by this effective field theory point of view, which identifies higher-energy corrections with higher-curvature terms, I have studied the viability of a solution to the missing matter problem from the UV completion of gravity. As explained above, the first HE modification to EG is provided by the inclusion of terms quadratic in the curvature of the space-time geometry. The most general quadratic action supports, in addition to the usual massless spin-two graviton, a massive spin-two and a massive scalar mode, with a total of eight degrees of freedom (in the physical gauge [3]). In fact, this gravitational theory is renormalizable [3]. However, the massive spin-two gravitons are ghost-like particles that generate violations of unitarity, breaking of causality, and serious instabilities.

In any case, there is a non-trivial quadratic extension of EG that is free of ghosts and phenomenologically viable. It is so-called R2 gravity, since it is defined by adding to the EHA only a term proportional to the square of the scalar curvature. This term by itself does not improve the UV behaviour of EG, but it illustrates the idea in a minimal way. This particular HE modification of EG introduces a new scalar graviton that can provide a solution to the DM problem.

In this model, the new scalar graviton has a well-defined coupling to the standard matter content, and it is possible to study its phenomenology and experimental signatures [1][3][4]. Indeed, this DM candidate could be considered a superweakly interacting massive particle (superWIMP [5]), since its interactions are gravitational, i.e. it couples universally to the energy-momentum tensor with Planck-suppressed couplings. This means that the new scalar graviton mediates an attractive Yukawa force between two non-relativistic particles, with a strength similar to that of Newtonian gravity. Among other differences, this new component of the gravitational force has a finite range, shorter than 0.1 millimeters, because the new scalar graviton is massive.
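The quoted range follows from the usual Yukawa relation: a mediator of mass m gives a force of range lambda = hbar*c/(m*c^2), so requiring the range to be shorter than 0.1 millimeters bounds the scalar graviton mass from below. A quick numerical check:

```python
HBARC_EV_M = 1.9732698e-7   # hbar*c in eV * meters (standard value)
RANGE_MAX = 1e-4            # 0.1 millimeters, in meters

m_min_ev = HBARC_EV_M / RANGE_MAX
print(f"scalar graviton mass > {m_min_ev:.1e} eV")   # about 2e-3 eV
```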

This is the most constraining lower bound on the mass of the scalar mode, and it is independent of any assumption about its abundance. Conversely, depending on its contribution to the total amount of DM, its mass is constrained from above. I have shown that it cannot be much heavier than twice the mass of the electron: if it were, this graviton would decay into electron-positron pairs, and the positrons would annihilate, producing a flux of gamma rays that we should have observed. In fact, the SPI spectrometer on the INTEGRAL (International Gamma-ray Astrophysics Laboratory) satellite has observed a flux of gamma rays coming from the galactic centre (GC) whose characteristics are fully consistent with electron-positron annihilation [6].

If the mass of the new graviton is tuned close to the electron-positron production threshold, this line could be the first observation of R2 gravity. The same gravitational DM can explain the observation with a less tuned mass and a lower abundance. For heavier masses, the gamma-ray spectrum produced by in-flight annihilation of the positrons with interstellar electrons is even more constraining than the 511 keV photons [7].

For lighter masses, on the other hand, the only decay channel that may be observable is into two photons. It is difficult to detect these gravitational decays in the isotropic diffuse photon background (iDPB) [8]. A more promising analysis is the search for gamma-ray lines from localized sources, such as the GC. The iDPB is a continuum, since it is smeared by the cosmological redshift, but the mono-energetic photons originating from local sources may give a clear signal of R2 gravity [1].

In conclusion, I have analyzed the possibility that the origin of DM resides in UV modifications of gravity [1]. Although, strictly speaking, my results are specific to R2 gravity, I think they are qualitatively general, given a minimum set of assumptions about the gravitational sector. In any case, different approaches can be taken to link our ignorance about gravitation with the dark sectors of standard cosmology [9], and it is a very interesting subject which surely deserves further investigation.

This work is supported in part by DOE Grant No. DOE/DE-FG02-94ER40823, FPA 2005-02327 project (DGICYT, Spain), and CAM/UCM 910309 project.

References

[1] J. A. R. Cembranos, ‘Dark Matter from R2 Gravity’, Phys. Rev. Lett. 102, 141301 (2009). Abstract.

[2] N. D. Birrell and P. C. W. Davies, ‘Quantum Fields in Curved Space’ (Cambridge Univ. Press, 1982); J. F. Donoghue, ‘General Relativity as an Effective Field Theory: The Leading Quantum Corrections’, Phys. Rev. D 50, 3874 (1994). Abstract; A. Dobado et al., ‘Effective Lagrangians for the Standard Model’ (Springer-Verlag, 1997).

[3] K. S. Stelle, ‘Renormalization of Higher Derivative Quantum Gravity’, Phys. Rev. D 16, 953 (1977). Abstract; K. S. Stelle, ‘Classical Gravity with Higher Derivatives’, Gen. Rel. Grav. 9, 353 (1978). Abstract.

[4] A. A. Starobinsky, ‘A New Type of Isotropic Cosmological Models Without Singularity’, Phys. Lett. B 91, 99 (1980). Abstract; S. Kalara, N. Kaloper and K. A. Olive, ‘Theories of Inflation and Conformal Transformations’, Nucl. Phys. B 341, 252 (1990). Abstract; J. A. R. Cembranos, ‘The Newtonian Limit at Intermediate Energies’, Phys. Rev. D 73, 064029 (2006). Abstract.

[5] J. L. Feng, A. Rajaraman and F. Takayama, ‘Superweakly-Interacting Massive Particles’, Phys. Rev. Lett. 91, 011302 (2003). Abstract; J. A. R. Cembranos, Jonathan L. Feng, Arvind Rajaraman and Fumihiro Takayama, ‘SuperWIMP Solutions to Small Scale Structure Problems’, Phys. Rev. Lett. 95, 181301 (2005). Abstract.

[6] B. J. Teegarden et al., ‘INTEGRAL/SPI Limits on Electron-Positron Annihilation Radiation from the Galactic Plane’, Astrophys. J. 621, 296 (2005). Article.

[7] J. F. Beacom and H. Yuksel, ‘Stringent Constraint on Galactic Positron Production’, Phys. Rev. Lett. 97, 071102 (2006). Abstract.

[8] J. A. R. Cembranos, J. L. Feng and L. E. Strigari, ‘Resolving Cosmic Gamma Ray Anomalies with Dark Matter Decaying Now’, Phys. Rev. Lett. 99, 191301 (2007). Abstract; J. A. R. Cembranos and L. E. Strigari, ‘Diffuse MeV Gamma-rays and Galactic 511 keV Line from Decaying WIMP Dark Matter’, Phys. Rev. D 77, 123519 (2008). Abstract.

[9] J. A. R. Cembranos, A. Dobado and A. L. Maroto, ‘Brane-World Dark Matter’, Phys. Rev. Lett. 90, 241301 (2003). Abstract; ‘Dark Geometry’, Int. J. Mod. Phys. D 13, 2275 (2004). arXiv:hep-ph/0405165.



Saturday, April 18, 2009

Cosmology: 5 Needed Breakthroughs
-- Alexander Vilenkin

Alexander Vilenkin [photo courtesy: Institute of Cosmology, Tufts University]

[In our ongoing feature '5-Breakthroughs', our guest today is Prof. Alexander Vilenkin, Director of the Institute of Cosmology and L. and J. Bernstein Professor of Evolutionary Science at Tufts University.

Prof. Vilenkin's current research interests cover a wide range of subtopics in cosmology, quantum field theory and gravitation: cosmic inflation, dark energy, cosmic strings and monopoles, quantum cosmology, high energy cosmic rays, the multiverse, anthropic selection etc.

He received his undergraduate degree in physics in 1971 from Kharkov State University in the former Soviet Union. In 1976 he emigrated to the USA, and he received his PhD from SUNY Buffalo in 1977. In 1978 he joined the faculty at Tufts.

Over a very productive and creative span of thirty-five years, Prof. Vilenkin has written over 200 research papers and contributed some crucial components of modern cosmology. His work on cosmic strings has been pivotal, and his ideas on 'eternal inflation' and 'quantum creation of the universe from nothing' paved the path for new fields of investigation. Occasionally, he has also taken time to work on condensed matter physics and even topics like the statistical analysis of DNA sequences. His work has been featured in numerous newspaper and magazine articles all over the world, as well as in many popular books. Here is a link to a list of his published work:
Google Scholar.

Prof. Vilenkin is a Fellow of the American Physical Society. During 1984-89, he held a Presidential Young Investigator award from the National Science Foundation.

In 1994 he (with P. Shellard) wrote a monograph on "Cosmic Strings and Other Topological Defects" (Cambridge University Press, 1994). In 2006 he authored the well-acclaimed book "Many Worlds in One: The Search for Other Universes" (Hill & Wang, 2006) which has been translated into many languages.

It gives us great pleasure to present this list of 5 breakthroughs that Prof. Vilenkin would like to see in the field of Cosmology.

-- 2Physics.com]

1. Cosmic superstrings. Some superstring-inspired cosmological models predict the existence of fundamental strings of astronomical dimensions. Discovery of cosmic superstrings may be the only way to test superstring theory by direct observation. In fact, discovery of cosmic strings of any kind ("super" or not) would be a great breakthrough, since it would open new windows into the particle physics of ultra-high energies and into early-universe cosmology.

2. Further evidence for inflation. We have substantial evidence for cosmic inflation, but the details are very uncertain and a large number of models are consistent with the data. Discovery of gravitational waves from inflation or of non-Gaussian features in the cosmic microwave background would be important breakthroughs in this area.

3. Evidence for the multiverse. Inflationary cosmology leads to the multiverse picture, with multiple "bubble universes" expanding and occasionally colliding with one another. Collisions of our bubble with others may have observational signatures in the cosmic microwave background and in gravitational waves. A discovery of such a collision would provide direct evidence for the existence of the multiverse.

4. Solution to the measure problem. This is a perplexing problem in inflationary cosmology. Inflation is generically eternal, and bubble universes like ours are constantly being produced. Anything that can happen will happen in the eternally inflating universe, and it will happen an infinite number of times. We have to learn how to compare these infinities, since otherwise we cannot distinguish probable events from highly improbable ones, which makes it hard to make any predictions at all.

5. Discovery of supersymmetry. Non-discovery at the Large Hadron Collider (LHC) would also have important implications.

You may be wondering why "dark energy" is not on my list. This is because I believe it is a cosmological constant. But if I am wrong, and the dark energy density is changing with time, the discovery of this fact would be a great breakthrough.



Thursday, February 14, 2008

Distortions in Large Scale Structures of Galaxies Shed New Light On Dark Energy

Large Scale Structures of galaxies [Image courtesy: European Organisation for Astronomical Research in the Southern Hemisphere (ESO)]

An international team of 51 scientists from 24 institutions took on the challenge of measuring the distribution and motions of thousands of galaxies in the distant Universe using ESO's Very Large Telescope, in order to better understand what drives the acceleration of the cosmic expansion. In a recent paper in 'Nature', they report their results, which shed new light on the mysterious dark energy that is thought to permeate the Universe.

Ten years ago, astronomers made the stunning discovery that the Universe is expanding at a faster pace today than it did in the past. "Explaining why the expansion of the Universe is currently accelerating is certainly the most fascinating question in modern cosmology," says Luigi Guzzo, lead author of the 'Nature' paper, "We have been able to show that large surveys that measure the positions and velocities of distant galaxies provide us with a new powerful way to solve this mystery."

"The expansion implies that one of two very different possibilities must hold true," explains Enzo Branchini, member of the team, "Either the Universe is filled with a mysterious dark energy which produces a repulsive force that fights the gravitational brake from all the matter present in the Universe, or, our current theory of gravitation is not correct and needs to be modified, for example by adding extra dimensions to space."

Current observations of the expansion rate of the Universe cannot distinguish between these two options. But this international team based their clever technique on a well-known phenomenon, namely the fact that the apparent motion of distant galaxies results from two effects: the global expansion of the Universe that pushes the galaxies away from each other and the gravitational attraction of matter present in the galaxies' neighbourhood that pulls them together, creating the cosmic web of large-scale structures.

"By measuring the apparent velocities of large samples of galaxies over the last thirty years, astronomers have been able to reconstruct a three-dimensional map of the distribution of galaxies over large volumes of the Universe. This map revealed large-scale structures such as clusters of galaxies and filamentary superclusters," says Olivier Le Fèvre, member of the team, "But the measured velocities also contain information about the local motions of galaxies; these introduce small but significant distortions in the reconstructed maps of the Universe. We have shown that measuring this distortion at different epochs of the Universe's history is a way to test the nature of dark energy."

Guzzo and his collaborators have been able to measure this effect by using the VIMOS spectrograph on Melipal, one of the four 8.2-m telescopes of ESO's VLT. As part of the VIMOS-VLT Deep Survey (VVDS), spectra of several thousand galaxies in a 4-square-degree field (20 times the size of the full Moon), at epochs corresponding to about half the current age of the Universe (about 7 billion years ago), were obtained and analysed. This is the largest field ever covered homogeneously by spectroscopy to this depth. The team has now collected more than 13,000 spectra in this field, and the total volume sampled by the survey is more than 25 million cubic light-years.

The astronomers compared their result with that of the 2dFGRS survey, which probed the local Universe, i.e. measured the distortion at the present time. Within current uncertainties, the measurement of this effect provides an independent indication of the need for an unknown extra energy ingredient in the 'cosmic soup', supporting the simplest form of dark energy, the cosmological constant introduced originally by Albert Einstein. The large uncertainties do not yet exclude the other scenarios, though.

"We have also shown that by extending our measurements over volumes about ten times larger than the VVDS, this technique should be able to tell us whether cosmic acceleration originates from a dark energy component of exotic origin or requires a modification of the laws of gravity," explains Guzzo.

Reference
"A test of the nature of cosmic acceleration using galaxy redshift distortions",
L. Guzzo, M. Pierleoni, B. Meneux, E. Branchini, O. Le Fèvre, C. Marinoni, B. Garilli, J. Blaizot, G. De Lucia, A. Pollo, H. J. McCracken, D. Bottini, V. Le Brun, D. Maccagni, J. P. Picat, R. Scaramella, M. Scodeggio, L. Tresse, G. Vettolani, A. Zanichelli, C. Adami, S. Arnouts, S. Bardelli, M. Bolzonella, A. Bongiorno, A. Cappi, S. Charlot, P. Ciliegi, T. Contini, O. Cucciati, S. de la Torre, K. Dolag, S. Foucaud, P. Franzetti, I. Gavignaud, O. Ilbert, A. Iovino, F. Lamareille, B. Marano, A. Mazure, P. Memeo, R. Merighi, L. Moscardini, S. Paltani, R. Pellò, E. Perez-Montero, L. Pozzetti, M. Radovich, D. Vergani, G. Zamorani & E. Zucca,
Nature 451, 541-544 (31 January 2008), Abstract Link



Friday, January 04, 2008

High Energy Physics : 5 Needed Breakthroughs
-- Mark Wise

Mark Wise

[In the ongoing feature '5 Breakthroughs', our guest today is Mark Wise, the John A. McCone Professor of High Energy Physics at California Institute of Technology.

Prof. Wise is a fellow of the American Physical Society, and member of the American Academy of Arts and Sciences and the National Academy of Sciences. He was a fellow of the Alfred P. Sloan Foundation from 1984 to 1987.

Although Prof. Wise has done some research in cosmology and nuclear physics, his interests are primarily in theoretical elementary particle physics. Much of his research has focused on the nature and implications of the symmetries of the strong and weak interactions. He is best known for his role in the development of heavy quark effective theory (HQET), a mathematical formalism that has allowed physicists to make predictions about otherwise intractable problems in the theory of the strong nuclear interactions.

To provide a background to his current research activities, Prof. Wise said, "Currently we have a theory for the strong, weak and electromagnetic interactions of elementary particles that has been extensively tested in experiments. It is usually called the standard model. Even with this theory, many features of the data are not explained. For example, the quark and lepton masses are free parameters in the standard model and are not predicted. Furthermore, the theory has some unattractive aspects -- the most noteworthy of them being the extreme fine tuning needed to keep the Higgs mass small compared to the ultraviolet cutoff for the theory. This is sometimes called the hierarchy problem."

He explained,"My own research breaks into two parts. One part is using the standard model to predict experimental observables. Just because you have a theory doesn’t mean it’s straightforward to use it to compare with experiment. Usually such comparisons involve expansions in some small quantity. One area I have done considerable research on is the development of methods to make predictions for the properties of hadrons that contain a single heavy quark".

He elaborated,"The other part is research on physics that is beyond what is in the standard model. In particular I have worked on the development of several extensions of the standard model that solve the hierarchy problem: low energy supersymmetry, the Randall-Sundrum model and most recently the Lee-Wick standard model. This work is very speculative. It is possible that none of the extensions of the standard model discussed in the scientific literature are realized in nature."

Prof. Wise shared the 2001 Sakurai Prize for Theoretical Particle Physics with Nathan Isgur and Mikhail Voloshin. The citation mentioned his work on "the construction of the heavy quark mass expansion and the discovery of the heavy quark symmetry in quantum chromodynamics, which led to a quantitative theory of the decays of c and b flavored hadrons."

He obtained his PhD from Stanford University in 1980. While doing his thesis work, he also co-authored the book 'From Physical Concept to Mathematical Structure: an Introduction to Theoretical Physics' (U. Toronto Press, 1980) with Prof Lynn Trainor of the University of Toronto (where he did his B.S. in 1976 and M.S. in 1977). He also coauthored, with Aneesh Manohar, a monograph on 'Heavy Quark Physics' (Cambridge Univ Press, 2000).

We are pleased to present the list of 5 needed breakthroughs that Prof. Mark Wise would be happy to see in the field of high energy physics.
-- 2Physics.com]

"Here go five breakthroughs that would be great to see:

1) An understanding of the mechanism that breaks the weak interaction symmetry giving the W's and Z's mass. This we should know the answer to in my lifetime since it will be studied at the LHC (Large Hadron Collider) and I am trying to stay healthy.

2) Reconciling gravity with quantum mechanics. Currently the favored candidate for a quantum theory of gravity is String Theory. However, there is no evidence from experiment that this is the correct theory. Perhaps quantum mechanics itself gives way to a more fundamental theory at extremely short distances.

3) An answer to the question, why is the value of the cosmological constant so small? I am assuming here that dark energy is a cosmological constant. (Hey if it looks like a duck and quacks like a duck it's probably a duck.) A cosmological constant is a very simple term in the effective low energy Lagrangian for General Relativity. The weird thing about dark energy is not what it is but rather why it's so small.

4) An understanding of why the scale at which the weak symmetry is broken is so small compared to the scale at which quantum effects in gravity become strong. This is usually called the hierarchy problem. Breakthrough (1) might provide the solution to the hierarchy problem or it might not.

5) Discovery of the particle that makes up the dark matter of the universe and the measurement of its properties (e.g., spin, mass, ...).

There are other things I would love to know. For example, is there a way to explain the values of the quark and lepton masses? But you asked for five."



Monday, December 17, 2007

Particle Astrophysics: 5 Needed Breakthroughs
-- James Hough

James Hough [Photo Courtesy: Institute for Gravitational Research, University of Glasgow]

[Today's guest in our ongoing feature '5-Breakthroughs' is James Hough, Director of the Institute for Gravitational Research, and Professor of Experimental Physics in the Department of Physics and Astronomy, University of Glasgow.

Prof. Hough is also the Chairperson of the Gravitational Wave International Committee (GWIC), which was formed in 1997 by the directors and representatives of projects and research groups around the world whose research is aimed at the detection of gravitational radiation. The purpose of GWIC is to encourage coordination of research and development across the groups and collaboration in the scheduling of detector operation and data analysis. GWIC also advises on the location, timing and programme of the Edoardo Amaldi Conferences on Gravitational Waves, which are held every 2 years, and presents a prize for the best Ph.D. thesis submitted each year (for details, visit 'GWIC Thesis Prize').

His current research interests are in the investigation of materials for test masses and mirror coatings, and in the development of suspension systems of ultra-low mechanical loss towards:
a) second-generation gravitational wave detectors, in particular Advanced LIGO, the upgrade to the US LIGO gravitational wave detector systems (Advanced LIGO is now approved by the National Science Board in the USA and supported by a significant capital contribution from PPARC in the UK and MPG in Germany);
b) third-generation long-baseline gravitational wave detectors, in particular the proposed Einstein Telescope in Europe; and
c) LISA, the ESA/NASA space-borne gravitational wave detector.

Prof. Hough is a Fellow of the Royal Society of London (2003), the American Physical Society (2001), the Institute of Physics (1993) and the Royal Society of Edinburgh (1991). He received the Duddell Prize and Medal of the Institute of Physics in 2004 and the Max Planck Research Prize in 2001.

It's our pleasure to present the 5 most important breakthroughs that Prof. Hough would like to see in the field of Particle Astrophysics.
-- 2Physics.com Team]

1) The direct detection of gravitational radiation
It is very important to make a direct detection to verify one of the few unproven predictions of Einstein's General Relativity and even more importantly to lead to the birth of a new astronomy. Gravitational wave astronomy will let us look into the hearts of some of the most violent events in the Universe.

2) The quantisation of Gravity
The challenge of developing a quantum theory of gravity and unifying gravity with the other fundamental forces in nature will undoubtedly lead to new discoveries about our Universe.

3) The understanding of Dark Energy
Dark Energy - the mysterious agent behind the accelerating expansion of our Universe - is not understood. Solving this enigma may help with understanding quantum gravity and will certainly give us a new perspective on fundamental interactions.

4) The successful launching of LISA, the space-borne gravitational wave detector
LISA will allow the study of the birth and interaction of massive black holes in the Universe in a way that cannot be achieved by any other mission.

5) The identification of dark matter
Observations suggest that there is much more matter in the Universe than we observe by standard means. Finding out the nature of the unseen 'dark' matter is a challenging problem for experimental physicists.



Wednesday, July 25, 2007

"Changing Constants, Dark Energy and the Absorption of 21 cm Radiation" -- By Ben Wandelt

Ben Wandelt [Photo credit: Department of Physics, University of Illinois/Thompson-McClellan Photography]

Rishi Khatri and Ben Wandelt have recently proposed a new technique for testing the constancy of the fine structure constant across cosmic time scales using what may prove to be the ultimate astronomical resource for fundamental physics. In this invited article, Ben Wandelt explains the motivation for this work and the physical origin of this treasure trove of information.

Author: Ben Wandelt
Affiliation: Center for Theoretical Astrophysics, University of Illinois at Urbana-Champaign

What makes Constants of Nature so special? From a theorist's perspective constants are necessary evils that ought to be overcome. The Standard Model has 19 “fundamental constants,” and that is ignoring the non-zero neutrino masses which bring the total count to a whopping 25. That's 25 numbers that need to be measured as input for the theory. A major motivating factor in the search for a fundamental theory beyond the Standard Model is to explain the values of these constants in terms of some underlying but as yet unseen structure.

What's more, not even the constancy of Constants of Nature (CoNs) is guaranteed (Uzan 2003). Maybe the quantities we think of as constants are actually dynamic but vary slowly enough that we haven't noticed. Or even if constant in time, these numbers may be varying across cosmic distances.

Quite contrary to taking the constancy of the CoNs for granted, one can argue that it is actually surprising. String theorists tell us these constants are related to the properties of the space spanned by the additional, small dimensions beyond our observed four (3 space + 1 time). These properties could well be dynamical (after all, we have known since Hubble that the 3 large dimensions are growing) so why aren't the 'constants' changing? This perspective places the onus on us to justify why the small sizes are at least approximately constant. So the modern, 21st century viewpoint is that it would be much more natural if the CoNs were not constant but varying—either spatially or with time.

By way of example consider the cosmic density of dark energy. The 20th century view was in terms of “vacuum energy,” a property of empty space that is predicted by the Standard Model of particle physics. This is qualitatively compelling, but quantitatively catastrophically wrong. More recently, three main categories of attempts have emerged to explain that particular constant (while ignoring the vacuum energy problem). The first category explains dark energy as some new and exotic form of matter. The second category of explanations sees the acceleration of the Universe as evidence that our understanding of Gravity is incomplete.

The third argues that the dark energy density is just another CoN, the “cosmological constant,” which appears as an additive term in Einstein's equations of general relativity and therefore increases the rate of the expansion of the Universe. This possibility was originally suggested by Einstein himself. While this is quite an economical way of modeling all currently observed effects of the universal acceleration it is also hugely unsatisfactory as an actual explanation—somewhat analogous to a boss explaining the size of your salary as “Because that's the number and that's it.” The attempt to turn this into an actual explanation through the pseudo-anthropic reasoning associated with string-theoretic landscape arguments corresponds to your boss adding “Because if you wanted to earn any more than that you wouldn't be here to ask me this question.”

If we consider the cosmic density of dark energy as another CoN that appears in Einstein's equation, it should also somehow arise from the underlying fundamental theory, like the other constants. By the identical argument we went through before, we should in fact be surprised by its constancy. Hence most of the theoretical activity takes place within categories one and two, endowing this supposedly constant CoN with dynamical properties that can in principle be tested by observation.

Of course none of these aesthetic or theoretical arguments for what constitutes a satisfying explanation holds any water if it cannot be tested. And in fact, there are two sorts of tests: laboratory tests and astronomical observations. For definiteness, let's focus the discussion on a particular CoN, the most accurately measured CoN, the fine structure constant α. This number tells us the strength of the force that will act on an electric charge when it is placed in an electromagnetic field. If you have heard about the charge of the electron you have already encountered this constant in a slightly different form. Since charge has units (Coulomb), one could always redefine the units to change the value. So the relevant number is a dimensionless combination of the charge of the electron with other CoNs. This gives α ≈ 1/137.
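
As a quick numerical aside (this check is ours, not part of the original argument): α is the dimensionless combination e²/(4πε₀ħc). A minimal sketch in Python, using standard SI values:

```python
# Compute the fine structure constant from SI constants (standard values).
import math

e = 1.602176634e-19        # elementary charge [C]
hbar = 1.054571817e-34     # reduced Planck constant [J s]
c = 2.99792458e8           # speed of light [m/s]
eps0 = 8.8541878128e-12    # vacuum permittivity [F/m]

alpha = e**2 / (4 * math.pi * eps0 * hbar * c)
print(alpha, 1 / alpha)    # ~0.0072973..., ~137.036
```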

Over the years, the value of α has been measured in laboratory experiments to about 10 digits of accuracy. Using the extreme precision of atomic fountains, the value of α was measured over 5 years and found to have changed by less than 1 part in 10^15 per year [Marion et al. 2003].

Laboratory experiments do have their distinct advantages: the setup is under complete control and repeatable. However, they suffer from the very short lever arm of human time scales. Astronomical observations provide a much longer lever arm in time. The best current observations use quasar absorption lines and limit the variation to a similar accuracy when put in terms of yearly variation—but these measurements constrain variation over the last 12 billion years, the time it took the Universe to expand by a factor of 2.

In fact, using such quasar data, one group has claimed a detection of a change in α of 0.001% over the last 12 billion years [Webb et al. 2001]. This claim remains controversial [Chand et al. 2006], but things may become interesting at that level.

My graduate student Rishi Khatri and I have discovered a new astronomical probe of the fine structure constant that is likely the ultimate astronomical resource of information for probing its time variation. Compared to the quasar data our technique probes α at an even earlier epoch, only a few million years after the Big Bang, when the Universe went from 200 times smaller to 30 times smaller than it is today. And in principle, if some technological hurdles can be overcome, there is enough information to measure α to nine digits of accuracy 13.7 billion years in the past! This would be 10,000 times more sensitive than the best laboratory measurements.

What is this treasure trove of information? It arrives at the Earth in the form of long wavelength radio waves between 6 meters and 42 meters long. These radio waves started out their lives between 0.5 cm and 3 cm long, as part of the cosmic microwave background that was emitted when the hot plasma of the early Universe transformed into neutral hydrogen gas. As the Universe expands, these waves stretch proportionally. After about 7 million years, the ones with the longest initial wavelength first stretch to a magic wavelength: 21 cm. At this wavelength these waves resonate with hydrogen atoms: they have just the right energy to be absorbed by its electron. Waves that are absorbed are removed from the cosmic microwave background and can be seen as an absorption line (similar to the well-known Fraunhofer lines in the solar spectrum). As the Universe expands during the next 120 million years, waves that were initially shorter stretch to 21 cm and are similarly absorbed by hydrogen. After this time, light from the first stars heats the hydrogen to the point that it can no longer absorb these waves [Loeb and Zaldarriaga 2004].
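
The quoted numbers are easy to check. Assuming pure cosmological stretching, a wave absorbed at 21 cm when the Universe was 200 (respectively 30) times smaller arrives today stretched by that same factor. A back-of-envelope sketch (our own, not from the article):

```python
# Observed wavelength today of a line absorbed at 21 cm when the
# Universe was (1+z) times smaller than it is now.
lambda_21 = 0.21  # hydrogen hyperfine wavelength [m]

for one_plus_z in (30, 200):
    print(f"1+z = {one_plus_z:3d}: arrives today at ~{lambda_21 * one_plus_z:.0f} m")
# 1+z =  30: arrives today at ~6 m
# 1+z = 200: arrives today at ~42 m
```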

It turns out that the amount of absorption is extremely sensitive to the value of α. Therefore, the spectrum of absorption lines we expect to see in the radio waves is an accurate record of the value of α during this epoch. We could even look for variations of α within this epoch, and check for spatial variations of α and other 'constants.' I argued above that these variations are expected on general grounds, but they are also predicted by specific string-theory inspired models for dark energy such as the chameleon model.
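
The full sensitivity analysis is in Khatri and Wandelt (2007). As a simpler, textbook-level illustration (an assumption of this sketch, not the paper's calculation), the hyperfine transition energy itself scales as α⁴ when other constants are held fixed, so a fractional change in α is amplified fourfold in the line frequency:

```python
# Illustrative scaling only: nu_21 ~ alpha**4, so d(nu)/nu = 4 * d(alpha)/alpha.
nu_21 = 1420.405751e6       # rest-frame 21 cm frequency [Hz]
dalpha_over_alpha = 1e-5    # hypothetical 0.001% change, as claimed by Webb et al.

dnu = 4 * dalpha_over_alpha * nu_21
print(f"line shift: {dnu / 1e3:.0f} kHz out of 1420.4 MHz")  # ~57 kHz
```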

The tests we propose are uniquely promising for constraining fundamental physics models with astronomical observations. Important technological hurdles have to be overcome to realize measurements of the radio wave spectrum at the required level of accuracy. Still, the next time you see snow on your analog TV you might consider that some of what you see is due to long wavelength radio waves that have reached you from the early Universe, having traveled across the gulf of cosmic time carrying in them the signature that may reveal the fundamental theory of Nature.

References
Chand, H. et al. 2006, Astronomy and Astrophysics 451, 45.
Khatri, R. and Wandelt, B. D. 2007, Physical Review Letters 98, 111301.
Loeb, A. and Zaldarriaga, M. 2004, Physical Review Letters 92, 211301.
Marion, H. et al. 2003, Physical Review Letters 90, 150801.
Uzan, J.-P. 2003, Reviews of Modern Physics 75, 403.
Webb, J. K. et al. 2001, Physical Review Letters 87, 091301.



Wednesday, January 17, 2007

Set-back for Dark Energy

Observational evidence suggests that the rate of expansion of the universe is increasing with time. This contradicts the expectation of some physicists that the expansion would be continuously slowed by the gravitational attraction that holds the universe together. Some cosmologists explain this accelerating expansion with “dark energy”; in certain models, dark energy would also modify gravity at relatively short length scales, below a characteristic distance of about 85 micrometres.
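
Where does the 85 micrometre figure come from? A hedged sketch (our own, assuming a Hubble constant of 70 km/s/Mpc and a 70% dark-energy fraction): one common definition of the dark-energy length scale is λ = (ħc/u)^(1/4), where u is the dark-energy density:

```python
# Estimate the dark-energy length scale lambda = (hbar*c / u)**(1/4).
import math

hbar, c, G = 1.054571817e-34, 2.99792458e8, 6.674e-11  # SI values
H0 = 70e3 / 3.086e22                                   # 70 km/s/Mpc in [1/s]

rho_crit = 3 * H0**2 / (8 * math.pi * G)  # critical density [kg/m^3]
u = 0.7 * rho_crit * c**2                 # dark-energy density [J/m^3]

lam = (hbar * c / u) ** 0.25
print(f"dark-energy length scale ~ {lam * 1e6:.0f} micrometres")
# ~86 with these inputs, close to the quoted 85
```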

In order to explain the observed rate of expansion, dark energy must account for about 70% of all energy in the universe. But physicists still need a direct confirmation of its existence.

(Photo of Dan Kapner, lead author of the paper; courtesy: the Eöt-Wash group)

In a recent paper in Physical Review Letters, a team of physicists from the Eöt-Wash group at the Center for Experimental Nuclear Physics and Astrophysics, University of Washington, Seattle reported their measurement of the force of gravity down to 55 micrometres and their conclusion that the inverse-square law remains valid well below 85 micrometres with 95% confidence. In a laboratory set-up, the scientists made very precise measurements of the gravitational attraction between two plates in a torsion pendulum apparatus. Although a few other groups in various countries are engaged in such measurements, according to the Eöt-Wash researchers their experiment offers the highest sensitivity at the length scale associated with dark energy because it employs more interacting mass at the required separations than other setups.
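
The standard way such results are interpreted (a sketch of the parametrization, not the group's analysis code) is to add a Yukawa term to the Newtonian potential, V(r) = -(G m₁m₂/r)(1 + α e^(-r/λ)), and ask how large the resulting force deviation would be at a given separation; the values below are hypothetical:

```python
# Ratio of the Yukawa-modified force to the pure inverse-square force,
# from F = -dV/dr with V(r) = -(G*m1*m2/r) * (1 + a*exp(-r/lam)).
import math

def force_ratio(r, a, lam):
    return 1 + a * (1 + r / lam) * math.exp(-r / lam)

# hypothetical gravitational-strength new interaction (a = 1) at lam = 85 um
for r_um in (55, 85, 200):
    print(f"r = {r_um:3d} um: F/F_Newton = {force_ratio(r_um * 1e-6, 1.0, 85e-6):.2f}")
```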

Those familiar with this type of precision measurement will recognize that the result places a limit on the length scale of any new interaction that theory may predict. The experiment does not rule out the existence of dark energy. But its implication is significant: it is a set-back for those theories of dark energy, invoked to explain the accelerating expansion of the universe, that predict new gravitational effects at short distances.

Reference:
"Tests of the Gravitational Inverse-Square Law below the Dark-Energy Length Scale"
D. J. Kapner, T. S. Cook, E. G. Adelberger, J. H. Gundlach, B. R. Heckel, C. D. Hoyle, and H. E. Swanson,
Phys. Rev. Lett. 98, 021101 (8 January 2007).



Sunday, March 12, 2006

Photon-Photon Scattering

A vacuum is 'a space absolutely devoid of matter'. But according to Quantum ElectroDynamics (QED), particles can still be created in this emptiness through light-light interactions. This property follows directly from the quantum nature of the sub-atomic world, specifically from the Heisenberg Uncertainty Principle, which relates the uncertainties of conjugate pairs of quantities such as position and momentum, or energy and time. A consequence of the energy-time form of this principle is that even though there is nothing in the vacuum (no matter or radiation at all), there is still an uncertainty in the amount of energy the vacuum contains. On average the energy is constant, but there is always a slight uncertainty, which may allow a nonzero energy to exist for short intervals of time. Because of the equivalence between matter and energy, these small energy fluctuations can produce particles which exist for a short time and then disappear.

In a paper entitled "Using High-Power Lasers for Detection of Elastic Photon-Photon Scattering", published in the March 3 issue of Physical Review Letters (Vol. 96), physicists from Umeå University in Sweden and the Rutherford Appleton Laboratory in England propose an experiment to probe the vacuum by aiming three powerful laser beams at each other in a genuinely 3-dimensional geometry (this is important because earlier such proposals had all the beams in a single plane). The three beams would interact to produce a fourth beam with a wavelength shorter than any of the input beams.
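
A sketch of the bookkeeping behind that last statement (the input wavelengths below are illustrative assumptions, not the paper's values): in a four-wave-mixing geometry, energy and momentum matching give ω₄ = ω₁ + ω₂ - ω₃, so with frequencies proportional to 1/λ the generated wavelength comes out shorter than any input whenever beam 3 has the longest wavelength:

```python
# Generated wavelength from four-wave mixing: 1/lam4 = 1/lam1 + 1/lam2 - 1/lam3.
lam1, lam2, lam3 = 800e-9, 800e-9, 1064e-9  # illustrative input wavelengths [m]

lam4 = 1 / (1 / lam1 + 1 / lam2 - 1 / lam3)
print(f"generated beam: {lam4 * 1e9:.0f} nm")  # ~641 nm, shorter than any input
```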

The actual experiment is planned to be carried out over the next year at the Rutherford Appleton Laboratory near Didcot, England. By carefully polarizing the incoming light beams, the number of photons in the output beam can be controlled. This would be an important tool for investigating the parameter space of such a complex experiment, providing valuable information about the interactions taking place in the vacuum.

Besides providing good insight into QED itself, this experiment could also be used for testing theories that propose small departures from Lorentz invariance, the principle of special relativity that there is no preferred frame of reference. Light-light interactions may also be used to explore various hypotheses related to dark energy, currently a hot topic in cosmology, and may provide some clue about the rate and nature of the expansion of the universe.



Tuesday, December 13, 2005

Cosmological Constant & Dark Energy

Based on an ongoing study of exploding stars in the distant universe, astrophysicists have concluded that the effect of the "dark energy" that is speeding up the expansion of the universe is within 10 percent of that of Albert Einstein's celebrated cosmological constant. Cosmologists regard this result as a major step forward in understanding the nature of this mysterious property of the universe.

Reporting in an upcoming issue of the journal Astronomy and Astrophysics, an international team using a variety of instruments, including the 10-meter Keck telescopes in Hawaii, shows how supernovae that erupt across the distant universe compare to those closer to home. The receding motion of supernovae at great distances has been intensely investigated since 1998, when researchers discovered that supernovae of a given recessional velocity seem to be fainter than they would be if the expansion of the universe were slowing down. This result, which has been observed consistently for the last eight years, strongly implies that the expansion rate of the universe is increasing.

The cause of this acceleration may be some form of exotic energy that causes space to push outwards. Einstein originally proposed a mathematical fudge-factor he called the cosmological constant, which would preserve the notion of a universe with no beginning and no end. When Edwin Hubble demonstrated that the universe was expanding, Einstein abandoned the cosmological constant, calling it his "biggest blunder." The best way to study the dark energy, whatever it is, continues to be faraway supernovae: improved observations of distant supernovae are the most immediate way forward, and the present study is a very big step forward in both quantity and quality. Amazingly, it suggests that Einstein was pretty close to the mark.

The research project is known as the Supernova Legacy Survey (SNLS), which aims to discover and examine 700 distant supernovae to map out the history of the expansion of the universe. The survey confirms earlier discoveries that the expansion of the universe proceeded more slowly in the past and is speeding up today. However, the crucial step forward is the discovery that Einstein's 1917 explanation of a constant energy term for empty space fits the new supernova data very well.
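
Why supernova brightnesses can pin down the dark-energy equation of state w to the 10-percent level can be illustrated with a small calculation (ours, not the SNLS analysis; the Hubble constant and matter density below are assumed): the luminosity distance, and hence the predicted supernova magnitude, depends on w through the expansion history H(z):

```python
# Distance modulus at z = 0.5 for a cosmological constant (w = -1)
# versus w = -0.9, in a flat universe. Assumed: H0 = 70 km/s/Mpc, Om = 0.3.
import math

H0, c_km, Om = 70.0, 299792.458, 0.3

def lum_dist(z, w, steps=2000):
    """Luminosity distance [Mpc] for a constant equation of state w."""
    dz = z / steps
    integral = sum(
        dz / math.sqrt(Om * (1 + (i + 0.5) * dz) ** 3
                       + (1 - Om) * (1 + (i + 0.5) * dz) ** (3 * (1 + w)))
        for i in range(steps)
    )
    return (1 + z) * (c_km / H0) * integral

for w in (-1.0, -0.9):
    mu = 5 * math.log10(lum_dist(0.5, w)) + 25  # distance modulus
    print(f"w = {w:5.2f}: mu(z=0.5) = {mu:.3f}")
# the difference is only ~0.02 magnitudes: percent-level photometry is needed
```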

Further Reading on Related Topics: 1) SuperNova Legacy Survey, 2) Another Report on this topic, 3) Dark Energy, 4) Wikipedia Article on Dark Energy, 5) NASA's Site on Supernovae
