Supercomputers in the U.S. Department of Energy’s National Energy Research Scientific Computing Center (NERSC), located at the Lawrence Berkeley National Laboratory (LBNL) in California, have been harnessed to analyze the exploding volume of data produced by the European Space Agency’s Planck satellite, which observes the cosmic microwave background (CMB) radiation, a remnant of the big bang.

The Planck data analysis project was granted an unprecedented multi-year allocation of computer time on the NERSC supercomputers — tens of millions of CPU-hours, plus correspondingly large data storage and data transfer resources.

To date, the increasingly accurate measurements of the CMB radiation have further confirmed the big bang theory of cosmology, verified the geometric flatness of the universe, and pointed to inflation as the source of the first perturbations.

Julian Borrill, a researcher in the Computational Research Division at the Berkeley Lab, explains the computational requirements in these terms:

The sheer volume of the Planck data, with about a trillion observations of a billion points on the sky, means that the techniques of exact analysis we used in the past for the data from balloon flights are no longer tractable. … Instead we have to use approximate methods, and because they’re approximate, we have to worry about their possible uncertainties and biases.

To validate their experimental data processing, Borrill and his colleagues compare their results against those of large-scale Monte Carlo simulations. These simulations also require vast amounts of computing time and are typically run on NERSC's "Hopper" system, which has roughly 150,000 processor cores.
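The validation idea Borrill describes can be illustrated in miniature: run an approximate estimator on many simulated data sets whose true answer is known, then measure the estimator's bias and scatter. The sketch below is a toy illustration of that Monte Carlo technique, not the Planck pipeline; the signal model, noise level, and function names are all invented for the example.

```python
import random
import statistics

def simulate_observations(true_signal, noise_sigma, n_obs, rng):
    """Generate one synthetic data set: a constant signal plus Gaussian noise."""
    return [true_signal + rng.gauss(0.0, noise_sigma) for _ in range(n_obs)]

def approximate_estimator(data):
    """Stand-in for an approximate analysis method: here, just the sample mean."""
    return sum(data) / len(data)

def monte_carlo_validation(true_signal=1.0, noise_sigma=0.5,
                           n_obs=100, n_sims=2000, seed=42):
    """Run the estimator on many simulations and report its bias and scatter."""
    rng = random.Random(seed)
    estimates = [
        approximate_estimator(
            simulate_observations(true_signal, noise_sigma, n_obs, rng))
        for _ in range(n_sims)
    ]
    bias = statistics.mean(estimates) - true_signal   # systematic offset
    scatter = statistics.stdev(estimates)             # statistical uncertainty
    return bias, scatter

bias, scatter = monte_carlo_validation()
print(f"bias = {bias:+.4f}, scatter = {scatter:.4f}")
```

For a real CMB analysis each "simulation" is a full synthetic sky observed through the instrument model, which is why the validation step itself consumes so much supercomputer time.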

The NERSC center is now in the process of installing the "Edison" system, a Cray XC30 supercomputer, which will feature a theoretical peak performance of two petaflops, i.e., two quadrillion (2 × 10^15) floating-point operations per second on 15-digit (double-precision) values, nearly double the rate of the Hopper system.

These researchers will need the extra computing power. The upcoming POLARBEAR and SPTpol ground-based telescopes, the EBEX and Spider Antarctic balloon flights, and a proposed Inflation Probe satellite will require even more computation than has been used to date. The first phase of POLARBEAR, for example, will gather 10 times as much data as Planck and, when later expanded, 100 times as much; the Inflation Probe mission will gather 1,000 times as much. Processing these data in a reasonable time will strain supercomputers as never before.
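The scaling factors above can be turned into rough sample counts using the article's figure of about a trillion (10^12) Planck observations. The byte sizes below are an assumption for illustration (one 8-byte double per observation); the article gives only the relative factors.

```python
# Rough data volumes implied by the scaling factors in the text,
# relative to Planck's ~10^12 time-ordered observations.
PLANCK_OBSERVATIONS = 1e12
BYTES_PER_SAMPLE = 8  # assumed: one double-precision value per observation

experiments = {
    "Planck": 1,
    "POLARBEAR (phase 1)": 10,
    "POLARBEAR (expanded)": 100,
    "Inflation Probe": 1000,
}

for name, factor in experiments.items():
    samples = PLANCK_OBSERVATIONS * factor
    petabytes = samples * BYTES_PER_SAMPLE / 1e15
    print(f"{name:22s} ~{samples:.0e} samples  ~{petabytes:g} PB raw")
```

Even under this simple accounting, an Inflation Probe-class mission would produce on the order of 10^15 raw samples, which makes clear why the analysis is expected to strain future supercomputers.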

For additional details, see this LBNL press report.