XSEDE Science Successes

XSEDE Resources Help Confirm LIGO Discovery

Stampede, Comet Help Verify Finding


An artist's impression of gravitational waves generated by binary neutron stars. Credit: R. Hurt/Caltech-JPL

Scientists have for the first time detected gravitational waves, ripples in the fabric of space-time hypothesized by Albert Einstein a century ago, in a landmark discovery announced on Feb. 11 that heralds a new era in astrophysics.

Einstein in 1916 proposed the existence of gravitational waves as an outgrowth of his groundbreaking general theory of relativity, which depicted gravity as a distortion of space and time triggered by the presence of matter. But until now scientists had found only indirect evidence of their existence. Einstein thought that gravitational waves would be so weak that no one would ever be able to measure them.

But on September 14, 2015, gravitational waves were detected by both of the twin Laser Interferometer Gravitational-Wave Observatory (LIGO) detectors, funded by the National Science Foundation (NSF). LIGO is among the most precise measuring devices ever built — it is actually two observatories, about 1,900 miles apart, in Louisiana and Washington.

"This is opening a new window on astronomy and astrophysics," said Peter Couvares, the Advanced LIGO Data Analysis Computing Manager. "We're now seeing things in the universe that we've never seen before, and we have a medium for information that we've never had before. In addition to telescopes that provide electromagnetic observations, we now have gravitational wave observations and that's very powerful. The discovery was a verification of an incredibly important set of physics questions, but in service of astronomy and astrophysics."

Just how did these lasers detect the gravitational waves?

At each observatory, a laser beam is split in two and sent down a pair of tubes four kilometers long — the beams bounce off mirrors exactly the same distance away and reflect back. If a gravitational wave passes by, it distorts space and changes the distance between the mirrors by as little as 1/10,000th the diameter of a proton. If both LIGO observatories measure the same changes within 10 milliseconds of each other, then a gravitational wave is present.
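The two-detector coincidence test described above can be sketched in a few lines of Python. This is only a toy illustration — the timestamps and the helper name are hypothetical; the 10-millisecond window is the figure from the text:

```python
# Toy sketch of LIGO's two-detector coincidence test.
# Only the 10 ms window comes from the article; the function
# name and example timestamps are invented for illustration.
COINCIDENCE_WINDOW = 0.010  # seconds

def is_coincident(t_hanford, t_livingston, window=COINCIDENCE_WINDOW):
    """Return True if triggers at the two sites fall within the window."""
    return abs(t_hanford - t_livingston) <= window

# Triggers 7 ms apart count as a coincident candidate; 20 ms apart do not.
print(is_coincident(100.000, 100.007))  # True
print(is_coincident(100.000, 100.020))  # False
```

In the real pipeline, candidates that pass this test are then subjected to the much more demanding statistical analysis described later in this article.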

A view of the LIGO detector in Hanford, Washington. LIGO research is carried out by the LIGO Scientific Collaboration, a group of more than 1000 scientists from universities around the United States and in 14 other countries. Credit: LIGO Laboratory

From these changes, scientists at LIGO could identify the wave's source and roughly where it came from in the universe. Amazingly, these waves were from the last fraction of a second of an event that scientists had predicted but never observed until now — the collision of two black holes that created one larger black hole. In this event, which occurred 1.3 billion years ago, the two black holes were 29 and 36 times the mass of our sun. At the moment of the collision, about three times the mass of the sun was converted into gravitational waves with a peak power output of about 50 times that of the whole visible universe.
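The quoted energy release can be sanity-checked with E = mc². A rough back-of-the-envelope calculation in Python, using approximate values for the solar mass and the speed of light:

```python
# Back-of-the-envelope check: about three solar masses of mass-energy
# radiated as gravitational waves, via E = m * c**2.
# Constants are approximate standard values.
M_SUN = 1.989e30   # kg, one solar mass
C = 2.998e8        # m/s, speed of light

energy_joules = 3 * M_SUN * C**2
print(f"{energy_joules:.2e} J")  # 5.36e+47 J
```

This is only the total radiated energy; the peak-power comparison in the article additionally depends on how briefly that energy was emitted.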

XSEDE and Open Science Grid

For resources and staff expertise, LIGO relied on XSEDE (the Extreme Science and Engineering Discovery Environment) and the Open Science Grid (OSG). The OSG is now an XSEDE service provider, and resources at these two national facilities allow scientists to solve difficult computing problems.

"We used the OSG as a gateway to both Stampede and Comet," said Couvares. "We also leveraged resources at other institutions in the OSG, but Stampede and Comet were among the most used. For LIGO especially, OSG and XSEDE are highly complementary and becoming critical to our research efforts." 

While the production discovery analyses for the actual gravitational-wave detection last September were run on dedicated clusters in the LIGO Data Grid at multiple LIGO Scientific Collaboration institutions, XSEDE and OSG resources contributed processing to the analysis and review of the result.

Specifically, LIGO researchers consumed a total of almost four million hours via the OSG on both XSEDE and non-XSEDE resources, out of which 628,602 hours were on Comet, based at the San Diego Supercomputer Center (SDSC) at the University of California, San Diego, and 430,960 hours on Stampede, based at the Texas Advanced Computing Center (TACC) at The University of Texas at Austin, according to Frank Würthwein, OSG's executive director and lead for distributed high-throughput computing at SDSC. Via XSEDE allocations alone, LIGO researchers consumed millions of hours on both Comet and Stampede.

LIGO used Comet's new Virtual Cluster interface for the analysis of its present data. The total data set was five terabytes (TB), stored at the Holland Computing Center at the University of Nebraska–Lincoln (UNL). Distributing the 5 TB dataset from UNL to the many clusters on the OSG, including Stampede and Comet, required a total transfer of one petabyte of data, which was accomplished using GridFTP.
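As a quick consistency check on those figures: fanning a 5 TB dataset out as one petabyte of total traffic implies on the order of 200 full copies (assuming decimal units; the exact number of destination sites is not given in the article):

```python
# Rough consistency check of the transfer figures in the article.
# Assumes decimal units (1 PB = 1000 TB); the implied copy count
# is an estimate, not a number stated in the text.
DATASET_TB = 5
TOTAL_TRANSFER_TB = 1000  # one petabyte

copies = TOTAL_TRANSFER_TB / DATASET_TB
print(copies)  # 200.0
```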

Stampede, one of the Top 10 supercomputers in the world, was chosen because it resembled a parallel high-throughput cluster more than any other XSEDE resource at the time, which is to say it contains a large number of Intel cores.

Couvares explained that LIGO's computing demands are very much in flux. "If nature is kind and we see a lot more events than we expected, we'd need a lot more computing power than anticipated. We wanted to have capabilities that went beyond our in-house computing resources, so XSEDE and OSG were an excellent option," he said.

Since LIGO scientists don't know what the waves look like before they detect them, they have to search over a large space of possible signals. "The gravitational waves depend sensitively on the black holes' masses and spins, making the search computationally challenging," said Duncan Brown, The Charles Brightman Professor of Physics at Syracuse University and a gravitational-wave astrophysicist who studies the mergers of binary black holes.
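A toy sketch of the idea: build a bank of template waveforms, one per combination of source parameters, and score each against the data. Everything here — the chirp model, the mass grid, the scoring — is an invented illustration; LIGO's actual searches use physically accurate templates and matched filtering against characterized detector noise:

```python
import numpy as np

rng = np.random.default_rng(0)

def toy_template(m1, m2, n=256):
    """Toy 'chirp' whose sweep rate depends on the total mass.
    Purely illustrative -- real templates come from post-Newtonian
    theory and numerical relativity."""
    t = np.linspace(0, 1, n)
    chirp_rate = 500.0 / (m1 + m2)  # heavier systems sweep more slowly here
    return np.sin(2 * np.pi * (10 + chirp_rate * t) * t)

# A small 'template bank': one waveform per mass pair (m1 >= m2).
bank = {(m1, m2): toy_template(m1, m2)
        for m1 in (10, 20, 29, 36)
        for m2 in (10, 20, 29, 36) if m2 <= m1}

# Simulated data: the (36, 29)-solar-mass signal buried in noise.
data = toy_template(36, 29) + 0.5 * rng.standard_normal(256)

def score(template, data):
    """Normalized inner product: higher means a better match."""
    return np.dot(template, data) / np.linalg.norm(template)

# The best-matching template identifies the source parameters.
best = max(bank, key=lambda k: score(bank[k], data))
print(best)  # (36, 29)
```

The computational challenge Brown describes comes from scaling this idea up: real banks contain hundreds of thousands of templates covering masses and spins, each filtered against months of data.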

"Normally the computing workflows for LIGO are run on its own clusters and those of its partners," said Edgar M. Fajardo Hernandez, a programmer analyst in SDSC's high-throughput computing group. "OSG enabled scientists from the LIGO collaboration to access computer resources elastically – meaning increasing and decreasing on demand."

A technician works on one of LIGO's optics. At each observatory, the 2 1/2-mile-long L-shaped LIGO interferometer uses laser light split into two beams that travel back and forth down the arms. The beams are used to monitor the distance between mirrors precisely positioned at the ends of the arms. According to Einstein's theory, the distance between the mirrors will change when a gravitational wave passes by the detector. Credit: LIGO Laboratory

LIGO's discovery of gravitational waves from the binary black hole required large-scale data analysis to validate the discovery claim. "This includes measuring how significant the signal is compared to noise in the detector, and re-analyzing the data with simulated signals to ensure that the search is working properly," said Brown. "Once we have made detections, extracting the physics from these waves requires careful modeling of the black holes. The waves that LIGO detected were produced by very strong gravitational fields. By comparing calculations of the detected waves with our observations, we can explore the nature of black holes and gravity itself."

"We have been calculating gravitational waveforms for the LIGO project by solving Einstein's equations numerically," said Saul Teukolsky, the Hans A. Bethe Professor of Astrophysics at Cornell University who has been using Comet for LIGO-related research. "We start with two black holes that may be spinning and put them in orbit around each other. We then evolve the system in time and read off the wave signal that is produced. In the discovery paper, the LIGO team showed a figure with their measured data. Superimposed on top was a theoretical waveform produced by our collaboration, which agreed very well. This is how we really know the waves were produced by two black holes and not something else."

Each simulation can take from about a week to one month to complete, depending on how complicated the orbit is, said Teukolsky. "The numerical method we use (spectral methods) is extremely efficient, but doesn't scale well with more than about 100 processors. However, we have to run cases with many different values of the black hole masses and spins in order to model all the possible waveforms, and Comet is ideal for this situation."
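The workload Teukolsky describes is an embarrassingly parallel parameter sweep: many independent runs, each modest in size. A hedged sketch of the pattern in Python — the `simulate` stand-in is hypothetical, since the real cases are week-long spectral-code jobs submitted to a cluster scheduler:

```python
from concurrent.futures import ThreadPoolExecutor
from itertools import product

def simulate(params):
    """Stand-in for one numerical-relativity run. In reality each
    (mass ratio, spin) case is a week-to-month simulation that
    does not scale past ~100 processors."""
    mass_ratio, spin = params
    # ... solve Einstein's equations here; we just return a label ...
    return (mass_ratio, spin, "waveform")

# Sweep the physical parameters: each case is independent, so the
# ensemble scales out across a cluster even though a single case
# cannot use many processors.
cases = list(product([1.0, 1.5, 2.0], [-0.5, 0.0, 0.5]))
with ThreadPoolExecutor(max_workers=3) as pool:
    results = list(pool.map(simulate, cases))
print(len(results))  # 9
```

This is why Comet, which is designed for many moderate-size jobs rather than a few enormous ones, fits the workload well.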

At LIGO, the bulk of data analysis requires large-scale, high-throughput computing with parallel workflows at the scale of tens of thousands of cores for long periods of time. LIGO leveraged the expertise at TACC through XSEDE to optimize its code, particularly for heterogeneous platforms.

"The staff at TACC was instrumental in helping us guide where we looked for optimization; ultimately, we ended up with a team of almost five full time LIGO people doing code optimization — the directions we chose were formed by our interactions with TACC staff and we wouldn't have been as successful without them," Couvares said.

Looking Toward the Future

"Nearly all of our work via XSEDE and its predecessor, TeraGrid, has been LIGO related," said Cornell's Teukolsky, whose allocations go back to 2005. "In addition to black holes, we also run binary neutron star cases, which LIGO hopes to detect soon."

LIGO's Couvares believes that XSEDE is at the forefront of defining what large scale computing and these national facilities will look like in the future, "and we want to be part of that conversation," he said. 

"LIGO is going to move into an era of regular detections where we're recording more than one event. At the same time, we'll be pushing the limits of physics, astrophysics, and engineering."

The LIGO observatories, XSEDE and OSG, and the Stampede and Comet supercomputers are funded by the National Science Foundation.