GPU-Based Method for Detecting Earthquakes Being Scaled for Big Data

Fri, 04/19/2013 - 23:00 | Atlanta, GA


For More Information Contact

Joshua Preston


The development of a next-generation computing code for massive-scale detection of seismic earthquake signals is continuing with the support of a second-year seed grant through the Institute for Data and High Performance Computing.

Principal investigators Zhigang Peng, associate professor in the School of Earth and Atmospheric Sciences, and Bo Hong, assistant professor in the School of Electrical and Computer Engineering, have successfully developed a GPU-based method that can significantly accelerate the detection of earthquake signals from continuous data in a relatively small-scale space-time window (i.e. months of data recorded at several close-proximity seismic stations).

The researchers are now analyzing much larger data sets that include several years of data recorded at hundreds of seismic stations. These datasets will allow the researchers to address the challenges of scaling the current seismic-detection code to much larger real-world data and to significantly improve scientific understanding of the physics of earthquakes. The long-term goal is to develop scalable methods for seismic data analysis in the context of Big Data challenges.

“So far we have identified approximately 70 times more earthquakes around the Salton Sea geothermal field than listed in the official Southern California Seismic Network catalog,” says Peng. “These newly detected events could be used to help better understand how earthquakes are triggered in the immediate vicinity of a mainshock rupture.”

The researchers analyze existing recorded seismic data and use the waveforms of these earlier earthquakes as templates to find “hidden” seismic activity. They automatically scan continuous recordings from seismic stations to detect previously uncataloged earthquakes whose waveforms closely match those of the template events.
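The matched-filter idea described above can be sketched in a few lines: slide a template waveform along a continuous record and flag windows where the normalized cross-correlation exceeds a detection threshold. This is an illustrative sketch, not the researchers' actual code; the function name, threshold, and synthetic data are assumptions for demonstration only.

```python
import numpy as np

def match_template(continuous, template, threshold=0.8):
    """Slide a template along a continuous record; return (offset, cc)
    pairs where the normalized cross-correlation exceeds threshold."""
    n = len(template)
    # Pre-normalize the template so each window's score lands in [-1, 1].
    t = (template - template.mean()) / (template.std() * n)
    detections = []
    for i in range(len(continuous) - n + 1):
        window = continuous[i:i + n]
        std = window.std()
        if std == 0:
            continue  # flat window: correlation undefined, skip
        cc = np.sum(t * (window - window.mean())) / std
        if cc >= threshold:
            detections.append((i, cc))
    return detections

# Synthetic demo: hide one template-shaped "event" in noise.
rng = np.random.default_rng(0)
template = np.sin(np.linspace(0, 4 * np.pi, 50)) * np.hanning(50)
record = rng.normal(0, 0.1, 1000)
record[300:350] += template  # hidden event at sample offset 300
hits = match_template(record, template)
print(hits)  # detections cluster near offset 300
```

In practice this brute-force loop is exactly the kind of computation that parallelizes well on GPUs: each candidate window's correlation is independent, so years of continuous data across hundreds of stations can be scored concurrently.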

“This is especially useful when many earthquakes occur in a short time, such as during an aftershock sequence or earthquake swarm,” says Peng. “These newly detected events are not only vital for better understanding the fundamental physics of earthquake interaction, but also useful for rapid earthquake source characterization and seismic hazard forecasting.”

According to the researchers, the computational demands of the target problem, identifying previously undetected earthquake signals in years of continuous data, can only be met by a GPU cluster such as Georgia Tech’s Keeneland system.

“During this new seed grant period, we plan to address the code scalability issues through innovations in code engineering and workload management,” says Hong.

The seismic detection code will be made available to the scientific community, along with the researchers’ lessons about the data-processing challenges in this domain. This will allow other researchers to extend the results to a broader range of data-intensive seismic problems, such as automatic scanning and detection of new seismic events, including tremors, regular earthquakes, and glacial events.

IDH funding supports graduate research assistants Xiaofeng Meng, a Ph.D. student in Earth and Atmospheric Sciences, and Xiao Yu, a Ph.D. student in Electrical and Computer Engineering.