The oil industry has been one of the largest users of high-performance computing for half a century. It deals with both large data volumes (hundreds of terabytes) and algorithms with a high operation count per sample. The algorithmic kernels range from sparse matrix operations to multi-dimensional Fourier transforms to finite differences, among many others.
In this talk I will discuss work by Stanford and others to tackle some of the industry's computational challenges. Tackling these problems requires not only knowledge of the geosciences, but also some skill in applied mathematics and computer science. The shift away from a single option, the single-core processor, to a variety of choices (multi-/many-core CPUs, general-purpose graphics processing units, and field-programmable gate arrays) has meant that most approaches must be redesigned. I will show that by modifying an approach to fit the strengths of a given platform, significant performance gains can often be achieved.
No downloadable version of the slides for this talk is available at this time.
About the speaker:
Bob Clapp is a Senior Research Engineer in Geophysics and Technical Director of the Center for Earth and Environmental Science at Stanford, where he manages the School of Earth Sciences' Algorithm and Architecture Initiative. He received his Ph.D. in Geophysics from Stanford in 2001. He has authored and co-authored papers on seismic imaging, tomography, high-performance computing, inversion, and optimization.