Abstract: Over the last few years, high performance computing has seen a dramatic shift towards Graphics Processing Units (GPUs). Originally the term referred to commodity graphics cards repurposed for scientific computations: driven by the huge video game market, these are massively parallel devices with a very low cost per floating point operation per second. Today it also refers to dedicated devices built on the same architectures but with specifications tailored to the needs of high performance computing.
The binary black hole problem in Einstein's theory of General Relativity suffers from the curse of dimensionality: there are too many solutions to explore, and a single numerical simulation typically takes hundreds of thousands of supercomputer hours. At the same time, a thorough exploration of the parameter space is crucial for the upcoming generation of advanced ground-based gravitational wave interferometric detectors, a very ambitious project that has dominated funding decisions world-wide for a few decades now.
I will describe results combining these apparently separate fields: numerical studies of binary black holes using GPUs, which revealed that, for all practical purposes, the space of solutions has a lower dimension than the space of parameters describing it. If time allows, I will also briefly mention current follow-up work aimed at beating the curse of dimensionality in gravitational wave science. The techniques are generic and applicable to areas such as image compression, gene classification, hyperspectral imaging, and more.
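To make the dimensionality claim concrete, here is a minimal Python sketch of one standard way to measure the effective dimension of a solution space, using a hypothetical damped-chirp waveform model as a stand-in for actual binary black hole waveforms (the model, parameter range, and tolerance below are illustrative assumptions, not the setup of the original work). One samples the parameter space densely, stacks the resulting waveforms into a snapshot matrix, and inspects the decay of its singular values: rapid decay means a handful of basis elements represents every solution, which is the sense in which the solution space has lower dimension than the parameter space.

    import numpy as np

    # Toy illustration (not actual binary black hole data): a family of
    # waveforms h(t; q) sampled over a one-dimensional parameter q.
    t = np.linspace(0.0, 10.0, 2000)    # time samples
    qs = np.linspace(1.0, 2.0, 500)     # 500 points in parameter space

    # Hypothetical waveform model: a damped chirp whose decay rate and
    # frequency sweep depend smoothly on the parameter q.
    snapshots = np.array([np.exp(-0.1 * q * t) * np.sin(q * t**1.5)
                          for q in qs])

    # Singular value decomposition of the 500 x 2000 snapshot matrix.
    _, sigma, _ = np.linalg.svd(snapshots, full_matrices=False)

    # Effective dimension: number of singular values needed to represent
    # every sampled waveform to the chosen relative accuracy.
    tol = 1e-6
    rank = int(np.sum(sigma / sigma[0] > tol))
    print(f"parameter samples: {len(qs)}, effective dimension: {rank}")

If the printed effective dimension is far smaller than the number of parameter samples, the family of solutions is, for all practical purposes, low dimensional.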
The original work, which will be the main topic of this talk, was done in collaboration with Chad Galley (Caltech), Frank Herrmann (UMD) and John Silberholz (MIT).