Department of Mathematics,
University of California San Diego

****************************

CSME/Center for Computational Mathematics Seminar

Manuel Tiglio

CSME, University of Maryland, and Caltech

Reduced Order Modeling in General Relativity

Abstract:

General Relativity (GR) suffers in several ways, from source modeling to data analysis, from the "curse of dimensionality": roughly, the complexity of the system grows beyond practical control as more physical parameters of interest are taken into account. This is a serious practical bottleneck for the upcoming generation of advanced gravitational wave detectors, worth billions of dollars, which are expected within a few years to detect gravitational waves directly for the first time in history and to reach unexplored portions of the universe. This is the most anticipated era of general relativity since the work of Hulse and Taylor, which led to their Nobel Prize in 1993.

Due to the low signal-to-noise ratio of any expected detection, matched filtering against catalogs of templates is needed, both for detection and for parameter estimation of the source of any trigger. Numerical relativity simulations of the Einstein equations typically take hundreds of thousands of hours, making a survey of the full parameter space intractable with standard search methods, and standard parameter estimation algorithms are simply impractical: they would take years of computing time.

I will first summarize previous work I have been involved in and the state of the art of Einstein's equations as an initial-boundary value problem, both at the continuum and discrete levels. Next I will discuss my current research program, in collaboration with many colleagues, to tackle the curse of dimensionality in GR, along with existing results and plans for the future. The effort is essentially about dealing with parametrized systems using reduced order models (ROM), and the techniques are generic and applicable to many areas. In our driving field, general relativity, we typically obtain several (from 3 to 11) orders of magnitude of computational speedup compared to direct approaches, both on the modeling and data analysis sides.
As an example, we can now substitute a typical months-long supercomputer simulation of colliding black holes with a surrogate model that runs on a smartphone in a tenth of a millisecond, without loss of accuracy. Real-time calculations on mobile devices not only provide a great outreach opportunity for complicated systems such as colliding black holes, but also enable application-specific uses in many areas, such as on-site or remote design and control. The effort combines theoretical physics, analytical and numerical methods for partial differential equations, scientific computing, large-scale computations, reduced order modeling, approximation theory, sparse representations, and signal processing.
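The basic reduced-order-model idea behind such speedups can be sketched with a toy example (entirely hypothetical, not the speaker's actual pipeline or waveform family): take a training set of "waveforms" drawn from a one-parameter family, compress it with an SVD into a small linear basis, and then represent a waveform at an unseen parameter value by projecting onto that basis.

```python
import numpy as np

# Hypothetical one-parameter family of waveforms h(t; f) = sin(2*pi*f*t),
# standing in for expensive parametrized simulations.
t = np.linspace(0.0, 1.0, 1000)          # time samples
train_f = np.linspace(1.0, 2.0, 50)      # training parameter values (frequencies)
training = np.array([np.sin(2 * np.pi * f * t) for f in train_f])

# SVD of the training matrix (rows = waveforms); the right singular
# vectors give an orthonormal basis for the span of the training set.
U, s, Vt = np.linalg.svd(training, full_matrices=False)

# Keep just enough singular vectors to capture essentially all the variance.
energy = np.cumsum(s**2) / np.sum(s**2)
n = int(np.searchsorted(energy, 1.0 - 1e-10)) + 1
basis = Vt[:n]                            # reduced basis, n << number of training waveforms

# Approximate a waveform at a parameter NOT in the training set
# by projecting it onto the reduced basis.
f_new = 1.57
h = np.sin(2 * np.pi * f_new * t)
h_rom = basis.T @ (basis @ h)             # orthogonal projection
err = np.linalg.norm(h - h_rom) / np.linalg.norm(h)
print(n, err)                             # small basis, small relative error
```

In practice, methods of this kind replace the SVD with greedy reduced-basis selection and add empirical interpolation and fits across parameter space, which is what allows a full surrogate to be evaluated in milliseconds instead of rerunning the simulation.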

Host: Michael Holst

October 31, 2013

11:00 AM

AP&M 2402

****************************