In scientific fields ranging from geophysics and atmospheric science to medical imaging and network communication, data are being generated at remarkable rates. Such data are typically only indirectly related to the quantities of interest, and the data sets are in many cases dynamically growing. Extracting the desired information then requires the solution of very large, data-intensive inverse problems, perhaps repeatedly and in real time. The computational challenges of obtaining such solutions are compounded by the demands of validation and uncertainty analysis, which can easily become computationally prohibitive.

This project will develop mathematical and statistical methods, along with computational tools, for the solution of data-intensive inverse problems. The core of the approach is a stochastic reformulation of such problems that aims to significantly reduce computational costs while adapting to modern hardware architectures. A framework will be developed to address the challenges arising at the interface between big data, inverse problems, data analysis, and uncertainty quantification.

First, randomized methods for the solution of linear and nonlinear inverse problems will be introduced, so that efficient stochastic optimization methods can be used to overcome the hardware limitations of current algorithms and to generate solutions and uncertainty assessments in near-real time. New theory and scalable methods will be developed within the stochastic framework, thereby ensuring solution accuracy, reliability, and robustness. Second, advanced tools will be developed for model validation, error analysis, and uncertainty quantification. By partnering with application scientists (e.g., in atmospheric remote sensing), the methods developed in this project will be of immediate practical utility for scientists and engineers.
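To illustrate the kind of randomized reduction the abstract refers to, the following is a minimal sketch (not the project's actual method) of one standard technique from this area: solving a Tikhonov-regularized linear inverse problem by random sketching, so that a small compressed system stands in for the full data set. All problem sizes, the Gaussian sketching matrix, and the regularization parameter below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic linear inverse problem: recover x from noisy data b = A x + noise.
m, n = 2000, 100                       # many measurements, fewer unknowns
A = rng.standard_normal((m, n)) / np.sqrt(m)
x_true = rng.standard_normal(n)
b = A @ x_true + 0.01 * rng.standard_normal(m)
lam = 1e-2                             # Tikhonov regularization parameter

# Full (deterministic) regularized least-squares solution, for reference:
#   minimize ||A x - b||^2 + lam ||x||^2.
x_full = np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ b)

# Randomized sketch-and-solve: compress the m rows into s << m random
# combinations, then solve the much smaller regularized problem.
s = 400
S = rng.standard_normal((s, m)) / np.sqrt(s)   # Gaussian sketching matrix
SA, Sb = S @ A, S @ b
x_sketch = np.linalg.solve(SA.T @ SA + lam * np.eye(n), SA.T @ Sb)

# The sketched solution stays close to the full one at a fraction of the cost.
rel_dev = np.linalg.norm(x_sketch - x_full) / np.linalg.norm(x_full)
print(f"relative deviation of sketched solution: {rel_dev:.3f}")
```

The sketched normal equations involve an s-by-n matrix instead of an m-by-n one, which is the basic mechanism by which such stochastic reformulations trade a controlled amount of accuracy for large reductions in computation and memory traffic.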
Division Of Mathematical Sciences (DMS)