Peter Qian

Some Statistical Aspects of Uncertainty Quantification

Computer simulations based on computational fluid dynamics, finite element analysis, discrete element models, and multi-physics codes are widely used in aerospace, turbomachinery, finance, data centers, and many other industries. These simulations are necessary for studying complex phenomena such as thermal dynamics, supersonic flows, aircraft-controller interaction, and engine systems. Unfortunately, simulation models are never perfect, and various uncertainties, including random initial and boundary conditions, input uncertainty, and model discrepancy, can produce misleading results. It is therefore necessary to develop a rigorous mathematical framework for uncertainty quantification (UQ). To this end, engineers in industry are taking greater steps toward implementing probabilistic design processes.

This talk will discuss three statistical aspects of the emerging field of UQ.  First, several new classes of scalable design-of-experiments methods are demonstrated. These methods were developed to run complex simulations efficiently in parallel and to achieve variance reduction through both stratification and correlation control.  Second, a new theoretical framework is introduced to shed light on the trade-off between statistical and numerical accuracy in gradient-enhanced Kriging emulators. The framework is extended into an iterative procedure, called iKriging, for building large-scale emulators that balance statistical and numerical accuracy.  Third, a new statistical method is introduced to efficiently solve optimization-under-uncertainty problems. The key idea is to embed negative dependence among multiple batches in sample average approximation to achieve variance reduction.
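
The abstract does not spell out the specific designs, but the two ingredients it names, stratification and correlation control, are easiest to see in a Latin hypercube construction. The sketch below is a minimal, generic illustration and not the talk's own methods; the helper `latin_hypercube` and the toy integrand are illustrative choices, and further correlation control between columns (e.g., orthogonal-array-based variants) is beyond this sketch.

```python
import numpy as np

def latin_hypercube(n, d, seed=None):
    """Draw n points in [0, 1)^d with exactly one point per stratum in each margin."""
    rng = np.random.default_rng(seed)
    # One random permutation of the n strata per column, plus jitter inside each stratum.
    strata = rng.permuted(np.tile(np.arange(n), (d, 1)), axis=1).T
    return (strata + rng.random((n, d))) / n

def f(x):
    return np.exp(x.sum(axis=1))  # toy integrand on the unit cube

n, d, reps = 64, 3, 500
rng = np.random.default_rng(0)
mc = [f(rng.random((n, d))).mean() for _ in range(reps)]
lhs = [f(latin_hypercube(n, d, seed=r)).mean() for r in range(reps)]
print("plain Monte Carlo variance:", np.var(mc))
print("Latin hypercube variance:  ", np.var(lhs))  # typically much smaller
```

Because every one-dimensional margin is stratified, the design-based estimates fluctuate far less than plain Monte Carlo for integrands that are close to additive, which is the kind of variance reduction the abstract refers to.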
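The specifics of the gradient-enhanced Kriging framework and of iKriging are not given in the abstract. As a rough illustration of the statistical-numerical tension it mentions, the following one-dimensional sketch (with a squared-exponential kernel and illustrative function names, assumed here rather than taken from the talk) builds the joint covariance of function values and derivatives and adds a nugget term: a larger nugget improves the conditioning of the covariance matrix but degrades the fidelity of the emulator.

```python
import numpy as np

def se_blocks(x1, x2, ell=0.3, sig2=1.0):
    """Covariance blocks between values and derivatives of a 1-D GP with
    squared-exponential kernel k(x, x') = sig2 * exp(-(x - x')^2 / (2 ell^2))."""
    diff = x1[:, None] - x2[None, :]
    k = sig2 * np.exp(-0.5 * diff**2 / ell**2)
    k_fd = diff / ell**2 * k                      # cov(f(x1), f'(x2))
    k_df = -diff / ell**2 * k                     # cov(f'(x1), f(x2))
    k_dd = (1.0 / ell**2 - diff**2 / ell**4) * k  # cov(f'(x1), f'(x2))
    return k, k_fd, k_df, k_dd

def gek_predict(x_train, y, dy, x_test, ell=0.3, nugget=1e-10):
    """Posterior mean of a zero-mean gradient-enhanced Kriging emulator."""
    k, k_fd, k_df, k_dd = se_blocks(x_train, x_train, ell)
    K = np.block([[k, k_fd], [k_df, k_dd]])   # joint covariance of [f; f'] at the data
    K += nugget * np.eye(K.shape[0])          # bigger nugget: better conditioning, less exact interpolation
    ks, ks_fd, _, _ = se_blocks(x_test, x_train, ell)
    cross = np.hstack([ks, ks_fd])            # cov(f(x_test), [f; f'])
    return cross @ np.linalg.solve(K, np.concatenate([y, dy]))

x_train = np.linspace(0.0, 1.0, 6)
y, dy = np.sin(3.0 * x_train), 3.0 * np.cos(3.0 * x_train)
x_test = np.linspace(0.0, 1.0, 101)
for nug in (1e-12, 1e-4):
    err = np.abs(gek_predict(x_train, y, dy, x_test, nugget=nug) - np.sin(3.0 * x_test)).max()
    print(f"nugget={nug:g}  max abs prediction error={err:.2e}")
```

Derivative observations make the joint covariance matrix larger and more ill-conditioned than in ordinary Kriging, which is one concrete form of the numerical side of the trade-off.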
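The construction of negative dependence used in the talk is not described here; antithetic pairing of batches is one generic way to induce it. The toy sketch below, with a hypothetical newsvendor-style cost, evaluates the objective at a fixed decision and compares the variance of the batch-averaged sample-average estimate under independent versus antithetically paired batches.

```python
import numpy as np

def cost(x, demand):
    """Hypothetical newsvendor-style cost: order quantity x, demand uniform on [0, 1]."""
    return 0.3 * x - np.minimum(x, demand)

def batched_saa(x, n_per_batch, n_batches, antithetic, rng):
    """Average the per-batch sample-average cost over n_batches batches.

    With antithetic=True, batches come in pairs driven by U and 1 - U, so the
    paired batch averages are negatively correlated."""
    batch_means = []
    for _ in range(n_batches // 2):
        u = rng.random(n_per_batch)
        batch_means.append(cost(x, u).mean())
        u2 = 1.0 - u if antithetic else rng.random(n_per_batch)
        batch_means.append(cost(x, u2).mean())
    return np.mean(batch_means)

rng = np.random.default_rng(1)
x, reps = 0.6, 2000
indep = [batched_saa(x, 50, 10, False, rng) for _ in range(reps)]
anti = [batched_saa(x, 50, 10, True, rng) for _ in range(reps)]
print("independent batches, estimator variance:  ", np.var(indep))
print("negatively dependent batches, variance:   ", np.var(anti))
```

Because the cost is monotone in the uniform input, the paired batches are negatively correlated and the averaged sample-average objective has noticeably smaller variance, which is the basic mechanism behind variance reduction in sample average approximation.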