Objective-Sensitive Principal Component Analysis for High-Dimensional Inverse Problems

We present a novel approach to adaptive, differentiable parameterization of large-scale random fields. Coupled with any gradient-based optimization algorithm, the approach can be applied to a variety of optimization problems, including history matching. The technique is based on principal component analysis (PCA) but modifies the purely data-driven basis of principal components by taking the behavior of the objective function into account. To define an efficient encoding, Gradient-Sensitive PCA uses the gradient of the objective function with respect to the model parameters. We propose computationally efficient implementations of the technique, two of which are based on stationary perturbation theory (SPT). The optimality, correctness, and low computational cost of the new encoding approach are tested, verified, and discussed. Three algorithms for optimal parameter decomposition are presented and applied to the objective function of a 2D synthetic history-matching problem. The results demonstrate improvements in encoding quality with respect to objective-function minimization and the distributional patterns of the target field. Possible applications and extensions are proposed.
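To make the parameterization concrete, the sketch below is a minimal Python/NumPy illustration of the general idea: a random-field ensemble is encoded with a truncated PCA basis, and that basis is then adjusted using the gradient of an objective function. The field generator, the toy quadratic objective, and the particular basis-update rule are assumptions chosen for demonstration only; they are not the paper's three decomposition algorithms or its SPT-based implementations.

```python
# Minimal sketch (illustrative assumptions, not the paper's exact algorithms):
# PCA parameterization of a random-field ensemble plus a simple
# "objective-sensitive" basis update driven by the objective gradient.
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(0)
nx, ny, n_real, n_comp = 32, 32, 200, 10   # grid size, ensemble size, retained PCs

# Build an ensemble of smooth 2D random fields (stand-in for a geological prior).
ensemble = np.stack([
    gaussian_filter(rng.standard_normal((nx, ny)), sigma=4).ravel()
    for _ in range(n_real)
])                                          # shape (n_real, nx*ny)

# Standard PCA encoding: field m(xi) = mean + Phi @ xi, with Phi the leading PCs.
m_mean = ensemble.mean(axis=0)
U, s, Vt = np.linalg.svd(ensemble - m_mean, full_matrices=False)
Phi = Vt[:n_comp].T * (s[:n_comp] / np.sqrt(n_real - 1))   # scaled basis

def decode(xi):
    """Map low-dimensional coefficients xi back to the full field."""
    return m_mean + Phi @ xi

# Toy objective (a history-matching misfit would go here) and its gradient
# with respect to the full field m.
target = decode(rng.standard_normal(n_comp))
def objective_grad(m):
    return m - target              # gradient of 0.5 * ||m - target||^2

# Illustrative objective-sensitive tweak: append the component of the gradient
# that the data-driven PCA basis cannot represent, after normalization.
g = objective_grad(decode(np.zeros(n_comp)))
residual = g - Phi @ np.linalg.lstsq(Phi, g, rcond=None)[0]
if np.linalg.norm(residual) > 1e-10:
    Phi_os = np.column_stack([Phi, residual / np.linalg.norm(residual)])
else:
    Phi_os = Phi
print("PCA basis:", Phi.shape, "objective-sensitive basis:", Phi_os.shape)
```

In this toy setup the augmented basis guarantees that the objective gradient at the reference point lies in the span of the encoding, which is one simple way to make a data-driven basis sensitive to the optimization target; the paper's algorithms pursue the same goal with more careful, computationally efficient constructions.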