Stationarity is often an unrealistic prior assumption for Gaussian process regression. One solution is to predefine an explicit nonstationary covariance function, but such covariance functions can be difficult to specify and require detailed prior knowledge of the nonstationarity. We propose the Gaussian process product model (GPPM), which models data as the pointwise product of two latent Gaussian processes to nonparametrically infer nonstationary variations of amplitude. This approach differs from other nonparametric approaches to covariance function inference in that it operates on the outputs rather than the inputs, resulting in a significant reduction in the computational cost and the data required for inference. We present an approximate inference scheme using Expectation Propagation. This variational approximation yields convenient GP hyperparameter selection and compact approximate predictive distributions.
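The abstract's central idea is that multiplying a quickly varying latent function by a slowly varying latent amplitude function yields nonstationary behaviour even when both priors are stationary. The sketch below only illustrates that generative intuition by sampling two GP priors and taking their pointwise product; the kernel choices and hyperparameters are illustrative assumptions, and the paper's actual likelihood, amplitude parameterization, and Expectation Propagation inference are not reproduced here.

```python
import numpy as np

def rbf_kernel(x, lengthscale, variance):
    """Squared-exponential covariance matrix for 1-D inputs x."""
    d = x[:, None] - x[None, :]
    return variance * np.exp(-0.5 * (d / lengthscale) ** 2)

rng = np.random.default_rng(0)
x = np.linspace(0.0, 10.0, 200)

# Latent "signal" process f and latent "amplitude" process g, each drawn
# from its own stationary GP prior (hyperparameters chosen for illustration).
K_f = rbf_kernel(x, lengthscale=0.5, variance=1.0)
K_g = rbf_kernel(x, lengthscale=3.0, variance=1.0)
jitter = 1e-8 * np.eye(len(x))

f = rng.multivariate_normal(np.zeros(len(x)), K_f + jitter)
g = rng.multivariate_normal(np.zeros(len(x)), K_g + jitter)

# Pointwise product: the slowly varying g modulates the amplitude of f,
# producing a nonstationary sample path from two stationary priors.
y = f * g
```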
@conference{adams2008gppm,
  year      = {2008},
  author    = {Adams, Ryan P. and Stegle, Oliver},
  title     = {{G}aussian Process Product Models for Nonparametric Nonstationarity},
  booktitle = {Proceedings of the 25th International Conference on Machine Learning (ICML)},
  location  = {Helsinki, Finland},
  pages     = {1--8},
  keywords  = {Gaussian processes, Bayesian nonparametrics, Bayesian methods, variational inference, ICML}
}