Kernel Methods for Nonparametric Bayesian Inference of Probability Densities and Point Processes

Adams, R. P. (2009). Kernel Methods for Nonparametric Bayesian Inference of Probability Densities and Point Processes [PhD thesis]. University of Cambridge.
Nonparametric kernel methods for estimation of probability densities and point process intensities have long been of interest to researchers in statistics and machine learning. Frequentist kernel methods are widely used, but provide only a point estimate of the unknown density. Additionally, in frequentist kernel density methods, it can be difficult to select appropriate kernel parameters. The Bayesian approach to inference potentially resolves both of these deficiencies, by providing a distribution over the unknowns and enabling a principled approach to kernel selection. Constructing a Bayesian nonparametric kernel density method has proven to be difficult, however, due to the need to integrate over an infinite-dimensional random function in order to evaluate the likelihood. To avoid this intractability, all Bayesian kernel density methods to date have either used a crippled model or a finite-dimensional approximation.

Recent advances in Markov chain Monte Carlo methods have improved the situation for these doubly-intractable posterior distributions, however. If data can be generated exactly from the model, then it is possible to perform inference without computing the intractable likelihood. I propose two new kernel-based models that enable an exact generative procedure: the Gaussian process density sampler (GPDS) for probability density functions, and the sigmoidal Gaussian Cox process (SGCP) for the Poisson process. With generative priors, I show how it is now possible to construct two different kinds of Markov chains for inference in these models. These Markov chains have the desired posterior distribution as their equilibrium distributions, and, despite a parameter space with uncountably many dimensions, require only a finite amount of computation to simulate. The GPDS and SGCP, and the associated inference procedures, are the first kernel-based nonparametric Bayesian methods that allow inference without a finite-dimensional approximation.

I also present several additional kernel-based models for data that extend the Gaussian process density sampler and sigmoidal Gaussian Cox process to other situations. The Archipelago model extends the GPDS to address the task of semi-supervised learning, where a flexible density estimate can improve the performance of a classifier when unlabeled data are available. I also generalise the SGCP to enable a nonparametric inhomogeneous Neyman–Scott process, and present a soft-core generalisation of the Matérn repulsive process that similarly allows non-approximate inference via Markov chain Monte Carlo.
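As a rough illustration of the "exact generative procedure" the abstract refers to, the sketch below draws a point set from a one-dimensional sigmoidal-Gaussian-process Cox process by thinning: propose events from a homogeneous Poisson process at an upper-bound rate, then keep each one with a probability given by a sigmoid-squashed Gaussian process draw. This is a minimal sketch under my own assumptions (squared-exponential kernel, the function name sample_sgcp, and all parameter values are illustrative), not code from the thesis.

import numpy as np

def sample_sgcp(lam_star, T, lengthscale=1.0, variance=1.0, rng=None):
    """Sketch: draw events on [0, T] from a sigmoidal GP Cox process by thinning.

    Assumptions (not from the thesis): 1-D domain, squared-exponential kernel,
    and illustrative parameter values.
    """
    rng = np.random.default_rng() if rng is None else rng
    # 1. Homogeneous Poisson process at the upper-bound rate lam_star.
    n = rng.poisson(lam_star * T)
    x = rng.uniform(0.0, T, size=n)
    # 2. Joint draw of the latent GP at the proposed locations.
    d = x[:, None] - x[None, :]
    K = variance * np.exp(-0.5 * (d / lengthscale) ** 2) + 1e-8 * np.eye(n)
    g = np.linalg.cholesky(K) @ rng.standard_normal(n)
    # 3. Thin: keep each proposal with probability sigma(g), so the retained
    #    points follow an inhomogeneous Poisson process with intensity
    #    lam_star * sigma(g(x)).
    keep = rng.uniform(size=n) < 1.0 / (1.0 + np.exp(-g))
    return np.sort(x[keep])

events = sample_sgcp(lam_star=20.0, T=10.0)

Because the simulation only ever evaluates the Gaussian process at the finitely many proposed points, a finite amount of computation yields an exact draw, which is the property the abstract's inference procedures exploit.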
@phdthesis{adams2009thesis,
  year = {2009},
  author = {Adams, Ryan P.},
  title = {Kernel Methods for Nonparametric Bayesian Inference of Probability Densities and Point Processes},
  month = oct,
  school = {University of Cambridge},
  address = {Cambridge, UK},
  keywords = {Gaussian processes, Bayesian nonparametrics, Bayesian methods, Markov chain Monte Carlo}
}