Multi-Task Bayesian Optimization

Swersky, K., Snoek, J., & Adams, R. P. (2013). Multi-Task Bayesian Optimization. Advances in Neural Information Processing Systems (NIPS) 26.
Bayesian optimization has recently been proposed as a framework for automatically tuning the hyperparameters of machine learning models and has been shown to yield state-of-the-art performance with impressive ease and efficiency. In this paper, we explore whether it is possible to transfer the knowledge gained from previous optimizations to new tasks in order to find optimal hyperparameter settings more efficiently. Our approach is based on extending multi-task Gaussian processes to the framework of Bayesian optimization. We show that this method significantly speeds up the optimization process when compared to the standard single-task approach. We further propose a straightforward extension of our algorithm in order to jointly minimize the average error across multiple tasks and demonstrate how this can be used to greatly speed up k-fold cross-validation. Lastly, we propose an adaptation of a recently developed acquisition function, entropy search, to the cost-sensitive, multi-task setting. We demonstrate the utility of this new acquisition function by leveraging a small dataset to explore hyperparameter settings for a large dataset. Our algorithm dynamically chooses which dataset to query in order to yield the most information per unit cost.
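The core modeling idea in the abstract — extending Gaussian processes to share information across tasks — is commonly realized with an intrinsic coregionalization kernel, where the covariance between two observations factors into a task-similarity term and an input kernel. The sketch below is a minimal illustration of that construction using NumPy; the task-similarity matrix `B`, the RBF lengthscale, and all data values are illustrative assumptions, not from the paper (which learns the task covariance from data).

```python
import numpy as np

def rbf(X1, X2, lengthscale=1.0):
    """Squared-exponential kernel over the hyperparameter inputs x."""
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / lengthscale**2)

def multitask_kernel(X1, t1, X2, t2, B, lengthscale=1.0):
    """Intrinsic coregionalization: K((x,t),(x',t')) = B[t,t'] * k_x(x,x').

    B is a positive semi-definite task-similarity matrix. Here it is
    assumed given; in practice it is learned, e.g. via a Cholesky
    parameterization, alongside the other GP hyperparameters.
    """
    return B[np.ix_(t1, t2)] * rbf(X1, X2, lengthscale)

# Two tasks with assumed correlation 0.8: cheap-task observations
# then inform the posterior on the expensive target task.
B = np.array([[1.0, 0.8],
              [0.8, 1.0]])

# Toy training data: (x, task) pairs with objective values y.
X = np.array([[0.1], [0.5], [0.9]])
t = np.array([0, 0, 1])              # task index of each observation
y = np.array([0.2, 0.7, 0.6])

K = multitask_kernel(X, t, X, t, B) + 1e-6 * np.eye(3)  # jitter for stability
alpha = np.linalg.solve(K, y)

# Posterior mean at a new point, evaluated on task 1 (the target task):
Xs = np.array([[0.5]])
ks = multitask_kernel(Xs, np.array([1]), X, t, B)
mu = ks @ alpha
```

Because the kernel couples tasks through `B`, the posterior mean on task 1 at `x = 0.5` is pulled toward the nearby task-0 observation, which is exactly the knowledge transfer the abstract describes.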
@inproceedings{swersky2013multitask,
  author = {Swersky, Kevin and Snoek, Jasper and Adams, Ryan P.},
  title = {Multi-Task {B}ayesian Optimization},
  booktitle = {Advances in Neural Information Processing Systems (NIPS) 26},
  year = {2013},
  keywords = {Gaussian processes, Bayesian optimization, Bayesian methods, NIPS}
}