Many datasets are naturally modeled by an unobserved hierarchical structure. In this paper we propose a flexible nonparametric prior over unknown data hierarchies. The approach uses nested stick-breaking processes to allow for trees of unbounded width and depth, where data can live at any node and are infinitely exchangeable. One can view our model as providing infinite mixtures where the components have a dependency structure corresponding to an evolutionary diffusion down a tree. By using a stick-breaking approach, we can apply Markov chain Monte Carlo methods based on slice sampling to perform Bayesian inference and simulate from the posterior distribution on trees. We apply our method to hierarchical clustering of images and topic modeling of text data.
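The stick-breaking process mentioned in the abstract can be illustrated with a minimal sketch. The snippet below samples a truncated one-level stick-breaking (GEM) weight sequence, the basic building block that the paper nests at every tree node to obtain weights over a tree of unbounded width and depth. The function name, truncation level, and parameter values here are illustrative assumptions, not the paper's full nested construction.

```python
import random

def stick_breaking(alpha, num_sticks, rng=None):
    """Sample a truncated stick-breaking (GEM(alpha)) weight sequence.

    Each weight is a Beta(1, alpha) fraction of the stick remaining
    after the earlier breaks. This is the one-level building block;
    the paper interleaves such breaks for depth and branching to
    define mixture weights over an unbounded tree.
    """
    if rng is None:
        rng = random.Random(0)
    weights = []
    remaining = 1.0
    for _ in range(num_sticks):
        frac = rng.betavariate(1.0, alpha)  # proportion of the remaining stick
        weights.append(remaining * frac)
        remaining *= 1.0 - frac
    return weights

weights = stick_breaking(alpha=2.0, num_sticks=20)
# All weights are positive and they sum to less than 1; the leftover
# stick mass corresponds to components beyond the truncation level.
```

Larger `alpha` spreads mass over more sticks; smaller `alpha` concentrates it on the first few, which in the nested setting controls how bushy or deep the inferred tree tends to be.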
@conference{adams2010tree,
  title     = {Tree-Structured Stick Breaking for Hierarchical Data},
  author    = {Adams, Ryan P. and Ghahramani, Zoubin and Jordan, Michael I.},
  booktitle = {Advances in Neural Information Processing Systems (NIPS) 23},
  year      = {2010},
  note      = {arXiv:1006.1062 [stat.ML]},
  keywords  = {Bayesian methods, Bayesian nonparametrics, Dirichlet processes, NIPS}
}