*arXiv preprint arXiv:1802.03451*.

Many important problems are characterized by the eigenvalues of a large matrix. For example, the difficulty of many optimization problems, such as those arising from fitting large models in statistics and machine learning, can be investigated via the spectrum of the Hessian of the empirical loss function. Network data can be understood via the eigenstructure of a graph Laplacian matrix using spectral graph theory. Quantum simulations and other many-body problems are often characterized via the eigenvalues of the solution space, as are various dynamical systems. However, naive eigenvalue estimation is computationally expensive even when the matrix can be represented explicitly; in many of these situations the matrix is so large as to be available only implicitly via products with vectors. Even worse, one may only have noisy estimates of such matrix-vector products. In this work, we combine several different techniques for randomized estimation and show that it is possible to construct unbiased estimators to answer a broad class of questions about the spectra of such implicit matrices, even in the presence of noise. We validate these methods on large-scale problems in which graph theory and random matrix theory provide ground truth.
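To make the matvec-only setting concrete, here is a minimal sketch of the classic Hutchinson trace estimator, one of the standard randomized building blocks for querying an implicit matrix. This is an illustrative example, not the paper's full method; the function names and interface are assumptions.

```python
import numpy as np

def hutchinson_trace(matvec, dim, num_samples=100, rng=None):
    """Estimate tr(A) using only matrix-vector products with A.

    For i.i.d. Rademacher vectors z (entries +/-1 with equal probability),
    E[z^T A z] = tr(A), so averaging quadratic forms gives an unbiased
    estimate without ever forming A explicitly.
    """
    rng = np.random.default_rng(rng)
    total = 0.0
    for _ in range(num_samples):
        z = rng.choice([-1.0, 1.0], size=dim)
        total += z @ matvec(z)  # one implicit-matrix query per sample
    return total / num_samples

# Usage with a matrix available only through a matvec callable:
A = np.diag([1.0, 2.0, 3.0, 4.0, 5.0])
estimate = hutchinson_trace(lambda v: A @ v, dim=5, num_samples=10, rng=0)
```

Note that for a diagonal matrix the estimator is exact on every sample (since each z_i^2 = 1); for general matrices the variance decreases as 1/num_samples.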

@article{adams2018estimating,
  title   = {Estimating the spectral density of large implicit matrices},
  author  = {Adams, Ryan P. and Pennington, Jeffrey and Johnson, Matthew J. and Smith, Jamie and Ovadia, Yaniv and Patton, Brian and Saunderson, James},
  journal = {arXiv preprint arXiv:1802.03451},
  year    = {2018}
}