Convex Relaxation and Estimation of High-Dimensional Matrices
Problems that require estimating high-dimensional matrices from noisy observations arise frequently in statistics and machine learning. Examples include dimensionality reduction methods (e.g., principal components and canonical correlation analysis), collaborative filtering and matrix completion (e.g., Netflix and other recommender systems), multivariate regression, estimation of time-series models, and graphical model learning. When the sample size is smaller than the matrix dimensions, all of these problems are ill-posed, so some form of low-dimensional structure must be imposed in order to obtain meaningful estimates. In recent years, relaxations based on the nuclear norm and other convex matrix regularizers have become popular. By framing a broad class of problems as special cases of matrix regression, we present a single theoretical result that provides guarantees on the accuracy of such convex relaxations. Our general result can be specialized to obtain various non-asymptotic bounds, among them sharp rates for noisy forms of matrix completion, matrix compression, and matrix decomposition. In all of these cases, information-theoretic methods can be used to show that our rates are minimax-optimal, and thus cannot be substantially improved upon by any algorithm, regardless of its computational complexity.
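
As a concrete illustration of the kind of nuclear norm relaxation the abstract refers to, the sketch below solves a noisy matrix-completion problem by proximal gradient descent with singular value soft-thresholding. This is a generic textbook method under an assumed squared loss on the observed entries, not necessarily the estimator analyzed in the talk; the names svt, complete, lam, and step are illustrative.

import numpy as np

def svt(M, tau):
    # Singular value soft-thresholding: the proximal operator of tau * ||.||_* (nuclear norm).
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return (U * np.maximum(s - tau, 0.0)) @ Vt

def complete(Y, mask, lam=0.5, step=1.0, iters=300):
    # Illustrative sketch: minimize 0.5 * ||mask * (X - Y)||_F^2 + lam * ||X||_*
    # by proximal gradient descent; step = 1.0 is safe because the loss is 1-smooth.
    X = np.zeros_like(Y)
    for _ in range(iters):
        grad = mask * (X - Y)                 # gradient of the squared loss on observed entries
        X = svt(X - step * grad, step * lam)  # gradient step, then nuclear norm prox
    return X

# Usage: recover a rank-2 matrix from roughly half of its entries, observed with noise.
rng = np.random.default_rng(0)
L = rng.standard_normal((40, 2)) @ rng.standard_normal((2, 30))
mask = (rng.random(L.shape) < 0.5).astype(float)
Y = mask * (L + 0.1 * rng.standard_normal(L.shape))
X_hat = complete(Y, mask)
print(np.linalg.norm(X_hat - L) / np.linalg.norm(L))  # relative estimation error

The nuclear norm plays the role that the l1-norm plays in sparse vector estimation: its proximal operator shrinks singular values toward zero, so the iterates are driven toward low-rank solutions.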
Date Found: May 06, 2011
Date Produced: May 06, 2011