Sample Complexity of Testing the Manifold Hypothesis
The hypothesis that high-dimensional data tends to lie in the vicinity of a low-dimensional manifold is the basis of a collection of methodologies termed Manifold Learning. In this paper, we study statistical aspects of the question of fitting a manifold with nearly optimal least squared error. Given upper bounds on the dimension, volume, and curvature, we show that Empirical Risk Minimization can produce a nearly optimal manifold using a number of random samples that is independent of the ambient dimension of the space in which the data lie. We obtain an upper bound on the required number of samples that depends polynomially on the curvature, exponentially on the intrinsic dimension, and linearly on the intrinsic volume. For constant error, we prove a matching minimax lower bound on the sample complexity, showing that this dependence on intrinsic dimension, volume, and curvature is unavoidable.

Whether the known lower bound for the sample complexity of Empirical Risk Minimization on k-means applied to data in a unit ball of arbitrary dimension is tight has been an open question since 1997. Here ε is the desired bound on the error and δ is a bound on the probability of failure. We improve the best currently known upper bound. Based on these results, we devise a simple algorithm for k-means and another that uses a family of convex programs to fit a piecewise linear curve of a specified length to high-dimensional data, where the sample complexity is independent of the ambient dimension.
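As an illustrative aside (not part of the abstract), the dimension-independence claim for k-means is easy to probe numerically: hold the subsample size m fixed, run an empirical risk minimizer on the subsample, and measure the resulting cost on the full data as the ambient dimension grows. The sketch below assumes numpy and scikit-learn, uses KMeans as a stand-in empirical risk minimizer, and all sizes (k = 5, m = 500, the synthetic near-curve data) are arbitrary choices for the demonstration.

```python
# Illustrative sketch only (not the paper's algorithm): ERM for k-means on a
# fixed-size random subsample, probing the claim that the sample size needed
# for near-optimal cost does not grow with the ambient dimension.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

def kmeans_cost(X, centers):
    """Mean squared distance from each point to its nearest center."""
    d2 = np.stack([((X - c) ** 2).sum(axis=1) for c in centers])
    return d2.min(axis=0).mean()

k, n, m = 5, 10000, 500  # clusters, population size, ERM subsample size
for ambient_dim in (10, 100, 1000):
    # Points near a 1-dimensional curve embedded in R^ambient_dim, scaled
    # into the unit ball: a stand-in for data obeying the manifold hypothesis.
    t = rng.uniform(0.0, 1.0, size=(n, 1))
    direction = rng.standard_normal((1, ambient_dim))
    X = t @ direction + 0.01 * rng.standard_normal((n, ambient_dim))
    X /= np.linalg.norm(X, axis=1).max()

    sample = X[rng.choice(n, size=m, replace=False)]  # m is held fixed
    erm = KMeans(n_clusters=k, n_init=10, random_state=0).fit(sample)
    print(f"dim={ambient_dim:5d}  cost on all {n} points: "
          f"{kmeans_cost(X, erm.cluster_centers_):.5f}")
```

If the claim holds, the full-data cost stays near optimal as ambient_dim grows even though m never changes.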
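The paper's curve-fitting procedure solves a family of convex programs; that construction is not reproduced here. As a loose, hedged stand-in, the sketch below alternates between (a) assigning each data point to the nearest vertex of a polyline and (b) solving a single convex program (via cvxpy, assumed available) that repositions the vertices to minimize squared error subject to a total-length budget L. The vertex count m, the budget L, and the data are illustrative choices, and nearest-vertex assignment is a simplification of projecting onto the curve.

```python
# Hedged stand-in, not the authors' construction: fit a piecewise linear
# curve of total length at most L by alternating nearest-vertex assignment
# with a convex program over vertex positions. Assumes numpy and cvxpy.
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(1)

# Noisy samples around a quarter circle in the plane (small d, for speed).
theta = rng.uniform(0.0, np.pi / 2, size=200)
X = np.c_[np.cos(theta), np.sin(theta)] + 0.02 * rng.standard_normal((200, 2))
n, d = X.shape

m, L = 8, 2.0                               # polyline vertices, length budget
V = X[np.argsort(X[:, 0])[:: n // m][:m]]   # crude initial vertex positions

for _ in range(10):
    # (a) Assign each data point to its nearest current vertex.
    assign = np.argmin(((X[:, None, :] - V[None, :, :]) ** 2).sum(2), axis=1)
    S = np.zeros((n, m)); S[np.arange(n), assign] = 1.0  # selection matrix

    # (b) Convex program: move vertices to fit their assigned points while
    # keeping the polyline's total length within the budget L.
    Vvar = cp.Variable((m, d))
    length = sum(cp.norm(Vvar[j + 1] - Vvar[j]) for j in range(m - 1))
    prob = cp.Problem(cp.Minimize(cp.sum_squares(S @ Vvar - X)), [length <= L])
    prob.solve()
    V = Vvar.value

print("final mean squared error:", ((S @ V - X) ** 2).sum(axis=1).mean())
```

Only step (b) is convex here; the outer alternation is a heuristic, so this sketch carries none of the paper's guarantees.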