Systems | Information | Learning | Optimization
 

SILO: Bottom-up manifold learning with low distortion

Title: Bottom-up manifold learning with low distortion

Abstract: In this talk I will present Low Distortion Local Eigenmaps (LDLE), a “bottom-up” manifold learning framework that constructs a set of low-distortion local views of a dataset in lower dimensions and registers them to obtain a global embedding. The local views are constructed by selecting subsets of the global eigenvectors of the graph Laplacian that are locally orthogonal and thus form a near-orthogonal local basis for the data. The global embedding is obtained by rigid alignment of the local views, solved iteratively so that manifolds can be embedded into their intrinsic dimension by “tearing them apart,” including manifolds without boundary and non-orientable manifolds. We also define a new measure, global distortion, to evaluate embeddings in low dimensions, and show that Riemannian Gradient Descent (RGD) converges to an embedding with guaranteed low global distortion. Compared with competing manifold learning and data visualization approaches, we demonstrate that LDLE achieves the lowest local and global distortion on both real and synthetic datasets.
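To make the abstract's starting point concrete, the sketch below computes the "global eigenvectors of the graph Laplacian" from which LDLE selects its local subsets. This is only a plain Laplacian-eigenmaps-style embedding of a synthetic circle, written with hypothetical parameter choices (k-nearest-neighbor graph, Gaussian weights); LDLE's local-view selection and rigid alignment steps are not shown.

```python
import numpy as np

# Assumed illustrative setup: 200 points on the unit circle.
rng = np.random.default_rng(0)
t = np.sort(rng.uniform(0, 2 * np.pi, 200))
X = np.c_[np.cos(t), np.sin(t)]

# k-nearest-neighbor graph with Gaussian edge weights (hypothetical choices).
D2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)   # squared distances
k = 8
idx = np.argsort(D2, axis=1)[:, 1:k + 1]              # skip self-neighbor
W = np.zeros_like(D2)
rows = np.repeat(np.arange(len(X)), k)
W[rows, idx.ravel()] = np.exp(-D2[rows, idx.ravel()] / D2.mean())
W = np.maximum(W, W.T)                                # symmetrize the graph

# Unnormalized graph Laplacian and its eigendecomposition.
L = np.diag(W.sum(1)) - W
vals, vecs = np.linalg.eigh(L)                        # ascending eigenvalues

# Low-frequency eigenvectors (skipping the constant one) give global
# coordinates; LDLE instead picks different subsets per local neighborhood.
embedding = vecs[:, 1:3]
print(embedding.shape)                                # (200, 2)
```

For a connected graph the smallest eigenvalue is zero with a constant eigenvector, which is why the embedding starts from the second eigenvector.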

March 22 @ 12:30 pm (1h)

Orchard View Room, Virtual

Gal Mishne
