
K-Deep Simplex: Deep Manifold Learning via Local Dictionaries

Abstract

We propose K-Deep Simplex (KDS), which, given a set of data points, learns a dictionary of synthetic landmarks together with representation coefficients supported on the simplex. KDS integrates manifold learning and sparse coding/dictionary learning: its objective combines a reconstruction term, as in classical dictionary learning, with a novel locally weighted $\ell_1$ penalty that encourages each data point to represent itself as a convex combination of nearby landmarks. We solve the proposed optimization program using alternating minimization and design an efficient, interpretable autoencoder using algorithm unrolling. We analyze the proposed program theoretically by relating the weighted $\ell_1$ penalty in KDS to a weighted $\ell_0$ program. Assuming that the data are generated from a Delaunay triangulation, we prove the equivalence of the weighted $\ell_1$ and weighted $\ell_0$ programs. If the representation coefficients are given, we prove that the resulting dictionary is unique. Further, we show that low-dimensional representations can be obtained efficiently from the covariance of the coefficient matrix. We apply KDS to the unsupervised clustering problem and prove theoretical performance guarantees. Experiments show that the algorithm is highly efficient and performs competitively on synthetic and real data sets.
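The objective sketched in the abstract can be written as minimizing $\frac{1}{2}\|Y - AX\|_F^2 + \lambda \sum_{i,j} \|y_j - a_i\|^2 X_{ij}$ over landmarks $A$ and simplex-constrained coefficients $X$. Below is a minimal, hypothetical NumPy sketch of one way such an alternating scheme could look: projected gradient steps on $X$ (with Euclidean projection of each column onto the probability simplex) alternated with gradient steps on $A$. This is an illustrative reconstruction, not the authors' implementation; the function names, step size, and initialization are assumptions.

```python
import numpy as np

def project_simplex(V):
    """Project each column of V onto the probability simplex
    (standard sort-and-threshold algorithm)."""
    k, n = V.shape
    U = np.sort(V, axis=0)[::-1]                 # sort columns descending
    css = np.cumsum(U, axis=0) - 1.0
    ind = np.arange(1, k + 1)[:, None]
    rho = np.count_nonzero(U - css / ind > 0, axis=0)
    theta = css[rho - 1, np.arange(n)] / rho     # per-column threshold
    return np.maximum(V - theta, 0.0)

def kds_sketch(Y, k, lam=0.1, iters=200, lr=1e-2, seed=0):
    """Hypothetical sketch of alternating minimization for
    0.5 * ||Y - A X||_F^2 + lam * sum_ij ||y_j - a_i||^2 * X_ij,
    with each column of X constrained to the simplex."""
    rng = np.random.default_rng(seed)
    d, n = Y.shape
    A = Y[:, rng.choice(n, k, replace=False)].copy()  # init landmarks from data
    X = np.full((k, n), 1.0 / k)
    for _ in range(iters):
        # locality weights: W[i, j] = ||y_j - a_i||^2
        W = ((A[:, :, None] - Y[:, None, :]) ** 2).sum(axis=0)
        # projected gradient step in X
        G_X = A.T @ (A @ X - Y) + lam * W
        X = project_simplex(X - lr * G_X)
        # gradient step in A (the weighted penalty also depends on A)
        R = A @ X - Y
        G_A = R @ X.T + 2.0 * lam * (A * X.sum(axis=1) - Y @ X.T)
        A -= lr * G_A
    return A, X
```

After fitting, each column of X gives convex-combination weights over nearby landmarks, so X can serve directly as a sparse, interpretable embedding of the data.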
