Since its introduction in 2000, locally linear embedding (LLE) has been widely applied in data science. We provide an asymptotic analysis of the LLE under the manifold setup. We show that, for a general manifold, the LLE may not asymptotically recover the Laplace-Beltrami operator, and the limit may depend on the non-uniform sampling density, unless a proper regularization is chosen. We also derive the corresponding kernel function, which indicates that the LLE is not a Markov process. A comparison with other commonly applied nonlinear algorithms, particularly the diffusion map, is provided, and its relationship with locally linear regression is also discussed.
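To make the role of the regularization concrete, the following is a minimal sketch of the standard LLE algorithm (Roweis and Saul, 2000); the function name lle, the parameter reg, and the use of scikit-learn's NearestNeighbors are illustrative assumptions and are not taken from the paper.

```python
# Minimal sketch of standard LLE; the regularization `reg` shifts the local
# Gram matrix, and (per the paper) the choice of this regularization affects
# which operator is recovered in the large-sample limit.
import numpy as np
from sklearn.neighbors import NearestNeighbors
from scipy.linalg import eigh

def lle(X, n_neighbors=10, n_components=2, reg=1e-3):
    """Embed the rows of X (n_samples x n_features) into n_components dims."""
    n = X.shape[0]
    nbrs = NearestNeighbors(n_neighbors=n_neighbors + 1).fit(X)
    _, idx = nbrs.kneighbors(X)            # idx[:, 0] is the point itself
    W = np.zeros((n, n))
    for i in range(n):
        neighbors = idx[i, 1:]             # drop the point itself
        Z = X[neighbors] - X[i]            # neighbors centered at x_i
        C = Z @ Z.T                        # local Gram matrix
        # Regularized barycentric weights: C is singular when the number of
        # neighbors exceeds the ambient dimension, so a trace-scaled shift
        # is added before solving for the reconstruction weights.
        C += reg * np.trace(C) * np.eye(n_neighbors)
        w = np.linalg.solve(C, np.ones(n_neighbors))
        W[i, neighbors] = w / w.sum()      # weights sum to one
    # Bottom eigenvectors of M = (I - W)^T (I - W) give the embedding;
    # the near-constant eigenvector (eigenvalue ~ 0) is discarded.
    M = (np.eye(n) - W).T @ (np.eye(n) - W)
    _, vecs = eigh(M)
    return vecs[:, 1:n_components + 1]

# Usage: embed a noisy 3-D Swiss roll into the plane.
if __name__ == "__main__":
    t = 3 * np.pi * (1 + 2 * np.random.rand(800)) / 2
    X = np.column_stack([t * np.cos(t), 20 * np.random.rand(800), t * np.sin(t)])
    Y = lle(X, n_neighbors=12, n_components=2)
    print(Y.shape)  # (800, 2)
```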