Convergence of Graph Laplacian with kNN Self-tuned Kernels

Kernelized Gram matrix $W$ constructed from data points $\{x_i\}_{i=1}^N$ as $W_{ij} = k_0(\|x_i - x_j\|^2/\sigma^2)$ is widely used in graph-based geometric data analysis and unsupervised learning. An important question is how to choose the kernel bandwidth $\sigma$, and a common practice called self-tuned kernel adaptively sets a $\sigma_i$ at each point $x_i$ by the $k$-nearest neighbor (kNN) distance. When the $x_i$'s are sampled from a $d$-dimensional manifold embedded in a possibly high-dimensional space, unlike with fixed-bandwidth kernels, theoretical results on graph Laplacian convergence with self-tuned kernels have been incomplete. This paper proves the convergence of the graph Laplacian operator $L_N$ to the manifold (weighted-)Laplacian for a new family of kNN self-tuned kernels $W^{(\alpha)}_{ij} = k_0\big(\|x_i - x_j\|^2/(\epsilon\hat{\rho}(x_i)\hat{\rho}(x_j))\big)/\hat{\rho}(x_i)^\alpha\hat{\rho}(x_j)^\alpha$, where $\hat{\rho}$ is the bandwidth function estimated by kNN, and the limiting operator is also parametrized by $\alpha$. When $\alpha = 1$, the limiting operator is the weighted manifold Laplacian $\Delta_p$. Specifically, we prove the point-wise convergence of $L_N f$ and convergence of the graph Dirichlet form with rates. Our analysis is based on first establishing a $C^0$ consistency for $\hat{\rho}$ which bounds the relative estimation error $|\hat{\rho} - \bar{\rho}|/\bar{\rho}$ uniformly with high probability, where $\bar{\rho} = p^{-1/d}$ and $p$ is the data density function. Our theoretical results reveal the advantage of the self-tuned kernel over the fixed-bandwidth kernel via a smaller variance error in low-density regions. In the algorithm, no prior knowledge of $d$ or the data density is needed. The theoretical results are supported by numerical experiments on simulated data and hand-written digit image data.
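To make the construction concrete, below is a minimal Python sketch of a kNN self-tuned affinity matrix of the form $W^{(\alpha)}$ and a graph Laplacian built from it. It assumes a Gaussian choice of $k_0$, a plain $k$-th-nearest-neighbor bandwidth estimate, and an unnormalized Laplacian convention; the function name and hyperparameter defaults are illustrative, not the paper's exact estimator or normalization.

```python
import numpy as np
from scipy.spatial.distance import cdist


def self_tuned_laplacian(X, k=16, alpha=1.0, eps=0.05):
    """Sketch of a kNN self-tuned graph Laplacian.

    X: (N, m) array of data points; k: kNN index used to estimate the
    bandwidth function; alpha: density-normalization exponent; eps: global
    scale. Defaults are illustrative, not tuned values from the paper.
    """
    dists = cdist(X, X)                     # pairwise Euclidean distances
    # hat_rho(x_i): distance from x_i to its k-th nearest neighbor
    # (column 0 of the sorted rows is the point itself, at distance 0)
    rho = np.sort(dists, axis=1)[:, k]
    # W_ij = k0(||x_i - x_j||^2 / (eps * rho_i * rho_j)) / (rho_i * rho_j)^alpha,
    # here with the Gaussian kernel k0(r) = exp(-r/4) as one common choice
    W = np.exp(-dists**2 / (4 * eps * np.outer(rho, rho)))
    W /= np.outer(rho, rho) ** alpha
    # unnormalized graph Laplacian (D - W)/eps; normalization conventions
    # for the limiting operator vary in the literature
    D = np.diag(W.sum(axis=1))
    L = (D - W) / eps
    return L, rho
```

As a quick sanity check, applying this to points sampled densely from a closed curve and inspecting the low-frequency eigenvectors of `L` should recover Fourier-like modes; in practice one may also symmetrize or row-normalize `W` depending on the target limiting operator.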