Given a convex and differentiable objective for a real symmetric matrix $\mathbf{M}$ in the positive definite (PD) cone -- used to compute Mahalanobis distances -- we propose a fast general metric learning framework that is entirely projection-free. We first assume that $\mathbf{M}$ resides in a space $\mathcal{S}$ of generalized graph Laplacian matrices corresponding to balanced signed graphs. An $\mathbf{M} \in \mathcal{S}$ that is also PD is called a graph metric matrix. Unlike the low-rank metric matrices common in the literature, $\mathcal{S}$ includes the important diagonal-only matrices as a special case. The key theorem to circumvent full eigen-decomposition and enable fast metric matrix optimization is Gershgorin disc perfect alignment (GDPA): given $\mathbf{M} \in \mathcal{S}$ and a diagonal matrix $\mathbf{S}$, where $S_{ii} = 1/v_i$ and $\mathbf{v}$ is $\mathbf{M}$'s first eigenvector, we prove that the Gershgorin disc left-ends of the similarity transform $\mathbf{B} = \mathbf{S}\mathbf{M}\mathbf{S}^{-1}$ are perfectly aligned at the smallest eigenvalue $\lambda_{\min}$. Using this theorem, we replace the PD cone constraint in the metric learning problem with the tightest possible linear constraints per iteration, so that the alternating optimization of the diagonal / off-diagonal terms in $\mathbf{M}$ can be solved efficiently as linear programs via the Frank-Wolfe method. We update $\mathbf{v}$ using Locally Optimal Block Preconditioned Conjugate Gradient (LOBPCG) with warm start as entries in $\mathbf{M}$ are optimized successively. Experiments show that our graph metric optimization is significantly faster than cone-projection schemes, and produces competitive binary classification performance.
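As a rough illustration of the GDPA statement above (a minimal sketch, not the authors' implementation), the following Python snippet builds a tiny 3-node balanced signed graph, forms a PD generalized graph Laplacian $\mathbf{M}$, computes its first eigenvector $\mathbf{v}$ with a dense eigendecomposition (standing in for LOBPCG), and checks that every Gershgorin disc left-end of $\mathbf{S}\mathbf{M}\mathbf{S}^{-1}$ with $S_{ii} = 1/v_i$ coincides with $\lambda_{\min}$. The toy graph, self-loop weight, and variable names are illustrative assumptions.

```python
# Minimal sketch (illustrative, not the paper's code): numerically verify the
# GDPA property on a tiny balanced signed graph.
import numpy as np

# Balanced signed graph on 3 nodes: positive edge (0,1), negative edge (1,2).
W = np.array([[0.0,  1.0,  0.0],
              [1.0,  0.0, -1.0],
              [0.0, -1.0,  0.0]])

D = np.diag(np.abs(W).sum(axis=1))   # degree matrix built from |w_ij|
M = D - W + 0.5 * np.eye(3)          # generalized graph Laplacian with self-loops 0.5; strictly PD

# First eigenvector v (for the smallest eigenvalue); a dense solver stands in for LOBPCG here.
eigvals, eigvecs = np.linalg.eigh(M)
lam_min, v = eigvals[0], eigvecs[:, 0]   # v has no zero entries for a balanced, connected graph

S = np.diag(1.0 / v)                 # scaling matrix S with S_ii = 1/v_i
B = S @ M @ np.diag(v)               # similarity transform B = S M S^{-1}; same spectrum as M

# Gershgorin disc left-ends of B: diagonal entry (disc center) minus row radius.
radii = np.abs(B).sum(axis=1) - np.abs(np.diag(B))
left_ends = np.diag(B) - radii

print("lambda_min    :", lam_min)    # 0.5 for this toy graph
print("disc left-ends:", left_ends)  # all entries ~0.5: perfectly aligned at lambda_min
```

Keeping these aligned disc left-ends nonnegative is what lets the framework replace the PD cone constraint with per-iteration linear constraints, as described in the abstract.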