
Projection-free Graph-based Classifier Learning using Gershgorin Disc Perfect Alignment

Wai-tian Tan
Guangtao Zhai
Abstract

In semi-supervised graph-based binary classifier learning, a subset of known labels $\hat{x}_i$ is used to infer unknown labels, assuming that the label signal $x$ is smooth with respect to a similarity graph specified by a Laplacian matrix. When restricting labels $x_i$ to binary values, the problem is NP-hard. While a conventional semi-definite programming (SDP) relaxation can be solved in polynomial time using, for example, the alternating direction method of multipliers (ADMM), the complexity of iteratively projecting a candidate matrix $M$ onto the positive semi-definite (PSD) cone ($M \succeq 0$) remains high. In this paper, leveraging a recent linear algebraic theory called Gershgorin disc perfect alignment (GDPA), we propose a fast projection-free method that solves a sequence of linear programs (LP) instead. Specifically, we first recast the SDP relaxation to its SDP dual, where a feasible solution $H \succeq 0$ can be interpreted as a Laplacian matrix corresponding to a balanced signed graph sans the last node. To achieve graph balance, we split the last node into two that respectively contain the original positive and negative edges, resulting in a new Laplacian $\bar{H}$. We repose the SDP dual for solution $\bar{H}$, then replace the PSD cone constraint $\bar{H} \succeq 0$ with linear constraints derived from GDPA -- sufficient conditions to ensure $\bar{H}$ is PSD -- so that the optimization becomes an LP per iteration. Finally, we extract predicted labels from our converged LP solution $\bar{H}$. Experiments show that our algorithm enjoyed a $40\times$ speedup on average over the next fastest scheme while retaining comparable label prediction performance.
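The key substitution in the abstract is replacing the PSD cone constraint with Gershgorin disc conditions, which are linear in the matrix entries. A minimal NumPy sketch of that sufficient condition (not the paper's full algorithm): by the Gershgorin circle theorem, every eigenvalue of a symmetric matrix lies in a disc centered at a diagonal entry with radius equal to the row's off-diagonal absolute sum, and a diagonal similarity transform $B = S H S^{-1}$ preserves eigenvalues while reshaping the discs. If some positive scaling makes every disc left-end nonnegative, $H$ is PSD. The function name and the example matrices below are illustrative choices, not taken from the paper.

```python
import numpy as np

def gershgorin_psd_sufficient(H, s=None):
    """Sufficient (not necessary) PSD test via scaled Gershgorin discs.

    For symmetric H and diagonal S = diag(s) with s > 0, B = S H S^{-1}
    has the same eigenvalues as H.  Each eigenvalue lies in a disc
    centered at B_ii with radius sum_{j != i} |B_ij|; if every disc
    left-end (center minus radius) is >= 0, then H is PSD.
    """
    H = np.asarray(H, dtype=float)
    n = H.shape[0]
    s = np.ones(n) if s is None else np.asarray(s, dtype=float)
    B = np.diag(s) @ H @ np.diag(1.0 / s)      # similarity transform
    centers = np.diag(B)
    radii = np.abs(B).sum(axis=1) - np.abs(centers)
    return bool(np.all(centers - radii >= 0))

# A graph Laplacian is diagonally dominant, so the unscaled test passes:
L = np.array([[ 2.0, -1.0, -1.0],
              [-1.0,  2.0, -1.0],
              [-1.0, -1.0,  2.0]])
print(gershgorin_psd_sufficient(L))                  # True

# A PSD matrix can fail the unscaled test yet pass with suitable scalars,
# which is the slack that GDPA-style scaling exploits:
H = np.array([[ 1.0, -2.0],
              [-2.0,  5.0]])                          # eigenvalues > 0
print(gershgorin_psd_sufficient(H))                  # False (unscaled)
print(gershgorin_psd_sufficient(H, s=[1.0, 2.2]))    # True  (scaled)
```

Because the disc left-ends are linear in the entries of $H$ once the scalars $s_i$ are fixed, imposing them in place of $\bar{H} \succeq 0$ turns each iteration of the dual problem into an LP, which is the source of the reported speedup.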
