In semi-supervised graph-based binary classifier learning, a subset of known labels is used to infer unknown labels, under the assumption that the label signal is smooth with respect to a similarity graph specified by a Laplacian matrix. When labels are restricted to binary values, the problem is NP-hard. While a conventional semi-definite programming relaxation (SDR) can be solved in polynomial time using, for example, the alternating direction method of multipliers (ADMM), the per-iteration cost of projecting a candidate matrix onto the positive semi-definite (PSD) cone remains high. In this paper, leveraging a recent linear algebraic theory called Gershgorin disc perfect alignment (GDPA), we propose a fast projection-free method that instead solves a sequence of linear programs (LP). Specifically, we first recast the SDR into its dual, where a feasible solution is interpreted as the Laplacian matrix of a balanced signed graph minus its last node. To achieve graph balance, we split the last node into two, each retaining the original positive / negative edges, resulting in a new Laplacian matrix. We repose the SDR dual in terms of the new solution variable, then replace the PSD cone constraint with linear constraints derived from GDPA -- sufficient conditions ensuring the solution is PSD -- so that the optimization becomes an LP per iteration. Finally, we extract the predicted labels from the converged solution. Experiments show that our algorithm achieved a speedup over the next fastest scheme while attaining comparable label prediction performance.
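To make the GDPA idea concrete, the sketch below illustrates the classical fact it builds on: eigenvalues are preserved under a similarity transform $\mathbf{S}\mathbf{M}\mathbf{S}^{-1}$ with diagonal $\mathbf{S}$, so if every Gershgorin disc left-end of the transformed matrix is nonnegative, then $\mathbf{M}$ is certifiably PSD via linear inequalities only, with no eigen-decomposition or cone projection. This is a minimal illustration of the underlying sufficient condition, not the paper's algorithm; the toy Laplacian and the scaling vector are assumptions for demonstration.

```python
import numpy as np

def gershgorin_lower_bound(M, s):
    """Lower bound on lambda_min(M) from the Gershgorin discs of
    B = S M S^{-1}, where S = diag(s) with s > 0. Since B and M share
    eigenvalues, min_i (B_ii - sum_{j != i} |B_ij|) <= lambda_min(M);
    a nonnegative bound certifies that M is PSD."""
    s = np.asarray(s, dtype=float)
    B = np.diag(s) @ M @ np.diag(1.0 / s)
    radii = np.sum(np.abs(B), axis=1) - np.abs(np.diag(B))  # off-diagonal row sums
    return float(np.min(np.diag(B) - radii))  # smallest disc left-end

# Toy example: combinatorial Laplacian of a 3-node cycle (assumed data).
L = np.array([[ 2., -1., -1.],
              [-1.,  2., -1.],
              [-1., -1.,  2.]])

bound = gershgorin_lower_bound(L, np.ones(3))   # trivial scaling s = 1
lam_min = float(np.linalg.eigvalsh(L).min())
assert bound <= lam_min + 1e-9  # the bound never exceeds the true lambda_min
```

Because Laplacians are diagonally dominant, the bound is already tight at zero here; for general matrices the bound is loose, and GDPA's contribution is choosing the scalars $s_i$ so the disc left-ends align exactly at $\lambda_{\min}$, turning the PSD constraint into exact-in-the-limit linear constraints inside an LP.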