Scalable Semi-Supervised SVM via Triply Stochastic Gradients

26 July 2019
Xiang Geng
Bin Gu
Xiang Li
Wanli Shi
Guansheng Zheng
Heng Huang
arXiv:1907.11584
Abstract

Semi-supervised learning (SSL) plays an increasingly important role in the big data era because a large number of unlabeled samples can be used effectively to improve classifier performance. The semi-supervised support vector machine (S³VM) is one of the most appealing methods for SSL, but scaling up S³VM for kernel learning remains an open problem. Recently, a doubly stochastic gradient (DSG) algorithm was proposed to achieve efficient and scalable training for kernel methods. However, the DSG algorithm and its theoretical analysis are built on a convexity assumption, which makes them unsuitable for non-convex problems such as S³VM. To address this problem, we propose a triply stochastic gradient algorithm for S³VM, called TSGS³VM. Specifically, to handle the two types of data instances involved in S³VM, TSGS³VM samples a labeled instance and an unlabeled instance, as well as random features, in each iteration to compute a triply stochastic gradient, and uses this approximate gradient to update the solution. More importantly, we establish a new theoretical analysis for TSGS³VM which guarantees that it converges to a stationary point. Extensive experimental results on a variety of datasets demonstrate that TSGS³VM is much more efficient and scalable than existing S³VM algorithms.
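The triply stochastic update described in the abstract can be made concrete with a short sketch. The following Python/NumPy code is a minimal illustration, not the authors' implementation: it assumes an RBF kernel approximated by random Fourier features, a hinge loss on labeled points, a non-convex symmetric-hinge surrogate on unlabeled points, and a 1/(λt) step size; the function names and schedules are hypothetical.

import numpy as np

def tsg_s3vm_sketch(X_lab, y_lab, X_unl, T=2000, lam=1e-3, C=1.0, sigma=1.0, seed=0):
    # Illustrative sketch only: the hinge / symmetric-hinge losses, the RBF
    # random Fourier features, and the 1/(lam*t) step size are assumptions,
    # not necessarily the paper's exact choices.
    rng = np.random.default_rng(seed)
    d = X_lab.shape[1]
    omegas = np.empty((T, d))  # one random feature direction per iteration
    phases = np.empty(T)
    alphas = np.zeros(T)       # functional-gradient coefficients

    def feat(x, s):
        # random Fourier feature for the RBF kernel: sqrt(2) * cos(w'x + b)
        return np.sqrt(2.0) * np.cos(omegas[s] @ x + phases[s])

    def f(x, t):
        # evaluate the current iterate using the t features sampled so far
        return sum(alphas[s] * feat(x, s) for s in range(t))

    for t in range(T):
        # the "triple" sampling: one labeled point, one unlabeled point,
        # and one random feature per iteration
        i = rng.integers(len(X_lab))
        j = rng.integers(len(X_unl))
        omegas[t] = rng.normal(scale=1.0 / sigma, size=d)  # RBF spectral density
        phases[t] = rng.uniform(0.0, 2.0 * np.pi)

        xi, yi, xu = X_lab[i], y_lab[i], X_unl[j]
        fl, fu = f(xi, t), f(xu, t)

        g_lab = -yi if yi * fl < 1.0 else 0.0            # hinge-loss derivative
        g_unl = -np.sign(fu) if abs(fu) < 1.0 else 0.0   # symmetric-hinge surrogate on unlabeled data

        gamma = 1.0 / (lam * (t + 1))     # assumed step-size schedule
        alphas[:t] *= 1.0 - gamma * lam   # shrinkage from the ||f||^2 regularizer
        alphas[t] = -gamma * (g_lab * feat(xi, t) + C * g_unl * feat(xu, t))

    return omegas, phases, alphas

No kernel matrix is ever formed: each iteration only appends one coefficient and shrinks the earlier ones. In the DSG line of work the random directions are typically regenerated from per-iteration seeds rather than stored as done here for clarity, keeping memory at O(T) coefficients.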
