  3. 1806.00458
71
24
v1v2v3v4v5 (latest)

Improved Sample Complexity for Stochastic Compositional Variance Reduced Gradient

1 June 2018
Tianyi Lin
Chenyou Fan
Mengdi Wang
Michael I. Jordan
Abstract

Convex composition optimization is an emerging topic that covers a wide range of applications arising from stochastic optimal control, reinforcement learning, and multi-stage stochastic programming. Existing algorithms suffer from unsatisfactory sample complexity and practical issues because they ignore the convexity structure in the algorithmic design. In this paper, we develop a new stochastic compositional variance-reduced gradient algorithm with sample complexity $O((m+n)\log(1/\epsilon) + 1/\epsilon^3)$, where $m+n$ is the total number of samples. Our algorithm is near-optimal, as the dependence on $m+n$ is optimal up to a logarithmic factor. Experimental results on real-world datasets demonstrate the effectiveness and efficiency of the new algorithm.
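To make the compositional structure concrete, the sketch below runs an SVRG-style variance-reduced gradient loop on a toy instance of $F(x) = f(g(x))$, where the inner map $g$ is an average of $n$ linear functions and the outer function $f$ is quadratic. This is a minimal illustration of the variance-reduction idea under assumed linear inner maps; the problem data, step size, and epoch counts are invented for the toy and it is not the paper's exact algorithm.

```python
import numpy as np

# Toy convex composition problem: minimize F(x) = f(g(x)), where
#   g(x) = (1/n) * sum_i g_i(x)  with linear g_i(x) = A_i x + b_i,
#   f(y) = 0.5 * ||y||^2.
# The inner loop applies SVRG-style corrections to both the inner
# value g(x) and the gradient estimate (a sketch, not the paper's
# exact method).

rng = np.random.default_rng(0)
d, n = 5, 50
A = np.stack([np.eye(d) + 0.1 * rng.standard_normal((d, d)) for _ in range(n)])
b = rng.standard_normal((n, d))

def grad_f(y):
    # Gradient of the outer function f(y) = 0.5 * ||y||^2.
    return y

def full_F(x):
    # Exact objective, used only for monitoring.
    y = (A @ x + b).mean(axis=0)
    return 0.5 * y @ y

x = np.zeros(d)
eta = 0.1  # step size, tuned for this toy problem only
for epoch in range(30):
    # Snapshot phase: full inner value, Jacobian, and gradient at x_tilde.
    x_tilde = x.copy()
    g_full = (A @ x_tilde + b).mean(axis=0)
    J_full = A.mean(axis=0)
    mu = J_full.T @ grad_f(g_full)  # exact gradient at the snapshot
    for _ in range(n):
        i = rng.integers(n)
        # Variance-reduced estimate of the inner value g(x).
        g_hat = (A[i] @ x + b[i]) - (A[i] @ x_tilde + b[i]) + g_full
        # Variance-reduced gradient estimate (SVRG-style correction).
        v = A[i].T @ grad_f(g_hat) - A[i].T @ grad_f(g_full) + mu
        x -= eta * v

print(full_F(x))  # close to the minimum of F on this toy problem
```

The snapshot makes each inner-loop estimate of $g(x)$ and of the gradient exact at $x_\text{tilde}$, so the estimator variance shrinks as the iterate approaches the snapshot; this is the mechanism that variance-reduced compositional methods exploit to improve sample complexity.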
