Dropping Convexity for More Efficient and Scalable Online Multiview Learning

27 February 2017
Zhehui Chen, Lin F. Yang, C. J. Li, T. Zhao

Papers citing "Dropping Convexity for More Efficient and Scalable Online Multiview Learning"

3 / 3 papers shown

Optimal Rates of Convergence for Noisy Sparse Phase Retrieval via Thresholded Wirtinger Flow
T. Tony Cai, Xiaodong Li, Zongming Ma (10 Jun 2015)

Escaping From Saddle Points --- Online Stochastic Gradient for Tensor Decomposition
Rong Ge, Furong Huang, Chi Jin, Yang Yuan (06 Mar 2015)

Phase Retrieval via Wirtinger Flow: Theory and Algorithms
Emmanuel Candes, Xiaodong Li, Mahdi Soltanolkotabi (03 Jul 2014)