Revisiting Frank-Wolfe for Structured Nonconvex Optimization
We introduce a new projection-free (Frank-Wolfe) method for optimizing structured nonconvex functions that are expressed as a difference of two convex functions. This problem class subsumes smooth nonconvex minimization, positioning our method as a promising alternative to the classical Frank-Wolfe algorithm. Such difference-of-convex (DC) decompositions are not unique; by carefully selecting a decomposition, we can better exploit the problem structure, improve computational efficiency, and adapt to the underlying problem geometry to find better local solutions. We prove that the proposed method achieves a first-order stationary point in $\mathcal{O}(\epsilon^{-2})$ iterations, matching the complexity of the standard Frank-Wolfe algorithm for smooth nonconvex minimization in general. Specific decompositions can, for instance, yield a gradient-efficient variant that requires only $\mathcal{O}(\epsilon^{-1})$ calls to the gradient oracle. Finally, we present numerical experiments demonstrating the effectiveness of the proposed method compared to the standard Frank-Wolfe algorithm.
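For context, the sketch below illustrates the classical projection-free Frank-Wolfe (conditional gradient) baseline that the abstract compares against, not the DC-decomposition method proposed in the paper. The feasible set (an $\ell_1$-ball), the least-squares objective, the step-size rule, and all function names are illustrative assumptions.

```python
# Minimal sketch of the classical Frank-Wolfe baseline for a smooth objective
# over a convex set. This is NOT the paper's proposed DC-based method; the
# l1-ball feasible set, objective, and step size below are assumptions.
import numpy as np

def lmo_l1_ball(grad, radius=1.0):
    """Linear minimization oracle: argmin_{||s||_1 <= radius} <grad, s>."""
    s = np.zeros_like(grad)
    i = np.argmax(np.abs(grad))
    s[i] = -radius * np.sign(grad[i])
    return s

def frank_wolfe(grad_f, x0, num_iters=100, radius=1.0):
    """Projection-free updates: x_{k+1} = x_k + gamma_k * (s_k - x_k)."""
    x = x0.copy()
    gap = np.inf
    for k in range(num_iters):
        g = grad_f(x)
        s = lmo_l1_ball(g, radius)
        gap = g @ (x - s)          # Frank-Wolfe gap, a stationarity measure
        gamma = 2.0 / (k + 2)      # standard open-loop step size (illustrative;
                                   # nonconvex analyses often use other rules)
        x = x + gamma * (s - x)
    return x, gap

# Usage on a simple smooth objective f(x) = 0.5 * ||Ax - b||^2.
rng = np.random.default_rng(0)
A, b = rng.standard_normal((20, 10)), rng.standard_normal(20)
x_final, fw_gap = frank_wolfe(lambda x: A.T @ (A @ x - b), np.zeros(10))
print(f"final Frank-Wolfe gap: {fw_gap:.3e}")
```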
@article{maskan2025_2503.08921,
  title={Revisiting Frank-Wolfe for Structured Nonconvex Optimization},
  author={Hoomaan Maskan and Yikun Hou and Suvrit Sra and Alp Yurtsever},
  journal={arXiv preprint arXiv:2503.08921},
  year={2025}
}