
Global Convergence of Adaptive Sensing for Principal Eigenvector Estimation

Abstract

This paper addresses the challenge of efficient principal component analysis (PCA) in high-dimensional spaces by analyzing a compressively sampled variant of Oja's algorithm with adaptive sensing. Traditional PCA methods incur substantial computational costs that scale poorly with data dimensionality, whereas subspace tracking algorithms like Oja's offer more efficient alternatives but typically require full-dimensional observations. We analyze a variant where, at each iteration, only two compressed measurements are taken: one in the direction of the current estimate and one in a random orthogonal direction. We prove that this adaptive sensing approach achieves global convergence in the presence of noise when tracking the leading eigenvector of a datastream with eigengap $\Delta = \lambda_1 - \lambda_2$. Our theoretical analysis demonstrates that the algorithm experiences two phases: (1) a warmup phase requiring $O(\lambda_1 \lambda_2 d^2 / \Delta^2)$ iterations to achieve constant-level alignment with the true eigenvector, followed by (2) a local convergence phase where the sine alignment error decays at a rate of $O(\lambda_1 \lambda_2 d^2 / (\Delta^2 t))$ at iteration $t$. The guarantee matches existing minimax lower bounds up to an added factor of $d$ due to the compressive sampling. This work provides the first convergence guarantees for adaptive sensing in subspace tracking with noise. Our proof technique is also considerably simpler than those in prior works. The results have important implications for applications where acquiring full-dimensional samples is challenging or costly.
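The abstract's algorithm can be illustrated with a short sketch. The update rule below is an assumption inferred from the description, not the paper's exact method: each iteration measures the sample only along the current estimate $v$ and along a random direction $u$ orthogonal to $v$, reconstructs the projection of the sample onto $\mathrm{span}\{v, u\}$, and applies an Oja-style rank-one update.

```python
import numpy as np

def adaptive_oja(samples, eta=0.02, rng=None):
    """Sketch of a compressively sampled Oja iteration with adaptive
    sensing (assumed form; the paper's exact update may differ).
    Per sample x, only two scalar measurements are taken: x.v and x.u,
    where u is a fresh random direction orthogonal to the estimate v."""
    rng = np.random.default_rng() if rng is None else rng
    d = samples.shape[1]
    v = rng.standard_normal(d)
    v /= np.linalg.norm(v)
    for x in samples:
        # draw a random direction orthogonal to the current estimate
        u = rng.standard_normal(d)
        u -= (u @ v) * v
        u /= np.linalg.norm(u)
        a = x @ v  # compressed measurement along the estimate
        b = x @ u  # compressed measurement along the orthogonal probe
        # projection of x onto span{v, u}, used in place of the full sample
        x_hat = a * v + b * u
        # Oja-style update with the reconstructed sample, then renormalize
        v = v + eta * a * x_hat
        v /= np.linalg.norm(v)
    return v
```

With a clear eigengap, the estimate aligns with the leading eigenvector over many iterations, though (consistent with the abstract's rates) convergence is slower than full-dimensional Oja by a factor on the order of $d$, since each step observes only a two-dimensional slice of the sample.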

@article{saad-falcon2025_2505.10882,
  title={Global Convergence of Adaptive Sensing for Principal Eigenvector Estimation},
  author={Alex Saad-Falcon and Brighton Ancelin and Justin Romberg},
  journal={arXiv preprint arXiv:2505.10882},
  year={2025}
}