High-Dimensional PCA Revisited: Insights from General Spiked Models and Data Normalization Effects

25 August 2024
Yanqing Yin
Wang Zhou
Abstract

Principal Component Analysis (PCA) is a critical tool for dimensionality reduction and data analysis. This paper revisits PCA through the lens of generalized spiked covariance and correlation models, which allow for more realistic and complex data structures. We explore the asymptotic properties of the sample principal components (PCs) derived from both the sample covariance and correlation matrices, focusing on how data normalization, an essential step for scale-invariant analysis, affects these properties. Our results reveal that while normalization does not alter the first-order limits of spiked eigenvalues and eigenvectors, it significantly influences their second-order behavior. We establish new theoretical findings, including a joint central limit theorem for bilinear forms of the sample covariance matrix's resolvent and diagonal entries, providing a robust framework for understanding spiked models in high dimensions. Our theoretical results also reveal an intriguing phenomenon regarding the effect of data normalization when the variances of covariates are equal. Specifically, they suggest that high-dimensional PCA based on the correlation matrix may not only perform comparably to, but potentially even outperform, PCA based on the covariance matrix, particularly when the leading principal component is sufficiently large. This study not only extends the existing literature on spiked models but also offers practical guidance for applying PCA in real-world scenarios, particularly when dealing with normalized data.
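A minimal simulation sketch, not taken from the paper, illustrating the comparison the abstract describes: PCA on the sample covariance matrix versus PCA on the sample correlation matrix under a single-spike model with equal population variances. The dimensions, spike strength, and alignment metric below are illustrative assumptions, not the paper's experimental setup.

```python
# Sketch: compare covariance-based and correlation-based PCA in a spiked model
# with equal population variances. All numeric choices are illustrative.
import numpy as np

rng = np.random.default_rng(0)
p, n = 200, 400          # dimension and sample size (assumed, p/n = 0.5)
spike = 10.0             # assumed leading spiked eigenvalue

# Population covariance: identity plus a rank-one spike along the equal-weight
# direction, so all diagonal entries (variances) are equal.
v = np.ones(p) / np.sqrt(p)
Sigma = np.eye(p) + (spike - 1.0) * np.outer(v, v)

# Draw n i.i.d. samples from N(0, Sigma).
X = rng.multivariate_normal(np.zeros(p), Sigma, size=n)

def leading_alignment(M):
    """Return |<u1, v>|, the alignment of the top eigenvector of M with the spike."""
    _, eigvecs = np.linalg.eigh(M)   # eigenvalues in ascending order
    u1 = eigvecs[:, -1]              # leading sample eigenvector
    return abs(u1 @ v)

S = np.cov(X, rowvar=False)          # sample covariance matrix
R = np.corrcoef(X, rowvar=False)     # sample correlation matrix (normalized data)

print("covariance-based PCA alignment :", leading_alignment(S))
print("correlation-based PCA alignment:", leading_alignment(R))
```

Repeating the comparison over many Monte Carlo draws (and varying the spike strength) is one simple way to probe the second-order effect of normalization that the paper analyzes theoretically.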
