On the convergence of the extremal eigenvalues of empirical covariance matrices with dependence.

Authors
Publication date
2017
Publication type
Journal Article
Summary
Consider a sample of a centered random vector with unit covariance matrix. We show that under certain regularity assumptions, and up to a natural scaling, the smallest and the largest eigenvalues of the empirical covariance matrix converge, when the dimension and the sample size both tend to infinity, to the left and right edges of the Marchenko–Pastur distribution. The assumptions are related to the tails of the norms of orthogonal projections. They cover isotropic log-concave random vectors as well as random vectors with i.i.d. coordinates under almost optimal moment conditions. The method is a refinement of the rank-one update approach used by Srivastava and Vershynin to produce non-asymptotic quantitative estimates. In other words, we provide a new proof of the Bai–Yin theorem using basic tools from probability theory and linear algebra, together with a new extension of this theorem to random matrices with dependent entries.
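A minimal numerical sketch of the convergence described in the summary, under the assumption of i.i.d. standard Gaussian coordinates (a special case of the i.i.d. setting mentioned above); the dimensions n and p below are arbitrary illustrative choices, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes (assumed, not from the paper): n samples of a
# p-dimensional centered vector with unit covariance matrix, here with
# i.i.d. standard Gaussian coordinates.
n, p = 4000, 1000
y = p / n  # aspect ratio p/n

X = rng.standard_normal((n, p))
S = X.T @ X / n  # empirical covariance matrix (p x p), natural 1/n scaling

eigs = np.linalg.eigvalsh(S)  # eigenvalues in ascending order
lam_min, lam_max = eigs[0], eigs[-1]

# Left and right edges of the Marchenko-Pastur distribution with ratio y <= 1.
mp_left = (1 - np.sqrt(y)) ** 2
mp_right = (1 + np.sqrt(y)) ** 2

print(f"smallest eigenvalue: {lam_min:.4f}   MP left edge:  {mp_left:.4f}")
print(f"largest  eigenvalue: {lam_max:.4f}   MP right edge: {mp_right:.4f}")
```

As n and p grow with p/n close to y, the printed extremal eigenvalues should approach the Marchenko–Pastur edges (1 ∓ √y)², which is the Bai–Yin-type statement the paper extends to dependent entries.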
Publisher
Springer Science and Business Media LLC