Relative concentration bounds for the spectrum of kernel matrices

5 December 2018
Ernesto Araya Valdivia
Abstract

In this paper we study the concentration properties of the eigenvalues of kernel matrices, which are central objects in a wide range of kernel methods and, more recently, in network analysis. We present a set of concentration inequalities tailored to each individual eigenvalue of the kernel matrix with respect to its known asymptotic limit. The inequalities presented here are of relative type, meaning that they scale with the eigenvalue under consideration, which results in convergence rates that vary across the spectrum. The rates we obtain are faster than the typical $\mathcal{O}(1/\sqrt{n})$ and are often exponential, depending on regularity assumptions of Sobolev type. One key feature of our results is that they apply to non-positive kernels, which is fundamental in the context of network analysis. We show how our results are well suited to the study of dot product kernels, which are related to random geometric graphs on the sphere, via the graphon formalism. We illustrate our results by applying them to a variety of dot product kernels on the sphere and to the one-dimensional Gaussian kernel.
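A minimal numerical sketch of the phenomenon the abstract describes, under assumptions that are not taken from the paper: the kernel matrix is normalized as (1/n) k(X_i, X_j), the kernel is a one-dimensional Gaussian kernel with an arbitrary bandwidth of 0.5, the sampling distribution is Uniform[0, 1], and the "asymptotic limit" of each eigenvalue is approximated by a fine quadrature (Nyström) discretization of the associated integral operator rather than by a closed-form spectrum. The sketch only illustrates that individual empirical eigenvalues concentrate around their limits, with relative error depending on the position in the spectrum; it is not the authors' construction or bound.

```python
import numpy as np

rng = np.random.default_rng(0)

def gaussian_kernel(x, y, bandwidth=0.5):
    """k(x, y) = exp(-(x - y)^2 / (2 * bandwidth^2)) for 1-D inputs."""
    return np.exp(-np.subtract.outer(x, y) ** 2 / (2 * bandwidth ** 2))

def top_eigenvalues(points, weights, k=6):
    """Top-k eigenvalues of the weighted kernel matrix W^{1/2} K W^{1/2},
    which for uniform weights 1/n coincides with (1/n) K."""
    K = gaussian_kernel(points, points)
    w = np.sqrt(weights)
    M = w[:, None] * K * w[None, :]
    return np.sort(np.linalg.eigvalsh(M))[::-1][:k]

# "Population" spectrum: dense grid approximation of the integral operator
# (T_k f)(x) = \int_0^1 k(x, y) f(y) dy, with quadrature weights 1/m.
m = 4000
grid = (np.arange(m) + 0.5) / m
lam_pop = top_eigenvalues(grid, np.full(m, 1.0 / m))

# Empirical spectra from i.i.d. samples of increasing size.
for n in (200, 2000):
    X = rng.uniform(0.0, 1.0, size=n)
    lam_emp = top_eigenvalues(X, np.full(n, 1.0 / n))
    rel_err = np.abs(lam_emp - lam_pop) / lam_pop
    print(f"n={n:5d}  relative error per eigenvalue: "
          + " ".join(f"{e:.3f}" for e in rel_err))
```

Running the sketch shows the leading eigenvalues stabilizing quickly as n grows, while the relative error is larger for the smaller eigenvalues, which is the kind of eigenvalue-by-eigenvalue behavior that relative (rather than uniform) bounds are designed to capture.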
