Supervised Models Can Generalize Also When Trained on Random Label

16 May 2025
Oskar Allerbo
Thomas B. Schön
Topics: OOD, SSL
Abstract

The success of unsupervised learning raises the question of whether supervised models can also be trained without using the information in the output $y$. In this paper, we demonstrate that this is indeed possible. The key step is to formulate the model as a smoother, i.e. of the form $\hat{f} = Sy$, and to construct the smoother matrix $S$ independently of $y$, e.g. by training on random labels. We present a simple model selection criterion based on the distribution of the out-of-sample predictions and show that, in contrast to cross-validation, this criterion can be used even without access to $y$. We demonstrate on real and synthetic data that $y$-free trained versions of linear and kernel ridge regression, smoothing splines, and neural networks perform similarly to their standard, $y$-based versions and, most importantly, significantly better than random guessing.
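The smoother formulation $\hat{f} = Sy$ can be illustrated with a minimal sketch. For kernel ridge regression, the smoother matrix $S = K(K + \lambda I)^{-1}$ depends only on the inputs (and hyperparameters), not on $y$: the labels enter only at prediction time, via the final multiplication $Sy$. This is a toy example, not the paper's code; the sine-curve data, Gaussian kernel, bandwidth, and regularization strength $\lambda$ are arbitrary choices for illustration, and the paper additionally selects hyperparameters $y$-free via its out-of-sample prediction criterion, which is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data: y = sin(x) + noise (illustrative only).
n = 100
x = np.sort(rng.uniform(-3, 3, n))
y = np.sin(x) + 0.1 * rng.normal(size=n)

def kernel(a, b, bandwidth=0.5):
    # Gaussian (RBF) kernel matrix between two sets of 1-D inputs.
    return np.exp(-(a[:, None] - b[None, :]) ** 2 / (2 * bandwidth**2))

# Kernel ridge regression smoother: S = K (K + lam*I)^{-1}.
# K and (K + lam*I)^{-1} share eigenvectors, so they commute and
# S can be computed as (K + lam*I)^{-1} K via a linear solve.
# Note that S is built from the inputs x alone -- no labels needed.
lam = 1e-2
K = kernel(x, x)
S = np.linalg.solve(K + lam * np.eye(n), K)

# The labels y are used only here, in the final smoothing step.
f_hat = S @ y

# In-sample fit against the noise-free target, for a sanity check.
mse = np.mean((f_hat - np.sin(x)) ** 2)
```

The point of the factorization is that everything expensive (building and inverting the kernel matrix) happens without $y$; in the paper's setting, even the hyperparameters of $S$ are chosen $y$-free.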

@article{allerbo2025_2505.11006,
  title={Supervised Models Can Generalize Also When Trained on Random Label},
  author={Oskar Allerbo and Thomas B. Schön},
  journal={arXiv preprint arXiv:2505.11006},
  year={2025}
}