Leveraging Random Label Memorization for Unsupervised Pre-Training

arXiv: 1811.01640, 5 November 2018

Vinaychandran Pondenkandath, Michele Alberti, Sammer Puran, Rolf Ingold, Marcus Liwicki

Tags: NoLa

Papers citing "Leveraging Random Label Memorization for Unsupervised Pre-Training"

6 / 6 papers shown

  • Supervised Models Can Generalize Also When Trained on Random Labels (16 May 2025). Oskar Allerbo, Thomas B. Schön. Tags: OOD, SSL.
  • To Each (Textual Sequence) Its Own: Improving Memorized-Data Unlearning in Large Language Models (06 May 2024). George-Octavian Barbulescu, Peter Triantafillou. Tags: MU.
  • Unsupervised Learning of Initialization in Deep Neural Networks via Maximum Mean Discrepancy (08 Feb 2023). Cheolhyoung Lee, Kyunghyun Cho.
  • The Curious Case of Benign Memorization (25 Oct 2022). Sotiris Anagnostidis, Gregor Bachmann, Lorenzo Noci, Thomas Hofmann. Tags: AAML.
  • Memorization Without Overfitting: Analyzing the Training Dynamics of Large Language Models (22 May 2022). Kushal Tirumala, Aram H. Markosyan, Luke Zettlemoyer, Armen Aghajanyan. Tags: TDI.
  • What Do Neural Networks Learn When Trained With Random Labels? (18 Jun 2020). Hartmut Maennel, Ibrahim M. Alabdulmohsin, Ilya O. Tolstikhin, R. Baldock, Olivier Bousquet, Sylvain Gelly, Daniel Keysers. Tags: FedML.