arXiv:1811.01640
Leveraging Random Label Memorization for Unsupervised Pre-Training
5 November 2018
Vinaychandran Pondenkandath, Michele Alberti, Sammer Puran, Rolf Ingold, Marcus Liwicki
Papers citing "Leveraging Random Label Memorization for Unsupervised Pre-Training" (6 papers):
Supervised Models Can Generalize Also When Trained on Random Labels
Oskar Allerbo, Thomas B. Schön · OOD, SSL · 16 May 2025

To Each (Textual Sequence) Its Own: Improving Memorized-Data Unlearning in Large Language Models
George-Octavian Barbulescu, Peter Triantafillou · MU · 6 May 2024

Unsupervised Learning of Initialization in Deep Neural Networks via Maximum Mean Discrepancy
Cheolhyoung Lee, Kyunghyun Cho · 8 Feb 2023

The Curious Case of Benign Memorization
Sotiris Anagnostidis, Gregor Bachmann, Lorenzo Noci, Thomas Hofmann · AAML · 25 Oct 2022

Memorization Without Overfitting: Analyzing the Training Dynamics of Large Language Models
Kushal Tirumala, Aram H. Markosyan, Luke Zettlemoyer, Armen Aghajanyan · TDI · 22 May 2022

What Do Neural Networks Learn When Trained With Random Labels?
Hartmut Maennel, Ibrahim M. Alabdulmohsin, Ilya O. Tolstikhin, R. Baldock, Olivier Bousquet, Sylvain Gelly, Daniel Keysers · FedML · 18 Jun 2020