When Models Don't Collapse: On the Consistency of Iterative MLE

25 May 2025
Daniel Barzilai, Ohad Shamir
SyDa

Papers citing "When Models Don't Collapse: On the Consistency of Iterative MLE" (4 papers)

A Theoretical Perspective: How to Prevent Model Collapse in Self-consuming Training Loops
Shi Fu, Yingjie Wang, Yuzhu Chen, Xinmei Tian, Dacheng Tao
26 Feb 2025

Is Model Collapse Inevitable? Breaking the Curse of Recursion by Accumulating Real and Synthetic Data
Matthias Gerstgrasser, Rylan Schaeffer, Apratim Dey, Rafael Rafailov, Henry Sleight, ..., Andrey Gromov, Daniel A. Roberts, Diyi Yang, D. Donoho, Oluwasanmi Koyejo
01 Apr 2024

Towards Theoretical Understandings of Self-Consuming Generative Models
Shi Fu, Sen Zhang, Yingjie Wang, Xinmei Tian, Dacheng Tao
19 Feb 2024

Nepotistically Trained Generative-AI Models Collapse
Matyáš Boháček, Hany Farid
20 Nov 2023