Simple ReFlow: Improved Techniques for Fast Flow Models

10 October 2024
Beomsu Kim
Yu-Guan Hsieh
Michal Klein
Marco Cuturi
Jong Chul Ye
Bahjat Kawar
James Thornton
arXiv: 2410.07815
Abstract

Diffusion and flow-matching models achieve remarkable generative performance, but at the cost of many sampling steps; this slows inference and limits applicability to time-critical tasks. The ReFlow procedure can accelerate sampling by straightening generation trajectories. However, ReFlow is an iterative procedure that typically requires training on simulated data, and it results in reduced sample quality. To mitigate this sample deterioration, we examine the design space of ReFlow and highlight potential pitfalls in prior heuristic practices. We then propose seven improvements for training dynamics, learning, and inference, which are verified with thorough ablation studies on CIFAR10 32×32, AFHQv2 64×64, and FFHQ 64×64. Combining all our techniques, we achieve state-of-the-art FID scores (without / with guidance, resp.) for fast generation via neural ODEs: 2.23 / 1.98 on CIFAR10, 2.30 / 1.91 on AFHQv2, 2.84 / 2.67 on FFHQ, and 3.49 / 1.74 on ImageNet-64, all with merely 9 neural function evaluations.
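For context, the core of a ReFlow (rectified flow) iteration as described above is: simulate the current model's ODE to obtain coupled (noise, sample) pairs, then retrain the velocity field on straight-line interpolants between those pairs. The sketch below illustrates one such iteration in PyTorch. It is a minimal baseline for orientation only, not the paper's method: the velocity network `model(x, t)`, the time convention (t = 1 noise, t = 0 data), and the plain Euler integrator are all illustrative assumptions, and the paper's seven proposed improvements are not shown.

```python
import torch

@torch.no_grad()
def simulate_pairs(model, x1, n_steps=100):
    # Integrate the learned ODE dx/dt = v(x, t) from noise x1 (t = 1)
    # down to a sample x0 (t = 0) with Euler steps, yielding the coupled
    # (noise, sample) pairs that a ReFlow iteration retrains on.
    x = x1
    dt = 1.0 / n_steps
    for i in range(n_steps, 0, -1):
        t = torch.full((x.shape[0],), i / n_steps, device=x.device)
        x = x - dt * model(x, t)  # Euler step toward t = 0
    return x1, x

def reflow_loss(model, x1, x0):
    # Flow-matching loss on the simulated coupling: the straight-line
    # interpolant x_t = (1 - t) * x0 + t * x1 has constant velocity
    # x1 - x0, which the model learns to predict at a random time t.
    t = torch.rand(x0.shape[0], device=x0.device)
    t_ = t.view(-1, *([1] * (x0.dim() - 1)))  # broadcast t over image dims
    xt = (1 - t_) * x0 + t_ * x1
    target = x1 - x0
    return ((model(xt, t) - target) ** 2).mean()
```

Iterating this simulate-then-retrain loop progressively straightens the generation trajectories, which is what allows a handful of solver steps (the paper reports 9 neural function evaluations) to suffice at inference time.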
