Distillation of Diffusion Features for Semantic Correspondence

4 December 2024 · arXiv: 2412.03512
Frank Fundel, Johannes Schusterbauer, Vincent Tao Hu, Bjorn Ommer
Tags: DiffM
Links: ArXiv (abs) · PDF · HTML

Papers citing "Distillation of Diffusion Features for Semantic Correspondence"

5 of 5 citing papers shown

Do It Yourself: Learning Semantic Correspondence from Pseudo-Labels
Olaf Dünkel, Thomas Wimmer, Christian Theobalt, Christian Rupprecht, Adam Kortylewski
3DPC · 05 Jun 2025

Diff2Flow: Training Flow Matching Models via Diffusion Model Alignment
Johannes Schusterbauer, Ming Gui, Frank Fundel, Bjorn Ommer
02 Jun 2025

Semantic Correspondence: Unified Benchmarking and a Strong Baseline
Kaiyan Zhang, Xinghui Li, Jingyi Lu, Kai Han
3DV · 23 May 2025

Latent Diffusion U-Net Representations Contain Positional Embeddings and Anomalies
Jonas Loos, Lorenz Linhardt
09 Apr 2025

[MASK] is All You Need
Vincent Tao Hu, Bjorn Ommer
DiffM · 09 Dec 2024