DDEQs: Distributional Deep Equilibrium Models through Wasserstein Gradient Flows

3 March 2025
Jonathan Geuter
Clément Bonet
Anna Korba
David Alvarez-Melis
Abstract

Deep Equilibrium Models (DEQs) are a class of implicit neural networks that solve for a fixed point of a neural network in their forward pass. Traditionally, DEQs take sequences as inputs, but have since been applied to a variety of data. In this work, we present Distributional Deep Equilibrium Models (DDEQs), extending DEQs to discrete measure inputs, such as sets or point clouds. We provide a theoretically grounded framework for DDEQs. Leveraging Wasserstein gradient flows, we show how the forward pass of the DEQ can be adapted to find fixed points of discrete measures under permutation-invariance, and derive adequate network architectures for DDEQs. In experiments, we show that they can compete with state-of-the-art models in tasks such as point cloud classification and point cloud completion, while being significantly more parameter-efficient.
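The core mechanism the abstract refers to — solving for a fixed point of a layer in the forward pass — can be illustrated with a minimal, generic DEQ sketch. This is not the authors' DDEQ method (which operates on discrete measures via Wasserstein gradient flows); it is a plain fixed-point iteration on vectors, with a hypothetical contractive layer `f` standing in for a trained network:

```python
import numpy as np

def deq_forward(f, x, z0, tol=1e-6, max_iter=100):
    """Generic DEQ forward pass: iterate z <- f(z, x) until convergence.

    f  : layer function (hypothetical stand-in for a trained network)
    x  : input, injected at every iteration
    z0 : initial hidden state
    """
    z = z0
    for _ in range(max_iter):
        z_next = f(z, x)
        if np.linalg.norm(z_next - z) < tol:
            return z_next
        z = z_next
    return z

# Toy contractive layer: tanh(W z + x) with ||W|| < 1, so the
# iteration converges to a unique fixed point z* = f(z*, x).
W = 0.5 * np.eye(2)
f = lambda z, x: np.tanh(W @ z + x)

x = np.array([0.3, -0.2])
z_star = deq_forward(f, x, np.zeros(2))
```

At the returned `z_star`, the defining equation `z* = f(z*, x)` holds up to the tolerance; a DEQ treats this equilibrium as the layer's output. The paper's contribution is to replace the vector state `z` with a discrete measure (e.g. a point cloud) and the plain iteration with a Wasserstein gradient flow.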

@article{geuter2025_2503.01140,
  title={DDEQs: Distributional Deep Equilibrium Models through Wasserstein Gradient Flows},
  author={Jonathan Geuter and Clément Bonet and Anna Korba and David Alvarez-Melis},
  journal={arXiv preprint arXiv:2503.01140},
  year={2025}
}