Optimal transport map estimation in general function spaces

7 December 2022 (arXiv:2212.03722)
Vincent Divol
Jonathan Niles-Weed
Aram-Alexandre Pooladian
    OT
Abstract

We study the problem of estimating a function T given independent samples from a distribution P and from the pushforward distribution T♯P. This setting is motivated by applications in the sciences, where T represents the evolution of a physical system over time, and in machine learning, where, for example, T may represent a transformation learned by a deep neural network trained for a generative modeling task. To ensure identifiability, we assume that T = ∇φ₀ is the gradient of a convex function, in which case T is known as an optimal transport map. Prior work has studied the estimation of T under the assumption that it lies in a Hölder class, but general theory is lacking. We present a unified methodology for obtaining rates of estimation of optimal transport maps in general function spaces. Our assumptions are significantly weaker than those appearing in the literature: we require only that the source measure P satisfy a Poincaré inequality and that the optimal map be the gradient of a smooth convex function that lies in a space whose metric entropy can be controlled. As a special case, we recover known estimation rates for Hölder transport maps, but also obtain nearly sharp results in many settings not covered by prior work. For example, we provide the first statistical rates of estimation when P is the normal distribution and the transport map is given by an infinite-width shallow neural network.
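
To make the setting concrete, below is a minimal, self-contained numerical sketch of the estimation problem in Python (not the estimator analyzed in the paper): draw samples X from P and, independently, Y from T♯P for a known ground-truth map T = ∇φ₀ with quadratic φ₀, then form a common baseline plug-in estimate of T via an entropic (Sinkhorn) coupling and its barycentric projection. The specific choices here (the quadratic map, the regularization eps, the iteration count) are illustrative assumptions only.

# Sketch of the map-estimation setup: estimate T = grad(phi_0) from independent
# samples X ~ P and Y ~ T_sharp(P). This uses an entropic plug-in baseline
# (Sinkhorn coupling + barycentric projection), not the paper's estimator.
import numpy as np

rng = np.random.default_rng(0)
n, d = 500, 2

# Source P = N(0, I); ground-truth map T(x) = A x = grad(0.5 * x' A x) for a
# symmetric positive-definite A, so T is the gradient of a convex function.
A = np.array([[2.0, 0.5], [0.5, 1.0]])
X = rng.standard_normal((n, d))                      # samples from P
Y = rng.standard_normal((n, d)) @ A.T                # independent samples from T_sharp(P)

# Sinkhorn iterations for the entropy-regularized coupling between the
# empirical measures (eps is an illustrative regularization level).
eps = 0.5
C = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)   # squared-distance cost matrix
K = np.exp(-C / eps)
a = b = np.full(n, 1.0 / n)
u = np.ones(n)
for _ in range(500):
    v = b / (K.T @ u)
    u = a / (K @ v)
P_hat = u[:, None] * K * v[None, :]                  # entropic coupling

# Barycentric projection: T_hat(X_i) = E_{P_hat}[Y | X_i].
T_hat_X = (P_hat @ Y) / P_hat.sum(axis=1, keepdims=True)

err = np.mean(np.sum((T_hat_X - X @ A.T) ** 2, axis=1))
print(f"mean squared error of the plug-in map on the sample: {err:.3f}")

The barycentric projection is only a convenient baseline; the rates discussed in the abstract concern how well such maps can be estimated when φ₀ is restricted to a general function class with controlled metric entropy.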
