Scalable Approximate Algorithms for Optimal Transport Linear Models

6 April 2025
Tomasz Kacprzak, Francois Kamper, Michael W. Heiss, Gianluca Janka, Ann M. Dillner, Satoshi Takahama
Abstract

Recently, linear regression models incorporating an optimal transport (OT) loss have been explored for applications such as supervised unmixing of spectra, music transcription, and mass spectrometry. However, these task-specific approaches often do not generalize readily to a broader class of linear models. In this work, we propose a novel algorithmic framework for solving a general class of non-negative linear regression models with an entropy-regularized OT datafit term, based on Sinkhorn-like scaling iterations. Our framework accommodates convex penalty functions on the weights (e.g. squared-$\ell_2$ and $\ell_1$ norms), and admits additional convex loss terms between the transported marginal and target distribution (e.g. squared error or total variation). We derive simple multiplicative updates for common penalty and datafit terms. This method is suitable for large-scale problems due to its simplicity of implementation and straightforward parallelization.
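The Sinkhorn-like scaling iterations mentioned in the abstract build on standard entropy-regularized OT. As a rough illustration of that building block only, and not of the authors' algorithm or its multiplicative weight updates, a minimal NumPy sketch might look like the following; the function name sinkhorn_ot, the regularization strength eps, the fixed iteration count, and the toy data in the usage example are all hypothetical choices.

import numpy as np

def sinkhorn_ot(a, b, C, eps=0.05, n_iter=200):
    """Textbook Sinkhorn scaling iterations for entropy-regularized OT.

    Approximates a transport plan between histograms `a` (length n) and
    `b` (length m) under cost matrix `C` (n x m) with entropic
    regularization strength `eps`. This is a generic sketch, not the
    paper's method.
    """
    K = np.exp(-C / eps)                 # Gibbs kernel
    u = np.ones_like(a, dtype=float)
    v = np.ones_like(b, dtype=float)
    for _ in range(n_iter):
        u = a / (K @ v)                  # scale rows to match marginal a
        v = b / (K.T @ u)                # scale columns to match marginal b
    P = u[:, None] * K * v[None, :]      # approximate transport plan
    return P, np.sum(P * C)              # plan and (unregularized) OT cost

# Tiny usage example with made-up histograms on a 1-D grid:
n = 4
C = np.abs(np.subtract.outer(np.arange(n), np.arange(n))).astype(float)
a = np.full(n, 1.0 / n)
b = np.array([0.1, 0.2, 0.3, 0.4])
P, cost = sinkhorn_ot(a, b, C)

In the linear-model setting described by the paper, one marginal would come from a non-negative model prediction rather than a fixed histogram; the details of those updates are given in the paper itself.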

@article{kacprzak2025_2504.04609,
  title={Scalable Approximate Algorithms for Optimal Transport Linear Models},
  author={Tomasz Kacprzak and Francois Kamper and Michael W. Heiss and Gianluca Janka and Ann M. Dillner and Satoshi Takahama},
  journal={arXiv preprint arXiv:2504.04609},
  year={2025}
}