
Curvature Enhanced Data Augmentation for Regression

7 June 2025
Ilya Kaufman Sirot
Omri Azencot
ArXiv (abs) | PDF | HTML
Main: 9 pages
3 figures
Bibliography: 4 pages
12 tables
Appendix: 11 pages
Abstract

Deep learning models with a large number of parameters, often referred to as over-parameterized models, have achieved exceptional performance across various tasks. Despite concerns about overfitting, these models frequently generalize well to unseen data, thanks to effective regularization techniques, with data augmentation being among the most widely used. While data augmentation has shown great success in classification tasks using label-preserving transformations, its application in regression problems has received less attention. Recently, a novel manifold learning approach for generating synthetic data was proposed, utilizing a first-order approximation of the data manifold. Building on this foundation, we present a theoretical framework and practical tools for approximating and sampling general data manifolds. Furthermore, we introduce the Curvature-Enhanced Manifold Sampling (CEMS) method for regression tasks. CEMS leverages a second-order representation of the data manifold to enable efficient sampling and reconstruction of new data points. Extensive evaluations across multiple datasets and comparisons with state-of-the-art methods demonstrate that CEMS delivers superior performance in both in-distribution and out-of-distribution scenarios, while introducing only minimal computational overhead. Code is available at this https URL.
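To make the idea of second-order manifold sampling concrete, below is a minimal, illustrative sketch. It is not the authors' CEMS implementation; the function name sample_second_order and the hyperparameters d (intrinsic dimension), k (neighborhood size), and scale (sampling spread) are assumptions for illustration. It estimates a local tangent space with PCA, fits a quadratic (curvature) correction in the normal directions, and draws a new point near an anchor.

import numpy as np

def sample_second_order(X, anchor_idx, d=2, k=20, scale=0.5, rng=None):
    # Sample one synthetic point near X[anchor_idx] using a local
    # second-order (quadratic) approximation of the data manifold.
    # NOTE: illustrative sketch, not the paper's CEMS code.
    rng = np.random.default_rng(rng)
    x0 = X[anchor_idx]

    # 1) Gather the k nearest neighbors of the anchor and center them.
    dists = np.linalg.norm(X - x0, axis=1)
    nbrs = X[np.argsort(dists)[:k]] - x0             # (k, D)

    # 2) First-order part: split R^D into tangent / normal directions via SVD.
    _, _, Vt = np.linalg.svd(nbrs, full_matrices=False)
    T = Vt[:d].T                                      # tangent basis, (D, d)
    N = Vt[d:].T                                      # normal basis

    # 3) Second-order part: regress normal coordinates on quadratic
    #    monomials of the tangent coordinates (local curvature fit).
    u = nbrs @ T                                      # tangent coords, (k, d)
    w = nbrs @ N                                      # normal coords
    iu, ju = np.triu_indices(d)
    Q = u[:, iu] * u[:, ju]                           # quadratic features
    coef, *_ = np.linalg.lstsq(Q, w, rcond=None)

    # 4) Draw a new tangent coordinate near the anchor and lift it back
    #    through the quadratic map: x_new = x0 + T u_new + N q(u_new).
    u_new = scale * rng.standard_normal(d) * u.std(axis=0)
    q_new = (u_new[iu] * u_new[ju]) @ coef
    return x0 + T @ u_new + N @ q_new

# Toy usage: each row concatenates features and the regression target.
X = np.random.default_rng(0).normal(size=(200, 5))
x_new = sample_second_order(X, anchor_idx=0, d=2, k=30)

The quadratic regression in step 3 is what distinguishes a second-order sample from a purely tangent-space (first-order) one: new points are bent back toward the curved manifold instead of lying on its flat local approximation.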
@article{sirot2025_2506.06853,
  title={Curvature Enhanced Data Augmentation for Regression},
  author={Ilya Kaufman Sirot and Omri Azencot},
  journal={arXiv preprint arXiv:2506.06853},
  year={2025}
}