ResearchTrend.AI

arXiv:1906.03822
Making Classical Machine Learning Pipelines Differentiable: A Neural Translation Approach

10 June 2019
Gyeong-In Yu
Saeed Amizadeh
Sehoon Kim
Artidoro Pagnoni
Byung-Gon Chun
Markus Weimer
Matteo Interlandi
Abstract

Classical Machine Learning (ML) pipelines often comprise multiple ML models, where each model within a pipeline is trained in isolation. Conversely, when training neural network models, the layers composing a neural model are trained simultaneously using backpropagation. We argue that the isolated training scheme of ML pipelines is sub-optimal, since it cannot jointly optimize the pipeline's components. To this end, we propose a framework that translates a pre-trained ML pipeline into a neural network and fine-tunes the ML models within the pipeline jointly using backpropagation. Our experiments show that fine-tuning translated pipelines is a promising technique for increasing final accuracy.
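The core idea described in the abstract, translating a fitted classical pipeline into a differentiable network and then fine-tuning every stage jointly with backpropagation, can be sketched in plain NumPy. The pipeline below (a standard scaler feeding a logistic regression) and all names are illustrative assumptions, not the paper's actual framework or API:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: two Gaussian blobs (stand-in for the pipeline's training set).
X = np.vstack([rng.normal(-2.0, 1.5, (50, 2)), rng.normal(2.0, 1.5, (50, 2))])
y = np.concatenate([np.zeros(50), np.ones(50)])

# "Pre-trained" classical pipeline: a standard scaler (closed form) followed
# by logistic regression trained in isolation, with the scaler frozen.
mu, sigma = X.mean(axis=0), X.std(axis=0)
Xs = (X - mu) / sigma
w, b = np.zeros(2), 0.0
for _ in range(20):
    p = 1.0 / (1.0 + np.exp(-(Xs @ w + b)))
    w -= 0.5 * Xs.T @ (p - y) / len(y)
    b -= 0.5 * np.mean(p - y)

# Translation step: rewrite the scaler as an affine layer h = a*x + c so its
# parameters become trainable alongside the logistic-regression weights.
a, c = 1.0 / sigma, -mu / sigma

def forward(X, a, c, w, b):
    h = X * a + c                              # translated scaler layer
    p = 1.0 / (1.0 + np.exp(-(h @ w + b)))     # translated LR layer
    return h, p

def bce(p, y, eps=1e-9):
    return -np.mean(y * np.log(p + eps) + (1 - y) * np.log(1 - p + eps))

_, p0 = forward(X, a, c, w, b)
before = bce(p0, y)

# Joint fine-tuning: backpropagate through BOTH layers at once.
lr = 0.1
for _ in range(200):
    h, p = forward(X, a, c, w, b)
    dz = (p - y) / len(y)            # dL/dlogits for sigmoid + cross-entropy
    dh = np.outer(dz, w)             # gradient flowing back into the scaler
    w = w - lr * (h.T @ dz)
    b = b - lr * dz.sum()
    a = a - lr * (dh * X).sum(axis=0)
    c = c - lr * dh.sum(axis=0)

_, p1 = forward(X, a, c, w, b)
after = bce(p1, y)
print(before, after)                 # joint fine-tuning lowers the loss
```

Here the scaler's statistics, which classical training would leave frozen, receive gradients from the downstream classifier, which is the joint-optimization effect the abstract argues for.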
