Learning production functions for supply chains with graph neural networks

26 July 2024
Serina Chang
Zhiyin Lin
Benjamin Yan
Swapnil Bembde
Qi Xiu
Chi Heem Wong
Yu Qin
Frank Kloster
Alex Luo
Raj Palleti
Jure Leskovec
Topics: GNN, AI4TS
Abstract

The global economy relies on the flow of goods over supply chain networks, with nodes as firms and edges as transactions between firms. While we may observe these external transactions, they are governed by unseen production functions, which determine how firms internally transform the input products they receive into the output products they sell. In this setting, it can be extremely valuable to infer these production functions, both to improve supply chain visibility and to forecast future transactions more accurately. However, existing graph neural networks (GNNs) cannot capture these hidden relationships between nodes' inputs and outputs. Here, we introduce a new class of models for this setting by combining temporal GNNs with a novel inventory module, which learns production functions via attention weights and a special loss function. We evaluate our models extensively on real supply chain data and on data generated from our new open-source simulator, SupplySim. Our models successfully infer production functions, outperforming the strongest baseline by 6%-50% (across datasets), and forecast future transactions, outperforming the strongest baseline by 11%-62%.
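To make the central idea concrete: the abstract describes an inventory module whose attention weights recover hidden production functions, i.e., which input products a firm consumes to make each output product. Below is a minimal PyTorch sketch of that idea, not the authors' implementation; the class name, the parameterization of the attention weights, and the linear readout are all assumptions for illustration.

import torch
import torch.nn as nn

class InventoryModule(nn.Module):
    """Hypothetical sketch: a learnable attention matrix over input
    products approximates a firm's hidden production function."""

    def __init__(self, num_products: int):
        super().__init__()
        # Learnable logits: entry [o, i] scores how strongly input
        # product i contributes to producing output product o.
        self.att_logits = nn.Parameter(torch.zeros(num_products, num_products))

    def production_weights(self) -> torch.Tensor:
        # Each row normalizes to an attention distribution over inputs,
        # so row o can be read as output o's inferred recipe.
        return torch.softmax(self.att_logits, dim=-1)

    def forward(self, inventory: torch.Tensor) -> torch.Tensor:
        # inventory: (batch, num_products) amounts of inputs a firm holds.
        # Predicted capacity to produce each output product is a weighted
        # combination of the current input inventory.
        return inventory @ self.production_weights().T

# Toy usage: 4 products, one firm's inventory snapshot.
module = InventoryModule(num_products=4)
inventory = torch.tensor([[3.0, 0.0, 5.0, 1.0]])
predicted_outputs = module(inventory)  # shape (1, 4)

In the paper's setting, a module like this would sit on top of a temporal GNN's node embeddings and be trained with a loss that ties predicted outputs to observed downstream transactions, so the learned attention rows can then be inspected as inferred production functions.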

@article{chang2025_2407.18772,
  title={Learning production functions for supply chains with graph neural networks},
  author={Serina Chang and Zhiyin Lin and Benjamin Yan and Swapnil Bembde and Qi Xiu and Chi Heem Wong and Yu Qin and Frank Kloster and Alex Luo and Raj Palleti and Jure Leskovec},
  journal={arXiv preprint arXiv:2407.18772},
  year={2025}
}