Deep Recommender Models Inference: Automatic Asymmetric Data Flow Optimization

2 July 2025
Giuseppe Ruggeri
Renzo Andri
Daniele Jahier Pagliari
Lukas Cavigelli
Main: 3 pages · 5 figures · 1 table · Bibliography: 2 pages
Abstract

Inference for Deep Recommender Models (DLRMs) is a fundamental AI workload, accounting for more than 79% of the total AI workload in Meta's data centers. The performance bottleneck of DLRMs lies in the embedding layers, which perform many random memory accesses to retrieve small embedding vectors from tables of various sizes. We propose the design of tailored data flows to speed up embedding look-ups. Namely, we propose four strategies to look up an embedding table effectively on one core, and a framework to automatically map the tables asymmetrically to the multiple cores of a SoC. We assess the effectiveness of our method using the Huawei Ascend AI accelerators, comparing it with the default Ascend compiler, and we perform high-level comparisons with the Nvidia A100. Results show a speed-up ranging from 1.5x up to 6.5x for real workload distributions, and more than 20x for extremely unbalanced distributions. Furthermore, the method proves to be much more independent of the query distribution than the baseline.

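The page does not include any code, so the following minimal Python sketch is an illustration only: part (a) shows the memory-bound primitive the abstract describes, a batched gather of small vectors from an embedding table; part (b) shows a hypothetical greedy heuristic that assigns tables to cores in proportion to an assumed per-table load estimate, so that heavily queried tables do not pile up on one core. The load model, the function names, and the greedy strategy are assumptions for illustration; they are not the authors' four lookup strategies or their automatic mapping framework.

import numpy as np

def embedding_lookup(table: np.ndarray, indices: np.ndarray) -> np.ndarray:
    # (a) Gather embedding vectors (rows of `table`) for a batch of indices.
    # Many such small, random reads across tables of very different sizes
    # make the embedding layers the bottleneck described in the abstract.
    return table[indices]  # shape: (len(indices), embedding_dim)

def greedy_asymmetric_mapping(table_loads: list, num_cores: int) -> list:
    # (b) Hypothetical greedy mapping of tables to cores (illustration only).
    # `table_loads[i]` is an assumed cost estimate for table i, e.g. expected
    # lookups per query times vector size. Tables are assigned, largest first,
    # to the currently least-loaded core.
    core_load = [0.0] * num_cores
    assignment = [[] for _ in range(num_cores)]
    for t in sorted(range(len(table_loads)), key=lambda i: -table_loads[i]):
        c = min(range(num_cores), key=lambda k: core_load[k])
        assignment[c].append(t)
        core_load[c] += table_loads[t]
    return assignment

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    table = rng.standard_normal((1000, 64)).astype(np.float32)  # 1000 rows, dim 64
    idx = rng.integers(0, 1000, size=32)                        # one batch of lookups
    print(embedding_lookup(table, idx).shape)                   # (32, 64)
    print(greedy_asymmetric_mapping([8.0, 3.0, 3.0, 1.0, 0.5], num_cores=2))
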
@article{ruggeri2025_2507.01676,
  title={Deep Recommender Models Inference: Automatic Asymmetric Data Flow Optimization},
  author={Giuseppe Ruggeri and Renzo Andri and Daniele Jahier Pagliari and Lukas Cavigelli},
  journal={arXiv preprint arXiv:2507.01676},
  year={2025}
}