

Mirage: A Multi-Level Superoptimizer for Tensor Programs

9 May 2024
Mengdi Wu, Xinhao Cheng, Shengyu Liu, Chunan Shi, Jianan Ji, Kit Ao, Praveen Velliengiri, Xupeng Miao, Oded Padon, Zhihao Jia
Community: AI4CE
Links: arXiv (abs) · PDF · HTML · GitHub (849★)
Main: 14 pages, 14 figures, 5 tables; bibliography: 4 pages
Abstract

We introduce Mirage, the first multi-level superoptimizer for tensor programs. A key idea in Mirage is μGraphs, a uniform representation of tensor programs at the kernel, thread block, and thread levels of the GPU compute hierarchy. μGraphs enable Mirage to discover novel optimizations that combine algebraic transformations, schedule transformations, and generation of new custom kernels. To navigate the large search space, Mirage introduces a pruning technique based on abstraction that significantly reduces the search space and provides a certain optimality guarantee. To ensure that the optimized μGraph is equivalent to the input program, Mirage introduces a probabilistic equivalence verification procedure with strong theoretical guarantees. Our evaluation shows that Mirage outperforms existing approaches by up to 3.5× even for DNNs that are widely used and heavily optimized. Mirage is publicly available at https://github.com/mirage-project/mirage.
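The abstract mentions a probabilistic equivalence verification procedure but does not detail it here. The general idea behind randomized equivalence checking of tensor programs can be sketched as follows: evaluate the original and the transformed program on random inputs over a finite field Z_p and declare them equivalent if all trials agree. This is a minimal illustration, not Mirage's actual implementation; the programs, shapes, and helper names (`matmul_mod`, `probably_equivalent`) are invented for the example.

```python
import random

P = 2**31 - 1  # a large prime; arithmetic in Z_p makes spurious agreement unlikely

def matmul_mod(A, B, p=P):
    """Naive matrix multiply over the finite field Z_p."""
    n, k, m = len(A), len(B), len(B[0])
    return [[sum(A[i][t] * B[t][j] for t in range(k)) % p for j in range(m)]
            for i in range(n)]

def prog_original(A, B, C):
    """Reference program: (A @ B) @ C, computed left to right."""
    return matmul_mod(matmul_mod(A, B), C)

def prog_optimized(A, B, C):
    """Candidate transformed program: A @ (B @ C), algebraically equivalent."""
    return matmul_mod(A, matmul_mod(B, C))

def random_matrix(rows, cols, p=P):
    return [[random.randrange(p) for _ in range(cols)] for _ in range(rows)]

def probably_equivalent(f, g, shapes, trials=8, p=P):
    """Accept f == g if they agree on `trials` random inputs drawn over Z_p.
    A disagreement on any trial is a definitive proof of inequivalence."""
    for _ in range(trials):
        args = [random_matrix(r, c, p) for r, c in shapes]
        if f(*args) != g(*args):
            return False
    return True

shapes = [(4, 5), (5, 6), (6, 3)]
print(probably_equivalent(prog_original, prog_optimized, shapes))  # True
```

The asymmetry is the useful property: a mismatch certifies the transformation is wrong, while agreement across trials bounds the probability of a false positive, which is the shape of guarantee the abstract alludes to.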

@article{wu2025_2405.05751,
  title={Mirage: A Multi-Level Superoptimizer for Tensor Programs},
  author={Mengdi Wu and Xinhao Cheng and Shengyu Liu and Chunan Shi and Jianan Ji and Kit Ao and Praveen Velliengiri and Xupeng Miao and Oded Padon and Zhihao Jia},
  journal={arXiv preprint arXiv:2405.05751},
  year={2025}
}