Supersonic: Learning to Generate Source Code Optimisations in C/C++

IEEE Transactions on Software Engineering (TSE), 2023
26 September 2023
Zimin Chen
Sen Fang
Martin Monperrus
arXiv: 2309.14846
Abstract

Software optimization refines programs for resource efficiency while preserving functionality. Traditionally, it is a process done by developers and compilers. This paper introduces a third option: automated optimization at the source code level. We present Supersonic, a neural approach targeting minor source code modifications for optimization. Using a seq2seq model, Supersonic is trained on C/C++ program pairs (x_t, x_{t+1}), where x_{t+1} is an optimized version of x_t, and outputs a diff. Supersonic's performance is benchmarked against OpenAI's GPT-3.5-Turbo and GPT-4 on competitive programming tasks. The experiments show that Supersonic not only outperforms both models on the code optimization task, but also minimizes the extent of the change with a model more than 600x smaller than GPT-3.5-Turbo and 3700x smaller than GPT-4.
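
To make the training setup concrete, the sketch below shows the kind of (x_t, x_{t+1}) pair the abstract describes: a C++ function with a minor inefficiency, its optimized counterpart, and the small diff a model like Supersonic would be trained to emit. The example is hypothetical and illustrative only; it is not drawn from Supersonic's dataset, and the function names are made up for this sketch.

// Hypothetical (x_t, x_{t+1}) training pair; illustrative only.
#include <string>

// x_t: the original program passes the string by value, copying it on every call.
int count_vowels(std::string s) {
    int n = 0;
    for (char c : s)
        if (c == 'a' || c == 'e' || c == 'i' || c == 'o' || c == 'u')
            ++n;
    return n;
}

// x_{t+1}: the optimized version takes a const reference, avoiding the copy
// while preserving functionality.
int count_vowels_optimized(const std::string& s) {
    int n = 0;
    for (char c : s)
        if (c == 'a' || c == 'e' || c == 'i' || c == 'o' || c == 'u')
            ++n;
    return n;
}

// The target output is the small diff that turns x_t into x_{t+1}, e.g.:
//   -int count_vowels(std::string s) {
//   +int count_vowels(const std::string& s) {

Framing the output as a diff rather than a full rewritten program keeps the predicted change minimal, which is the property the abstract highlights when comparing the extent of change against GPT-3.5-Turbo and GPT-4.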
