When Neural Code Completion Models Size up the Situation: Attaining Cheaper and Faster Completion through Dynamic Model Inference

18 January 2024
Zhensu Sun, Xiaoning Du, Fu Song, Shangwen Wang, Li Li

Papers citing "When Neural Code Completion Models Size up the Situation: Attaining Cheaper and Faster Completion through Dynamic Model Inference"

4 papers shown
GREEN-CODE: Learning to Optimize Energy Efficiency in LLM-based Code Generation
Shashikant Ilager, Lukas Florian Briem, Ivona Brandić
19 Jan 2025

On the Compression of Language Models for Code: An Empirical Study on CodeBERT
Giordano d'Aloisio, Luca Traini, Federica Sarro, A. Marco
18 Dec 2024

Productivity Assessment of Neural Code Completion
Albert Ziegler, Eirini Kalliamvakou, Shawn Simister, Ganesh Sittampalam, Alice Li, Andrew Rice, Devon Rifkin, E. Aftandilian
13 May 2022

CodeT5: Identifier-aware Unified Pre-trained Encoder-Decoder Models for Code Understanding and Generation
Yue Wang, Weishi Wang, Shafiq R. Joty, S. Hoi
02 Sep 2021