GraphCodeBERT: Pre-training Code Representations with Data Flow
17 September 2020
Daya Guo, Shuo Ren, Shuai Lu, Zhangyin Feng, Duyu Tang, Shujie Liu, Long Zhou, Nan Duan, Alexey Svyatkovskiy, Shengyu Fu, Michele Tufano, Shao Kun Deng, Colin B. Clement, Dawn Drain, Neel Sundaresan, Jian Yin, Daxin Jiang, M. Zhou

Papers citing "GraphCodeBERT: Pre-training Code Representations with Data Flow"

5 / 405 papers shown
  1. Evaluating Pre-Trained Models for User Feedback Analysis in Software Engineering: A Study on Classification of App-Reviews. M. Hadi, Fatemeh H. Fard. 12 Apr 2021.
  2. Do We Need Anisotropic Graph Neural Networks? Shyam A. Tailor, Felix L. Opolka, Pietro Lio, Nicholas D. Lane. 03 Apr 2021.
  3. Unified Pre-training for Program Understanding and Generation. Wasi Uddin Ahmad, Saikat Chakraborty, Baishakhi Ray, Kai-Wei Chang. 10 Mar 2021.
  4. DOBF: A Deobfuscation Pre-Training Objective for Programming Languages. Baptiste Roziere, Marie-Anne Lachaux, Marc Szafraniec, Guillaume Lample. 15 Feb 2021.
  5. CodeBLEU: a Method for Automatic Evaluation of Code Synthesis. Shuo Ren, Daya Guo, Shuai Lu, Long Zhou, Shujie Liu, Duyu Tang, Neel Sundaresan, M. Zhou, Ambrosio Blanco, Shuai Ma. 22 Sep 2020.