GraphCodeBERT: Pre-training Code Representations with Data Flow
arXiv:2009.08366 · 17 September 2020
Daya Guo, Shuo Ren, Shuai Lu, Zhangyin Feng, Duyu Tang, Shujie Liu, Long Zhou, Nan Duan, Alexey Svyatkovskiy, Shengyu Fu, Michele Tufano, Shao Kun Deng, Colin B. Clement, Dawn Drain, Neel Sundaresan, Jian Yin, Daxin Jiang, M. Zhou
Papers citing "GraphCodeBERT: Pre-training Code Representations with Data Flow" (5 of 405 shown)

- Evaluating Pre-Trained Models for User Feedback Analysis in Software Engineering: A Study on Classification of App-Reviews. M. Hadi, Fatemeh H. Fard. 12 Apr 2021.
- Do We Need Anisotropic Graph Neural Networks? Shyam A. Tailor, Felix L. Opolka, Pietro Lio, Nicholas D. Lane. 03 Apr 2021.
- Unified Pre-training for Program Understanding and Generation. Wasi Uddin Ahmad, Saikat Chakraborty, Baishakhi Ray, Kai-Wei Chang. 10 Mar 2021.
- DOBF: A Deobfuscation Pre-Training Objective for Programming Languages. Baptiste Roziere, Marie-Anne Lachaux, Marc Szafraniec, Guillaume Lample. 15 Feb 2021.
- CodeBLEU: a Method for Automatic Evaluation of Code Synthesis. Shuo Ren, Daya Guo, Shuai Lu, Long Zhou, Shujie Liu, Duyu Tang, Neel Sundaresan, M. Zhou, Ambrosio Blanco, Shuai Ma. 22 Sep 2020.