Naturalness of Attention: Revisiting Attention in Code Language Models
M. Saad, Tushar Sharma
arXiv:2311.13508 (22 November 2023)
Papers citing "Naturalness of Attention: Revisiting Attention in Code Language Models" (5 papers):
An Exploratory Study on Code Attention in BERT
Rishab Sharma, Fuxiang Chen, Fatemeh H. Fard, David Lo
05 Apr 2022
CodeT5: Identifier-aware Unified Pre-trained Encoder-Decoder Models for Code Understanding and Generation
Yue Wang, Weishi Wang, Shafiq Joty, Guosheng Lin
02 Sep 2021
GraphCodeBERT: Pre-training Code Representations with Data Flow
Daya Guo, Shuo Ren, Shuai Lu, Zhangyin Feng, Duyu Tang, ..., Dawn Drain, Neel Sundaresan, Jian Yin, Daxin Jiang, M. Zhou
17 Sep 2020
CodeBERT: A Pre-Trained Model for Programming and Natural Languages
Zhangyin Feng, Daya Guo, Duyu Tang, Nan Duan, Xiaocheng Feng, ..., Linjun Shou, Bing Qin, Ting Liu, Daxin Jiang, Ming Zhou
19 Feb 2020
Google's Neural Machine Translation System: Bridging the Gap between Human and Machine Translation
Yonghui Wu, M. Schuster, Zhiwen Chen, Quoc V. Le, Mohammad Norouzi, ..., Alex Rudnick, Oriol Vinyals, G. Corrado, Macduff Hughes, J. Dean
26 Sep 2016