Looking into Black Box Code Language Models
arXiv:2407.04868, 5 July 2024
Muhammad Umair Haider, Umar Farooq, A.B. Siddique, Mark Marron
Papers citing "Looking into Black Box Code Language Models" (9 of 9 papers shown):
1. CodeT5: Identifier-aware Unified Pre-trained Encoder-Decoder Models for Code Understanding and Generation. Yue Wang, Weishi Wang, Shafiq Joty, Guosheng Lin. 02 Sep 2021.
2. Unified Pre-training for Program Understanding and Generation. Wasi Uddin Ahmad, Saikat Chakraborty, Baishakhi Ray, Kai-Wei Chang. 10 Mar 2021.
3. The Pile: An 800GB Dataset of Diverse Text for Language Modeling. Leo Gao, Stella Biderman, Sid Black, Laurence Golding, Travis Hoppe, ..., Horace He, Anish Thite, Noa Nabeshima, Shawn Presser, Connor Leahy. 31 Dec 2020.
4. Transformer Feed-Forward Layers Are Key-Value Memories. Mor Geva, R. Schuster, Jonathan Berant, Omer Levy. 29 Dec 2020.
5. GraphCodeBERT: Pre-training Code Representations with Data Flow. Daya Guo, Shuo Ren, Shuai Lu, Zhangyin Feng, Duyu Tang, ..., Dawn Drain, Neel Sundaresan, Jian Yin, Daxin Jiang, M. Zhou. 17 Sep 2020.
6. CodeBERT: A Pre-Trained Model for Programming and Natural Languages. Zhangyin Feng, Daya Guo, Duyu Tang, Nan Duan, Xiaocheng Feng, ..., Linjun Shou, Bing Qin, Ting Liu, Daxin Jiang, Ming Zhou. 19 Feb 2020.
7. Augmenting Self-attention with Persistent Memory. Sainbayar Sukhbaatar, Edouard Grave, Guillaume Lample, Hervé Jégou, Armand Joulin. 02 Jul 2019.
8. The emergence of number and syntax units in LSTM language models. Yair Lakretz, Germán Kruszewski, T. Desbordes, Dieuwke Hupkes, S. Dehaene, Marco Baroni. 18 Mar 2019.
9. Neural Machine Translation by Jointly Learning to Align and Translate. Dzmitry Bahdanau, Kyunghyun Cho, Yoshua Bengio. 01 Sep 2014.