arXiv: 2405.17067
Tokenization Matters! Degrading Large Language Models through Challenging Their Tokenization
27 May 2024
Dixuan Wang, Yanda Li, Junyuan Jiang, Zepeng Ding, Ziqin Luo, Guochao Jiang, Jiaqing Liang, Deqing Yang
Papers citing "Tokenization Matters! Degrading Large Language Models through Challenging Their Tokenization" (2 of 52 papers shown)
Subword Regularization: Improving Neural Network Translation Models with Multiple Subword Candidates
Taku Kudo (29 Apr 2018)

Neural Machine Translation of Rare Words with Subword Units
Rico Sennrich, Barry Haddow, Alexandra Birch (31 Aug 2015)