arXiv: 2410.23771
What is Wrong with Perplexity for Long-context Language Modeling?
31 October 2024
Lizhe Fang, Yifei Wang, Zhaoyang Liu, Chenheng Zhang, Stefanie Jegelka, Jinyang Gao, Bolin Ding, Yisen Wang
Papers citing "What is Wrong with Perplexity for Long-context Language Modeling?" (5 / 5 papers shown)
SoLoPO: Unlocking Long-Context Capabilities in LLMs via Short-to-Long Preference Optimization
Huashan Sun, Shengyi Liao, Yansen Han, Yu Bai, Yang Gao, ..., Weizhou Shen, Fanqi Wan, Ming Yan, J. Zhang, Fei Huang
16 May 2025

Delta Attention: Fast and Accurate Sparse Attention Inference by Delta Correction
Jeffrey Willette, Heejun Lee, Sung Ju Hwang
16 May 2025

RWKV-X: A Linear Complexity Hybrid Language Model
Haowen Hou, Zhiyi Huang, Kaifeng Tan, Rongchang Lu, Fei Richard Yu
VLM
30 Apr 2025

ConSens: Assessing context grounding in open-book question answering
Ivan Vankov, Matyo Ivanov, Adriana Correia, Victor Botev
ELM
30 Apr 2025

When Precision Meets Position: BFloat16 Breaks Down RoPE in Long-Context Training
Haonan Wang, Qian Liu, Chao Du, Tongyao Zhu, Cunxiao Du, Kenji Kawaguchi, Tianyu Pang
20 Nov 2024