On the Role of Context in Reading Time Prediction
arXiv:2409.08160 (v3, latest)
12 September 2024
Andreas Opedal, Eleanor Chodroff, Ryan Cotterell, Ethan Gotlieb Wilcox
Papers citing "On the Role of Context in Reading Time Prediction" (11 papers)
Surprise! Uniform Information Density Isn't the Whole Story: Predicting Surprisal Contours in Long-form Discourse
Eleftheria Tsipidi, Franz Nowak, Ryan Cotterell, Ethan Gotlieb Wilcox, Mario Giulianelli, Alex Warstadt. 21 Oct 2024.
Testing the Predictions of Surprisal Theory in 11 Languages
Ethan Gotlieb Wilcox, Tiago Pimentel, Clara Meister, Ryan Cotterell, R. Levy. 07 Jul 2023.
Transformer-Based Language Model Surprisal Predicts Human Reading Times Best with About Two Billion Training Tokens
Byung-Doh Oh, William Schuler. 22 Apr 2023.
A Measure-Theoretic Characterization of Tight Language Models
Li Du, Lucas Torroba Hennigen, Tiago Pimentel, Clara Meister, Jason Eisner, Ryan Cotterell. 20 Dec 2022.
Context Limitations Make Neural Language Models More Human-Like
Tatsuki Kuribayashi, Yohei Oseki, Ana Brassard, Kentaro Inui. 23 May 2022.
mGPT: Few-Shot Learners Go Multilingual
Oleh Shliazhko, Alena Fenogenova, Maria Tikhonova, Vladislav Mikhailov, Anastasia Kozlova, Tatiana Shavrina. 15 Apr 2022.
Revisiting the Uniform Information Density Hypothesis
Clara Meister, Tiago Pimentel, Patrick Haller, Lena Jäger, Ryan Cotterell, R. Levy. 23 Sep 2021.
Modeling the Unigram Distribution
Irene Nikkarinen, Tiago Pimentel, Damián E. Blasi, Ryan Cotterell. 04 Jun 2021.
Lower Perplexity is Not Always Human-Like
Tatsuki Kuribayashi, Yohei Oseki, Takumi Ito, Ryo Yoshida, Masayuki Asahara, Kentaro Inui. 02 Jun 2021.
On the Predictive Power of Neural Language Models for Human Real-Time Comprehension Behavior
Ethan Gotlieb Wilcox, Jon Gauthier, Jennifer Hu, Peng Qian, R. Levy. 02 Jun 2020.
Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer
Colin Raffel, Noam M. Shazeer, Adam Roberts, Katherine Lee, Sharan Narang, Michael Matena, Yanqi Zhou, Wei Li, Peter J. Liu. 23 Oct 2019.