arXiv: 2506.14761
From Bytes to Ideas: Language Modeling with Autoregressive U-Nets
17 June 2025
Mathurin Videau, Badr Youbi Idrissi, Alessandro Leite, Marc Schoenauer, Olivier Teytaud, David Lopez-Paz

Papers citing "From Bytes to Ideas: Language Modeling with Autoregressive U-Nets" (4 / 4 papers shown)

CUTE: Measuring LLMs' Understanding of Their Tokens
Lukas Edman, Helmut Schmid, Alexander Fraser
23 Sep 2024

Training LLMs over Neurally Compressed Text
Brian Lester, Jaehoon Lee, Alexander A. Alemi, Jeffrey Pennington, Adam Roberts, Jascha Sohl-Dickstein, Noah Constant
04 Apr 2024

MEGABYTE: Predicting Million-byte Sequences with Multiscale Transformers
Lili Yu, Daniel Simig, Colin Flaherty, Armen Aghajanyan, Luke Zettlemoyer, Mike Lewis
12 May 2023

Scaling Laws for Neural Language Models
Jared Kaplan, Sam McCandlish, Tom Henighan, Tom B. Brown, Benjamin Chess, Rewon Child, Scott Gray, Alec Radford, Jeff Wu, Dario Amodei
23 Jan 2020