Knowledge Entropy Decay during Language Model Pretraining Hinders New Knowledge Acquisition
arXiv: 2410.01380 · 2 October 2024
Jiyeon Kim, Hyunji Lee, Hyowon Cho, Joel Jang, Hyeonbin Hwang, Seungpil Won, Youbin Ahn, Dohaeng Lee, Minjoon Seo · KELM
Papers citing "Knowledge Entropy Decay during Language Model Pretraining Hinders New Knowledge Acquisition" (2 papers):
Robust Federated Learning with Confidence-Weighted Filtering and GAN-Based Completion under Noisy and Incomplete Data
Alpaslan Gokcen, Ali Boyaci · FedML · 14 May 2025
How Do LLMs Acquire New Knowledge? A Knowledge Circuits Perspective on Continual Pre-Training
Yixin Ou, Yunzhi Yao, N. Zhang, Hui Jin, Jiacheng Sun, Shumin Deng, Zechao Li, H. Chen · KELM, CLL · 16 Feb 2025