Revisiting Catastrophic Forgetting in Large Language Model Tuning
arXiv 2406.04836 · 7 June 2024
Hongyu Li, Liang Ding, Meng Fang, Dacheng Tao
Tags: CLL, KELM
Papers citing "Revisiting Catastrophic Forgetting in Large Language Model Tuning" (5 of 5 papers shown)
SEFE: Superficial and Essential Forgetting Eliminator for Multimodal Continual Instruction Tuning
Jinpeng Chen, Runmin Cong, Yuzhi Zhao, Hongzheng Yang, Guangneng Hu, H. Ip, Sam Kwong
Tags: CLL, KELM · Citations: 0 · 05 May 2025
R1-T1: Fully Incentivizing Translation Capability in LLMs via Reasoning Learning
Minggui He, Yilun Liu, Shimin Tao, Yuanchang Luo, Hongyong Zeng, ..., Daimeng Wei, Weibin Meng, Hao Yang, Boxing Chen, Osamu Yoshie
Tags: LRM · Citations: 2 · 27 Feb 2025
Towards Making the Most of ChatGPT for Machine Translation
Keqin Peng, Liang Ding, Qihuang Zhong, Li Shen, Xuebo Liu, Min Zhang, Y. Ouyang, Dacheng Tao
Tags: LRM · Citations: 210 · 24 Mar 2023
Fine-tuned Language Models are Continual Learners
Thomas Scialom, Tuhin Chakrabarty, Smaranda Muresan
Tags: CLL, LRM · Citations: 117 · 24 May 2022
On Large-Batch Training for Deep Learning: Generalization Gap and Sharp Minima
N. Keskar, Dheevatsa Mudigere, J. Nocedal, M. Smelyanskiy, P. T. P. Tang
Tags: ODL · Citations: 2,889 · 15 Sep 2016