From Static to Dynamic: A Continual Learning Framework for Large Language Models

arXiv:2310.14248 · 22 October 2023
Mingzhe Du, Anh Tuan Luu, Bin Ji, See-kiong Ng
[ArXiv] [PDF] [HTML]

Papers citing "From Static to Dynamic: A Continual Learning Framework for Large Language Models"

5 / 5 papers shown
1. What Language Model to Train if You Have One Million GPU Hours?
   Teven Le Scao, Thomas Wang, Daniel Hesslow, Lucile Saulnier, Stas Bekman, ..., Lintang Sutawika, Jaesung Tae, Zheng-Xin Yong, Julien Launay, Iz Beltagy
   Topics: MoE, AI4CE · Metrics: 261 / 107 / 0 · 27 Oct 2022

2. Fact or Fiction: Verifying Scientific Claims
   David Wadden, Shanchuan Lin, Kyle Lo, Lucy Lu Wang, Madeleine van Zuylen, Arman Cohan, Hannaneh Hajishirzi
   Topics: HAI · Metrics: 126 / 452 / 0 · 30 Apr 2020

3. Boilerplate Removal using a Neural Sequence Labeling Model
   Jurek Leonhardt, Avishek Anand, Megha Khosla
   Topics: VLM · Metrics: 29 / 22 / 0 · 22 Apr 2020

4. The Web as a Knowledge-base for Answering Complex Questions
   Alon Talmor, Jonathan Berant
   Metrics: 79 / 580 / 0 · 18 Mar 2018

5. Efficient Estimation of Word Representations in Vector Space
   Tomas Mikolov, Kai Chen, G. Corrado, J. Dean
   Topics: 3DV · Metrics: 650 / 31,490 / 0 · 16 Jan 2013