ResearchTrend.AI

Preventing Catastrophic Forgetting in Continual Learning of New Natural Language Tasks
arXiv:2302.11074

22 February 2023
Sudipta Kar
Giuseppe Castellucci
Simone Filice
S. Malmasi
Oleg Rokhlenko
    CLL
    KELM
ArXiv · PDF · HTML

Papers citing "Preventing Catastrophic Forgetting in Continual Learning of New Natural Language Tasks"

4 / 4 papers shown
Capturing Symmetry and Antisymmetry in Language Models through Symmetry-Aware Training Objectives
Zhangdie Yuan, Andreas Vlachos
27 · 0 · 0 · 22 Apr 2025

A Survey of Mamba
Shuwei Shi, Shibing Chu, Rui An, Wenqi Fan, Yuee Xie, Hui Liu, Yuanping Chen, Qing Li
AI4CE
44 · 27 · 0 · 02 Aug 2024

Mix-CPT: A Domain Adaptation Framework via Decoupling Knowledge Learning and Format Alignment
Jinhao Jiang, Junyi Li, Wayne Xin Zhao, Yang Song, Tao Zhang, Ji-Rong Wen
CLL
41 · 3 · 0 · 15 Jul 2024

GLUE: A Multi-Task Benchmark and Analysis Platform for Natural Language Understanding
Alex Jinpeng Wang, Amanpreet Singh, Julian Michael, Felix Hill, Omer Levy, Samuel R. Bowman
ELM
299 · 6,984 · 0 · 20 Apr 2018