
Self-Attention Meta-Learner for Continual Learning

arXiv:2101.12136 · 28 January 2021
Ghada Sokar, Decebal Constantin Mocanu, Mykola Pechenizkiy
CLL

Papers citing "Self-Attention Meta-Learner for Continual Learning"

4 / 4 papers shown
  1. Hierarchically Structured Task-Agnostic Continual Learning
     Heinke Hihn, Daniel A. Braun · BDL, CLL · 21 / 8 / 0 · 14 Nov 2022
  2. Selecting Related Knowledge via Efficient Channel Attention for Online Continual Learning
     Ya-nan Han, Jian-wei Liu · CLL · 25 / 0 / 0 · 09 Sep 2022
  3. Avoiding Forgetting and Allowing Forward Transfer in Continual Learning via Sparse Networks
     Ghada Sokar, Decebal Constantin Mocanu, Mykola Pechenizkiy · CLL · 35 / 8 / 0 · 11 Oct 2021
  4. Model-Agnostic Meta-Learning for Fast Adaptation of Deep Networks
     Chelsea Finn, Pieter Abbeel, Sergey Levine · OOD · 425 / 11,715 / 0 · 09 Mar 2017