Forging Multiple Training Objectives for Pre-trained Language Models via Meta-Learning
arXiv:2210.10293 · 19 October 2022

Hongqiu Wu, Ruixue Ding, Haizhen Zhao, Boli Chen, Pengjun Xie, Fei Huang, Min Zhang

Papers citing "Forging Multiple Training Objectives for Pre-trained Language Models via Meta-Learning" (3 of 3 papers shown)
Distributionally Robust Multilingual Machine Translation
Chunting Zhou, Daniel Levy, Xian Li, Marjan Ghazvininejad, Graham Neubig
09 Sep 2021
GLUE: A Multi-Task Benchmark and Analysis Platform for Natural Language Understanding
Alex Wang, Amanpreet Singh, Julian Michael, Felix Hill, Omer Levy, Samuel R. Bowman
20 Apr 2018
Model-Agnostic Meta-Learning for Fast Adaptation of Deep Networks
Chelsea Finn, Pieter Abbeel, Sergey Levine
09 Mar 2017