On the Importance of Effectively Adapting Pretrained Language Models for Active Learning

16 April 2021
Katerina Margatina, Loïc Barrault, Nikolaos Aletras

Papers citing "On the Importance of Effectively Adapting Pretrained Language Models for Active Learning"

11 papers shown:

Self-Training for Sample-Efficient Active Learning for Text Classification with Pre-Trained Language Models
Christopher Schröder, Gerhard Heyer · 13 Jun 2024 · VLM

AnchorAL: Computationally Efficient Active Learning for Large and Imbalanced Datasets
Pietro Lesci, Andreas Vlachos · 08 Apr 2024

STAR: Constraint LoRA with Dynamic Active Learning for Data-Efficient Fine-Tuning of Large Language Models
Linhai Zhang, Jialong Wu, Deyu Zhou, Guoqiang Xu · 02 Mar 2024

Active Learning Principles for In-Context Learning with Large Language Models
Katerina Margatina, Timo Schick, Nikolaos Aletras, Jane Dwivedi-Yu · 23 May 2023

Rethinking Semi-supervised Learning with Language Models
Zhengxiang Shi, Francesco Tonolini, Nikolaos Aletras, Emine Yilmaz, G. Kazai, Yunlong Jiao · 22 May 2023

Investigating Multi-source Active Learning for Natural Language Inference
Ard Snijders, Douwe Kiela, Katerina Margatina · 14 Feb 2023

An Efficient Active Learning Pipeline for Legal Text Classification
Sepideh Mamooler, R. Lebret, Stéphane Massonnet, Karl Aberer · 15 Nov 2022 · AILaw

A Survey of Active Learning for Natural Language Processing
Zhisong Zhang, Emma Strubell, Eduard H. Hovy · 18 Oct 2022 · LM&MA

Cold-start Active Learning through Self-supervised Language Modeling
Michelle Yuan, Hsuan-Tien Lin, Jordan L. Boyd-Graber · 19 Oct 2020

GLUE: A Multi-Task Benchmark and Analysis Platform for Natural Language Understanding
Alex Wang, Amanpreet Singh, Julian Michael, Felix Hill, Omer Levy, Samuel R. Bowman · 20 Apr 2018 · ELM

Dropout as a Bayesian Approximation: Representing Model Uncertainty in Deep Learning
Y. Gal, Zoubin Ghahramani · 06 Jun 2015 · UQCV, BDL