Multi-task Active Learning for Pre-trained Transformer-based Models
Guy Rotman, Roi Reichart
arXiv: 2208.05379
10 August 2022
Papers citing "Multi-task Active Learning for Pre-trained Transformer-based Models" (10 of 10 shown):
- Balancing Accuracy, Calibration, and Efficiency in Active Learning with Vision Transformers Under Label Noise. Moseli Motsóehli, Hope Mogale, Kyungim Baek. 07 May 2025.
- Assistive Image Annotation Systems with Deep Learning and Natural Language Capabilities: A Review. Moseli Motsóehli. 28 Jun 2024.
- EASE: An Easily-Customized Annotation System Powered by Efficiency Enhancement Mechanisms. Naihao Deng, Yikai Liu, Mingye Chen, Winston Wu, Siyang Liu, Yulong Chen, Yue Zhang, Rada Mihalcea. 23 May 2023.
- Active Prompting with Chain-of-Thought for Large Language Models. Shizhe Diao, Pengcheng Wang, Yong Lin, Tong Zhang. 23 Feb 2023.
- Semi-Automated Construction of Food Composition Knowledge Base. Jason Youn, Fangzhou Li, I. Tagkopoulos. 24 Jan 2023.
- MEAL: Stable and Active Learning for Few-Shot Prompting. Abdullatif Köksal, Timo Schick, Hinrich Schütze. 15 Nov 2022.
- A Survey of Active Learning for Natural Language Processing. Zhisong Zhang, Emma Strubell, Eduard H. Hovy. 18 Oct 2022.
- Calibration of Pre-trained Transformers. Shrey Desai, Greg Durrett. 17 Mar 2020.
- GLUE: A Multi-Task Benchmark and Analysis Platform for Natural Language Understanding. Alex Jinpeng Wang, Amanpreet Singh, Julian Michael, Felix Hill, Omer Levy, Samuel R. Bowman. 20 Apr 2018.
- Dropout as a Bayesian Approximation: Representing Model Uncertainty in Deep Learning. Y. Gal, Zoubin Ghahramani. 06 Jun 2015.