TrustAL: Trustworthy Active Learning using Knowledge Distillation

26 January 2022
Beong-woo Kwak, Youngwook Kim, Yu Jin Kim, Seung-won Hwang, Jinyoung Yeo

Papers citing "TrustAL: Trustworthy Active Learning using Knowledge Distillation"

6 / 6 papers shown
Applications of Knowledge Distillation in Remote Sensing: A Survey
Yassine Himeur, N. Aburaed, O. Elharrouss, Iraklis Varlamis, Shadi Atalla, W. Mansoor, Hussain Al Ahmad
18 Sep 2024

A Survey on Deep Active Learning: Recent Advances and New Frontiers
Dongyuan Li, Zhen Wang, Yankai Chen, Renhe Jiang, Weiping Ding, Manabu Okumura
01 May 2024

ActiveGLAE: A Benchmark for Deep Active Learning with Transformers
Lukas Rauch, Matthias Aßenmacher, Denis Huseljic, Moritz Wirth, Bernd Bischl, Bernhard Sick
16 Jun 2023

A Review of Deep Learning for Video Captioning
Moloud Abdar, Meenakshi Kollati, Swaraja Kuraparthi, Farhad Pourpanah, Daniel J. McDuff, ..., Shuicheng Yan, Abduallah A. Mohamed, Abbas Khosravi, Min Zhang, Fatih Porikli
22 Apr 2023

Active learning for data streams: a survey
Davide Cacciarelli, M. Kulahci
17 Feb 2023

Cold-start Active Learning through Self-supervised Language Modeling
Michelle Yuan, Hsuan-Tien Lin, Jordan L. Boyd-Graber
19 Oct 2020