Investigating the Effectiveness of Representations Based on Pretrained Transformer-based Language Models in Active Learning for Labelling Text Datasets

21 April 2020 · arXiv:2004.13138
Jinghui Lu, B. MacNamee

Papers citing "Investigating the Effectiveness of Representations Based on Pretrained Transformer-based Language Models in Active Learning for Labelling Text Datasets"

2 papers shown

A Rationale-Centric Framework for Human-in-the-loop Machine Learning
Jinghui Lu, Linyi Yang, Brian Mac Namee, Yue Zhang
24 Mar 2022
Diversity Enhanced Active Learning with Strictly Proper Scoring Rules
Wei Tan, Lan Du, Wray L. Buntine
27 Oct 2021