
arXiv:2104.03525 · Cited By
A Neural Pre-Conditioning Active Learning Algorithm to Reduce Label Complexity

8 April 2021
Seo Taek Kong, Soomin Jeon, Dongbin Na, Jaewon Lee, Honglak Lee, Kyu-Hwan Jung

Papers citing "A Neural Pre-Conditioning Active Learning Algorithm to Reduce Label Complexity"

8 citing papers
Feasibility Study on Active Learning of Smart Surrogates for Scientific Simulations
Pradeep Bajracharya, J. Q. Toledo-Marín, Geoffrey C. Fox, S. Jha, Linwei Wang
10 Jul 2024 · AI4CE
BWS: Best Window Selection Based on Sample Scores for Data Pruning across Broad Ranges
Hoyong Choi, Nohyun Ki, Hye Won Chung
05 Jun 2024
NTKCPL: Active Learning on Top of Self-Supervised Model by Estimating True Coverage
Ziting Wen, Oscar Pizarro, Stefan B. Williams
07 Jun 2023
Active Semi-Supervised Learning by Exploring Per-Sample Uncertainty and Consistency
Jae-Kwang Lim, Jongkeun Na, Nojun Kwak
15 Mar 2023
Continuous Learning for Android Malware Detection
Yizheng Chen, Zhoujie Ding, David A. Wagner
08 Feb 2023
Spectrum Dependent Learning Curves in Kernel Regression and Wide Neural Networks
Blake Bordelon, Abdulkadir Canatar, C. Pehlevan
07 Feb 2020
There Are Many Consistent Explanations of Unlabeled Data: Why You Should Average
Ben Athiwaratkun, Marc Finzi, Pavel Izmailov, A. Wilson
14 Jun 2018
Dropout as a Bayesian Approximation: Representing Model Uncertainty in Deep Learning
Y. Gal, Zoubin Ghahramani
06 Jun 2015 · UQCV, BDL