ResearchTrend.AI
Data Dropout: Optimizing Training Data for Convolutional Neural Networks
Tianyang Wang, Jun Huan, Bo Li
arXiv:1809.00193, 1 September 2018

Papers citing "Data Dropout: Optimizing Training Data for Convolutional Neural Networks"

16 / 16 citing papers shown
Progressive Data Dropout: An Embarrassingly Simple Approach to Faster Training
S. Srinivasan, Xinyue Hao, Shihao Hou, Yang Lu, Laura Sevilla-Lara, Anurag Arnab, Shreyank N Gowda
28 May 2025

Exploring Example Influence in Continual Learning
Q. Sun, Fan Lyu, Fanhua Shang, Wei Feng, Liang Wan
25 Sep 2022

Metadata Archaeology: Unearthing Data Subsets by Leveraging Training Dynamics
Shoaib Ahmed Siddiqui, Nitarshan Rajkumar, Tegan Maharaj, David M. Krueger, Sara Hooker
20 Sep 2022

DIWIFT: Discovering Instance-wise Influential Features for Tabular Data
Dugang Liu, Pengxiang Cheng, Hong Zhu, Xing Tang, Yanyu Chen, Xiaoting Wang, Weike Pan, Zhong Ming, Xiuqiang He
06 Jul 2022

Bamboo: Making Preemptible Instances Resilient for Affordable Training of Large DNNs
John Thorpe, Pengzhan Zhao, Jon Eyolfson, Yifan Qiao, Zhihao Jia, Minjia Zhang, Ravi Netravali, Guoqing Harry Xu
26 Apr 2022

Rethinking Influence Functions of Neural Networks in the Over-parameterized Regime
Rui Zhang, Shihua Zhang
15 Dec 2021

Scaling Up Influence Functions
Andrea Schioppa, Polina Zablotskaia, David Vilar, Artem Sokolov
06 Dec 2021

Unified Regularity Measures for Sample-wise Learning and Generalization
Chi Zhang, Xiaoning Ma, Yu Liu, Le Wang, Yuanqi Su, Yuehu Liu
09 Aug 2021

Efficient Neural Architecture Search with Performance Prediction
Ibrahim Alshubaily
04 Aug 2021

Convolutional Neural Network (CNN/ConvNet) in Stock Price Movement Prediction
Kunal Bhardwaj
03 Jun 2021

Finding Influential Instances for Distantly Supervised Relation Extraction
Zifeng Wang, Rui Wen, Xi Chen, Shao-Lun Huang, Ningyu Zhang, Yefeng Zheng
17 Sep 2020

Generative Data Augmentation for Commonsense Reasoning
Yiben Yang, Chaitanya Malaviya, Jared Fernandez, Swabha Swayamdipta, Ronan Le Bras, Ji-ping Wang, Chandra Bhagavatula, Yejin Choi, Doug Downey
24 Apr 2020

Influence Function based Data Poisoning Attacks to Top-N Recommender Systems
Minghong Fang, Neil Zhenqiang Gong, Jia-Wei Liu
19 Feb 2020

Identifying Mislabeled Data using the Area Under the Margin Ranking
Geoff Pleiss, Tianyi Zhang, Ethan R. Elenberg, Kilian Q. Weinberger
28 Jan 2020

Less Is Better: Unweighted Data Subsampling via Influence Function
Zifeng Wang, Hong Zhu, Zhenhua Dong, Xiuqiang He, Shao-Lun Huang
03 Dec 2019

Distribution Density, Tails, and Outliers in Machine Learning: Metrics and Applications
Nicholas Carlini, Ulfar Erlingsson, Nicolas Papernot
29 Oct 2019