ResearchTrend.AI
Training Neural Networks Using Features Replay

12 July 2018
Zhouyuan Huo, Bin Gu, Heng Huang

Papers citing "Training Neural Networks Using Features Replay"

15 / 15 papers shown

1. HPFF: Hierarchical Locally Supervised Learning with Patch Feature Fusion
   Junhao Su, Chenghao He, Feiyu Zhu, Xiaojie Xu, Dongzhi Guan, Chenyang Si
   08 Jul 2024

2. PETRA: Parallel End-to-end Training with Reversible Architectures
   Stéphane Rivaud, Louis Fournier, Thomas Pumir, Eugene Belilovsky, Michael Eickenberg, Edouard Oyallon
   04 Jun 2024

3. FedImpro: Measuring and Improving Client Update in Federated Learning
   Zhenheng Tang, Yonggang Zhang, S. Shi, Xinmei Tian, Tongliang Liu, Bo Han, Xiaowen Chu
   FedML
   10 Feb 2024

4. Local Learning with Neuron Groups
   Adeetya Patel, Michael Eickenberg, Eugene Belilovsky
   18 Jan 2023

5. Block-wise Training of Residual Networks via the Minimizing Movement Scheme
   Skander Karkar, Ibrahim Ayed, Emmanuel de Bézenac, Patrick Gallinari
   03 Oct 2022

6. Layer-Wise Partitioning and Merging for Efficient and Scalable Deep Learning
   S. Akintoye, Liangxiu Han, H. Lloyd, Xin Zhang, Darren Dancey, Haoming Chen, Daoqiang Zhang
   FedML
   22 Jul 2022

7. SplitEasy: A Practical Approach for Training ML models on Mobile Devices
   Kamalesh Palanisamy, Vivek Khimani, Moin Hussain Moti, Dimitris Chatzopoulos
   09 Nov 2020

8. Privacy-Preserving Asynchronous Federated Learning Algorithms for Multi-Party Vertically Collaborative Learning
   Bin Gu, An Xu, Zhouyuan Huo, Cheng Deng, Heng Huang
   FedML
   14 Aug 2020

9. Pipelined Backpropagation at Scale: Training Large Models without Batches
   Atli Kosson, Vitaliy Chiley, Abhinav Venigalla, Joel Hestness, Urs Koster
   25 Mar 2020

10. Pipelined Training with Stale Weights of Deep Convolutional Neural Networks
    Lifu Zhang, T. Abdelrahman
    29 Dec 2019

11. Fully Decoupled Neural Network Learning Using Delayed Gradients
    Huiping Zhuang, Yi Wang, Qinglai Liu, Shuai Zhang, Zhiping Lin
    FedML
    21 Jun 2019

12. Associated Learning: Decomposing End-to-end Backpropagation based on Auto-encoders and Target Propagation
    Yu-Wei Kao, Hung-Hsuan Chen
    BDL
    13 Jun 2019

13. Improving Discrete Latent Representations With Differentiable Approximation Bridges
    Jason Ramapuram, Russ Webb
    DRL
    09 May 2019

14. Benefits of depth in neural networks
    Matus Telgarsky
    14 Feb 2016

15. Convolutional Neural Networks for Sentence Classification
    Yoon Kim
    AILaw, VLM
    25 Aug 2014