
StackRec: Efficient Training of Very Deep Sequential Recommender Models by Iterative Stacking (arXiv: 2012.07598)

14 December 2020
Jiachun Wang, Fajie Yuan, Jian Chen, Qingyao Wu, Min Yang, Yang Sun, Guoxiao Zhang
    BDL

Papers citing "StackRec: Efficient Training of Very Deep Sequential Recommender Models by Iterative Stacking"

6 papers shown
Efficient Training of Large Vision Models via Advanced Automated Progressive Learning
  Changlin Li, Jiawei Zhang, Sihao Lin, Zongxin Yang, Junwei Liang, Xiaodan Liang, Xiaojun Chang
  VLM · 06 Sep 2024

A Multi-Level Framework for Accelerating Training Transformer Models
  Longwei Zou, Han Zhang, Yangdong Deng
  AI4CE · 07 Apr 2024

EfficientTrain: Exploring Generalized Curriculum Learning for Training Visual Backbones
  Yulin Wang, Yang Yue, Rui Lu, Tian-De Liu, Zhaobai Zhong, S. Song, Gao Huang
  17 Nov 2022

Automated Progressive Learning for Efficient Training of Vision Transformers
  Changlin Li, Bohan Zhuang, Guangrun Wang, Xiaodan Liang, Xiaojun Chang, Yi Yang
  28 Mar 2022

Scene-adaptive Knowledge Distillation for Sequential Recommendation via Differentiable Architecture Search
  Lei-tai Chen, Fajie Yuan, Jiaxi Yang, Min Yang, Chengming Li
  15 Jul 2021

Dynamical Isometry and a Mean Field Theory of CNNs: How to Train 10,000-Layer Vanilla Convolutional Neural Networks
  Lechao Xiao, Yasaman Bahri, Jascha Narain Sohl-Dickstein, S. Schoenholz, Jeffrey Pennington
  14 Jun 2018