
Blending LSTMs into CNNs

19 November 2015
Krzysztof J. Geras, Abdel-rahman Mohamed, R. Caruana, G. Urban, Shengjie Wang, Ozlem Aslan, Matthai Philipose, Matthew Richardson, Charles Sutton

Papers citing "Blending LSTMs into CNNs"

10 of 10 citing papers shown
Title | Authors | Communities | Metrics | Date
Inter-KD: Intermediate Knowledge Distillation for CTC-Based Automatic Speech Recognition | J. Yoon, Beom Jun Woo, Sunghwan Ahn, Hyeon Seung Lee, N. Kim | VLM | 31 · 9 · 0 | 28 Nov 2022
Oracle Teacher: Leveraging Target Information for Better Knowledge Distillation of CTC Models | J. Yoon, H. Kim, Hyeon Seung Lee, Sunghwan Ahn, N. Kim | - | 43 · 1 · 0 | 05 Nov 2021
Learning Energy-Based Approximate Inference Networks for Structured Applications in NLP | Lifu Tu | BDL | 35 · 0 · 0 | 27 Aug 2021
Tracking-by-Trackers with a Distilled and Reinforced Model | Matteo Dunnhofer, N. Martinel, C. Micheloni | VOT, OffRL | 27 · 4 · 0 | 08 Jul 2020
DENS-ECG: A Deep Learning Approach for ECG Signal Delineation | A. Peimankar, S. Puthusserypady | - | 38 · 119 · 0 | 18 May 2020
FEED: Feature-level Ensemble for Knowledge Distillation | Seonguk Park, Nojun Kwak | FedML | 31 · 41 · 0 | 24 Sep 2019
Towards Principled Design of Deep Convolutional Networks: Introducing SimpNet | S. H. HasanPour, Mohammad Rouhani, Mohsen Fayyaz, Mohammad Sabokrou, Ehsan Adeli | - | 50 · 45 · 0 | 17 Feb 2018
Sequence-Level Knowledge Distillation | Yoon Kim, Alexander M. Rush | - | 47 · 1,101 · 0 | 25 Jun 2016
Active Long Term Memory Networks | Tommaso Furlanello, Jiaping Zhao, Andrew M. Saxe, Laurent Itti, B. Tjan | KELM, CLL | 32 · 41 · 0 | 07 Jun 2016
Do Deep Convolutional Nets Really Need to be Deep and Convolutional? | G. Urban, Krzysztof J. Geras, Samira Ebrahimi Kahou, Ozlem Aslan, Shengjie Wang, R. Caruana, Abdel-rahman Mohamed, Matthai Philipose, Matthew Richardson | - | 22 · 47 · 0 | 17 Mar 2016