

Do Deep Convolutional Nets Really Need to be Deep (Or Even Convolutional)?

17 March 2016
G. Urban
Krzysztof J. Geras
Samira Ebrahimi Kahou
Ozlem Aslan
Shengjie Wang
R. Caruana
Abdel-rahman Mohamed
Matthai Philipose
Matthew Richardson
arXiv:1603.05691 · abs · PDF · HTML
Abstract

Yes, apparently they do. Previous research demonstrated that shallow feed-forward nets sometimes can learn the complex functions previously learned by deep nets while using a similar number of parameters as the deep models they mimic. In this paper we investigate if shallow models can learn to mimic the functions learned by deep convolutional models. We experiment with shallow models and models with a varying number of convolutional layers, all trained to mimic a state-of-the-art ensemble of CIFAR-10 models. We demonstrate that we are unable to train shallow models to be of comparable accuracy to deep convolutional models. Although the student models do not have to be as deep as the teacher models they mimic, the student models apparently need multiple convolutional layers to learn functions of comparable accuracy.
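The mimic-learning setup the abstract refers to trains a student network to regress the teacher's pre-softmax logits rather than the hard class labels. A minimal illustrative sketch of that idea, using NumPy with a toy random "teacher" and a single-linear-layer student (all names and sizes here are hypothetical stand-ins, not the paper's actual architectures or data):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy inputs: 200 examples with 16 features; 10 output classes
# (matching CIFAR-10's label count for flavor only).
X = rng.normal(size=(200, 16))

# Stand-in "teacher": a fixed random nonlinear map producing logits.
W1 = rng.normal(size=(16, 32))
W2 = rng.normal(size=(32, 10))
teacher_logits = np.tanh(X @ W1) @ W2

# Shallow student: one linear layer trained by gradient descent to
# match the teacher's logits under an L2 (mean squared error) mimic loss.
W_s = np.zeros((16, 10))
lr = 0.01
for _ in range(500):
    pred = X @ W_s
    grad = X.T @ (pred - teacher_logits) / len(X)  # gradient of MSE w.r.t. W_s
    W_s -= lr * grad

mse = np.mean((X @ W_s - teacher_logits) ** 2)
print(f"student-teacher logit MSE: {mse:.3f}")
```

The residual MSE stays well above zero here because a purely linear student cannot represent the teacher's nonlinearity, loosely mirroring the paper's finding that students without enough (convolutional) depth cannot match their teachers.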
