Grassmannian Packings in Neural Networks: Learning with Maximal Subspace Packings for Diversity and Anti-Sparsity

18 November 2019
Dian Ang Yap
Nicholas Roberts
Vinay Uday Prabhu

Papers citing "Grassmannian Packings in Neural Networks: Learning with Maximal Subspace Packings for Diversity and Anti-Sparsity"

5 / 5 papers shown
On Implicit Filter Level Sparsity in Convolutional Neural Networks
Dushyant Mehta, K. Kim, Christian Theobalt
29 Nov 2018

Collapse of Deep and Narrow Neural Nets
Lu Lu, Yanhui Su, George Karniadakis
15 Aug 2018

Pruning Filters for Efficient ConvNets
Hao Li, Asim Kadav, Igor Durdanovic, H. Samet, H. Graf
31 Aug 2016

Deep Compression: Compressing Deep Neural Networks with Pruning, Trained Quantization and Huffman Coding
Song Han, Huizi Mao, W. Dally
01 Oct 2015

Visualizing and Understanding Convolutional Networks
Matthew D. Zeiler, Rob Fergus
12 Nov 2013