ResearchTrend.AI

Channel redundancy and overlap in convolutional neural networks with channel-wise NNK graphs
(arXiv:2110.11400)

18 October 2021
David Bonet, Antonio Ortega, Javier Ruiz-Hidalgo, Sarath Shekkizhar
GNN

Papers citing "Channel redundancy and overlap in convolutional neural networks with channel-wise NNK graphs" (17 of 17 papers shown)

  • Channel-Wise Early Stopping without a Validation Set via NNK Polytope Interpolation
    David Bonet, Antonio Ortega, Javier Ruiz-Hidalgo, Sarath Shekkizhar
    27 Jul 2021 · 16 citations

  • Convolutional Neural Network Pruning with Structural Redundancy Reduction
    Zehao Wang, Chengcheng Li, Xiangyang Wang
    08 Apr 2021 · 159 citations · 3DPC

  • Representing Deep Neural Networks Latent Space Geometries with Graphs
    Carlos Lassance, Vincent Gripon, Antonio Ortega
    14 Nov 2020 · 15 citations · AI4CE

  • DeepNNK: Explaining deep models and their generalization using polytope interpolation
    Sarath Shekkizhar, Antonio Ortega
    20 Jul 2020 · 6 citations

  • Designing Network Design Spaces
    Ilija Radosavovic, Raj Prateek Kosaraju, Ross B. Girshick, Kaiming He, Piotr Dollár
    30 Mar 2020 · 1,682 citations · GNN

  • Orthogonal Convolutional Neural Networks
    Jiayun Wang, Yubei Chen, Rudrasis Chakraborty, Stella X. Yu
    27 Nov 2019 · 189 citations

  • Deep geometric knowledge distillation with graphs
    Carlos Lassance, Myriam Bontonou, G. B. Hacene, Vincent Gripon, Jian Tang, Antonio Ortega
    08 Nov 2019 · 39 citations

  • Dimensionality compression and expansion in Deep Neural Networks
    Stefano Recanatesi, M. Farrell, Madhu S. Advani, Timothy Moore, Guillaume Lajoie, E. Shea-Brown
    02 Jun 2019 · 73 citations

  • Intrinsic dimension of data representations in deep neural networks
    A. Ansuini, Alessandro Laio, Jakob H. Macke, D. Zoccolan
    29 May 2019 · 279 citations · AI4CE

  • EfficientNet: Rethinking Model Scaling for Convolutional Neural Networks
    Mingxing Tan, Quoc V. Le
    28 May 2019 · 18,134 citations · 3DV, MedIm

  • Dimensionality-Driven Learning with Noisy Labels
    Xingjun Ma, Yisen Wang, Michael E. Houle, Shuo Zhou, S. Erfani, Shutao Xia, S. Wijewickrema, James Bailey
    07 Jun 2018 · 433 citations · NoLa

  • Laplacian Networks: Bounding Indicator Function Smoothness for Neural Network Robustness
    Carlos Lassance, Vincent Gripon, Antonio Ortega
    24 May 2018 · 16 citations · AAML

  • Characterizing Adversarial Subspaces Using Local Intrinsic Dimensionality
    Xingjun Ma, Yue Liu, Yisen Wang, S. Erfani, S. Wijewickrema, Grant Schoenebeck, D. Song, Michael E. Houle, James Bailey
    08 Jan 2018 · 739 citations · AAML

  • Channel Pruning for Accelerating Very Deep Neural Networks
    Yihui He, Xiangyu Zhang, Jian Sun
    19 Jul 2017 · 2,525 citations

  • Regularizing CNNs with Locally Constrained Decorrelations
    Pau Rodríguez López, Jordi Gonzalez, Guillem Cucurull, J. M. Gonfaus, F. X. Roca
    07 Nov 2016 · 133 citations

  • Understanding and Improving Convolutional Neural Networks via Concatenated Rectified Linear Units
    Wenling Shang, Kihyuk Sohn, Diogo Almeida, Honglak Lee
    16 Mar 2016 · 505 citations

  • Speeding up Convolutional Neural Networks with Low Rank Expansions
    Max Jaderberg, Andrea Vedaldi, Andrew Zisserman
    15 May 2014 · 1,462 citations