How deep is deep enough? -- Quantifying class separability in the hidden layers of deep neural networks

5 November 2018
Junhong Lin, C. Metzner, Andreas K. Maier, Volkan Cevher, Holger Schulze, Patrick Krauss
arXiv:1811.01753

Papers citing "How deep is deep enough? -- Quantifying class separability in the hidden layers of deep neural networks"

4 papers shown

SCHEME: Scalable Channel Mixer for Vision Transformers
Deepak Sridhar, Yunsheng Li, Nuno Vasconcelos
01 Dec 2023

Decision support from financial disclosures with deep neural networks and transfer learning
Mathias Kraus, Stefan Feuerriegel
11 Oct 2017

Fashion-MNIST: a Novel Image Dataset for Benchmarking Machine Learning Algorithms
Han Xiao, Kashif Rasul, Roland Vollgraf
25 Aug 2017

Learning Transferable Architectures for Scalable Image Recognition
Barret Zoph, Vijay Vasudevan, Jonathon Shlens, Quoc V. Le
21 Jul 2017