Feature Space Saturation during Training
Mats L. Richter, Justin Shenk, Wolf Byttner, Anders Arpteg, Mikael Huss
15 June 2020 · arXiv:2006.08679

Papers citing "Feature Space Saturation during Training"

1. Receptive Field Refinement for Convolutional Neural Networks Reliably Improves Predictive Performance
   Mats L. Richter, C. Pal
   26 Nov 2022

2. Should You Go Deeper? Optimizing Convolutional Neural Network Architectures without Training by Receptive Field Analysis
   Mats L. Richter, Julius Schöning, Anna Wiedenroth, U. Krumnack
   23 Jun 2021

3. Exploring the Properties and Evolution of Neural Network Eigenspaces during Training
   Mats L. Richter, Leila Malihi, Anne-Kathrin Patricia Windler, U. Krumnack
   17 Jun 2021

4. Size Matters
   Mats L. Richter, Wolf Byttner, U. Krumnack, Ludwig Schallner, Justin Shenk
   02 Feb 2021