arXiv:2106.12307
Should You Go Deeper? Optimizing Convolutional Neural Network Architectures without Training by Receptive Field Analysis
23 June 2021
Mats L. Richter, Julius Schöning, Anna Wiedenroth, U. Krumnack

Papers citing "Should You Go Deeper? Optimizing Convolutional Neural Network Architectures without Training by Receptive Field Analysis" (7 papers shown)

Convolution and Attention-Free Mamba-based Cardiac Image Segmentation [Mamba]
Abbas Khan, Muhammad Asad, Martin Benning, C. Roney, Gregory Slabaugh
09 Jun 2024

Your Image is My Video: Reshaping the Receptive Field via Image-To-Video Differentiable AutoAugmentation and Fusion
S. Casarin, C. Ugwu, Sergio Escalera, Oswald Lanz
22 Mar 2024

Wuerstchen: An Efficient Architecture for Large-Scale Text-to-Image Diffusion Models [DiffM, VLM]
Pablo Pernias, Dominic Rampas, Mats L. Richter, Christopher Pal, Marc Aubreville
01 Jun 2023

AI-in-the-Loop -- The impact of HMI in AI-based Application
Julius Schöning, C. Westerkamp
21 Mar 2023

Receptive Field Refinement for Convolutional Neural Networks Reliably Improves Predictive Performance
Mats L. Richter, C. Pal
26 Nov 2022

AI for Closed-Loop Control Systems -- New Opportunities for Modeling, Designing, and Tuning Control Systems [AI4CE]
Julius Schöning, A. Riechmann, H. Pfisterer
18 Jan 2022

Feature Space Saturation during Training [FAtt]
Mats L. Richter, Justin Shenk, Wolf Byttner, Anders Arpteg, Mikael Huss
15 Jun 2020