arXiv:2204.14126
Wide and Deep Neural Networks Achieve Optimality for Classification

29 April 2022
Adityanarayanan Radhakrishnan, M. Belkin, Caroline Uhler
Papers citing "Wide and Deep Neural Networks Achieve Optimality for Classification"

10 / 10 papers shown
Deep Minimax Classifiers for Imbalanced Datasets with a Small Number of Minority Samples
Hansung Choi, Daewon Seo
24 Feb 2025

Overfitting Regimes of Nadaraya-Watson Interpolators
Daniel Barzilai, Guy Kornowski, Ohad Shamir
11 Feb 2025

Generalization bounds for regression and classification on adaptive covering input domains
Wen-Liang Hwang
29 Jul 2024

Classifying Overlapping Gaussian Mixtures in High Dimensions: From Optimal Classifiers to Neural Nets
Khen Cohen, Noam Levi, Yaron Oz
28 May 2024

A prediction rigidity formalism for low-cost uncertainties in trained neural networks
Filippo Bigi, Sanggyu Chong, Michele Ceriotti, Federico Grasselli
04 Mar 2024

Universal Consistency of Wide and Deep ReLU Neural Networks and Minimax Optimal Convergence Rates for Kolmogorov-Donoho Optimal Function Classes
Hyunouk Ko, Xiaoming Huo
08 Jan 2024

From Complexity to Clarity: Analytical Expressions of Deep Neural Network Weights via Clifford's Geometric Algebra and Convexity
Mert Pilanci
28 Sep 2023

Can predictive models be used for causal inference?
Maximilian Pichler, F. Hartig
18 Jun 2023

ReLU soothes the NTK condition number and accelerates optimization for wide neural networks
Chaoyue Liu, Like Hui
15 May 2023

Ensemble Multi-Quantiles: Adaptively Flexible Distribution Prediction for Uncertainty Quantification
Xing Yan, Yonghua Su, Wenxuan Ma
26 Nov 2022