ResearchTrend.AI
CovNet: Covariance Networks for Functional Data on Multidimensional Domains
Soham Sarkar, V. Panaretos
11 April 2021

Papers citing "CovNet: Covariance Networks for Functional Data on Multidimensional Domains" (16 papers)
  1. Estimation of the Mean Function of Functional Data via Deep Neural Networks. Shuoyang Wang, Guanqun Cao, Zuofeng Shang. 08 Dec 2020.
  2. Approximating smooth functions by deep neural networks with sigmoid activation function. S. Langer. 08 Oct 2020.
  3. Quantifying deviations from separability in space-time functional processes. Holger Dette, Gauthier Dierickx, T. Kutta. 26 Mar 2020.
  4. Smooth function approximation by deep neural networks with general activation functions. Ilsang Ohn, Yongdai Kim. 17 Jun 2019.
  5. Simultaneous Confidence Bands for Functional Data Using the Gaussian Kinematic Formula. F. Telschow, Armin Schwartzman. 18 Jan 2019.
  6. A Test for Separability in Covariance Operators of Random Surfaces. Pramita Bagchi, Holger Dette. 23 Oct 2017.
  7. Nonparametric regression using deep neural networks with ReLU activation function. Johannes Schmidt-Hieber. 22 Aug 2017.
  8. A representation theorem for stochastic processes with separable covariance functions, and its implications for emulation. J. Rougier. 18 Feb 2017.
  9. Why and When Can Deep -- but Not Shallow -- Networks Avoid the Curse of Dimensionality: a Review. T. Poggio, H. Mhaskar, Lorenzo Rosasco, Brando Miranda, Q. Liao. 02 Nov 2016.
  10. Why Deep Neural Networks for Function Approximation? Shiyu Liang, R. Srikant. 13 Oct 2016.
  11. The Power of Depth for Feedforward Neural Networks. Ronen Eldan, Ohad Shamir. 12 Dec 2015.
  12. Tests for separability in nonparametric covariance operators of random surfaces. J. Aston, D. Pigoli, Shahin Tavakoli. 08 May 2015.
  13. Automatic differentiation in machine learning: a survey. A. G. Baydin, Barak A. Pearlmutter, Alexey Radul, J. Siskind. 20 Feb 2015.
  14. Adam: A Method for Stochastic Optimization. Diederik P. Kingma, Jimmy Ba. 22 Dec 2014.
  15. The Loss Surfaces of Multilayer Networks. A. Choromańska, Mikael Henaff, Michaël Mathieu, Gerard Ben Arous, Yann LeCun. 30 Nov 2014.
  16. Practical recommendations for gradient-based training of deep architectures. Yoshua Bengio. 24 Jun 2012.