This paper quantitatively characterizes the approximation power of deep
feed-forward neural networks (FNNs) in terms of the number of neurons. It is
shown by construction that ReLU FNNs with width $\mathcal{O}\big(\max\{d\lfloor N^{1/d}\rfloor,\,N+1\}\big)$ and depth $\mathcal{O}(L)$ can approximate an arbitrary H\"older continuous function of order $\alpha\in(0,1]$ on $[0,1]^d$ with a nearly tight approximation rate $\mathcal{O}\big(\sqrt{d}\,N^{-2\alpha/d}L^{-2\alpha/d}\big)$ measured in the $L^p$-norm for any $N,L\in\mathbb{N}^+$ and $p\in[1,\infty]$. More generally, for an arbitrary continuous function $f$ on $[0,1]^d$ with a modulus of continuity $\omega_f(\cdot)$, the constructive approximation rate is $\mathcal{O}\big(\sqrt{d}\,\omega_f(N^{-2/d}L^{-2/d})\big)$.
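For instance, assuming the standard Euclidean modulus of continuity $\omega_f(r):=\sup\{|f(\mathbf{x})-f(\mathbf{y})|:\|\mathbf{x}-\mathbf{y}\|_2\le r,\ \mathbf{x},\mathbf{y}\in[0,1]^d\}$, a H\"older continuous $f$ of order $\alpha$ with constant $\lambda$ satisfies $\omega_f(r)\le\lambda r^\alpha$, so the general rate specializes to
\[
\sqrt{d}\,\omega_f\big(N^{-2/d}L^{-2/d}\big)\le \lambda\sqrt{d}\,N^{-2\alpha/d}L^{-2\alpha/d},
\]
recovering the H\"older rate above.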
We also extend our analysis to the case when $f$ is defined on an irregular domain or on a domain localized in an $\varepsilon$-neighborhood of a $d_{\mathcal{M}}$-dimensional smooth manifold $\mathcal{M}\subseteq[0,1]^d$ with $d_{\mathcal{M}}\ll d$. In particular, in the case of an essentially low-dimensional domain, we show an approximation rate $\mathcal{O}\big(\omega_f\big(\tfrac{\varepsilon}{1-\delta}\sqrt{\tfrac{d}{d_\delta}}+\varepsilon\big)+\sqrt{d}\,\omega_f\big(\tfrac{\sqrt{d}}{(1-\delta)\sqrt{d_\delta}}\,N^{-2/d_\delta}L^{-2/d_\delta}\big)\big)$ for ReLU FNNs to approximate $f$ in the $\varepsilon$-neighborhood, where $d_\delta=\mathcal{O}\big(d_{\mathcal{M}}\tfrac{\ln(d/\delta)}{\delta^2}\big)$ and $\delta\in(0,1)$ is the relative error with which a projection of $\mathcal{M}$ onto a $d_\delta$-dimensional domain approximates an isometry.
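To illustrate the scaling (a back-of-the-envelope instance with hypothetical values, not a result from the paper): for $d_{\mathcal{M}}=2$, $d=10^4$, and $\delta=1/2$, the bound gives
\[
d_\delta\le C\cdot 2\cdot\frac{\ln(2\times 10^4)}{(1/2)^2}\approx 79\,C
\]
for an absolute constant $C$, so the exponent in $N^{-2/d_\delta}L^{-2/d_\delta}$ is governed by the intrinsic dimension $d_{\mathcal{M}}$ (up to a $\ln d$ factor) rather than by the ambient dimension $d$ itself.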