In deep learning (DL) the instability phenomenon is widespread and well documented, most commonly using the classical measure of stability, the Lipschitz constant. While a small Lipschitz constant is traditionally viewed as guaranteeing stability, it does not capture the instability phenomenon in DL for classification well. The reason is that a classification function -- which is the target function to be approximated -- is necessarily discontinuous, and thus has an 'infinite' Lipschitz constant. As a result, the classical approach will deem every classification function unstable, yet basic classification functions à la 'is there a cat in the image?' will typically be locally very 'flat' -- and thus locally stable -- except at the decision boundary. The lack of an appropriate measure of stability hinders a rigorous theory of stability in DL, and consequently there are no proper approximation-theoretic results that can guarantee the existence of stable networks for classification functions. In this paper we introduce a novel stability measure $S(f)$, defined for any classification function $f$, that is appropriate for studying the stability of discontinuous functions and their approximations. We further prove two approximation theorems. First, for any $\epsilon > 0$ and any classification function $f$ on a \emph{compact set}, there is a neural network (NN) $\psi$ such that $\psi \neq f$ only on a set of measure less than $\epsilon$; moreover, $\psi$ is as accurate and stable as $f$ up to $\epsilon$. Second, for any classification function $f$ and any $\epsilon > 0$, there exists a NN $\psi$ such that $\psi = f$ on the set of points that are at least $\epsilon$ away from the decision boundary.
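As a concrete instance of the point about discontinuity versus local flatness, consider the following one-dimensional toy example (our own illustration, not taken from the paper; the function $f$ and the threshold at $0$ are chosen purely for exposition):

\[
  f(x) =
  \begin{cases}
    1, & x \ge 0,\\
    0, & x < 0,
  \end{cases}
  \qquad
  \operatorname{Lip}(f) = \sup_{x \neq y} \frac{|f(x)-f(y)|}{|x-y|} = \infty,
\]
since $|f(\delta) - f(-\delta)|/|2\delta| = 1/(2\delta) \to \infty$ as $\delta \to 0^{+}$. Nevertheless, for every $\epsilon > 0$ the restriction of $f$ to $\{x : |x| \ge \epsilon\}$ is locally constant, so $f$ is perfectly stable at every point at distance at least $\epsilon$ from the decision boundary $\{0\}$ -- exactly the behaviour the classical Lipschitz-based view fails to capture.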