This paper studies an asymptotic framework for conducting inference on parameters of the form $\phi(\theta_0)$, where $\phi$ is a known directionally differentiable function and $\theta_0$ is estimated by $\hat\theta_n$. In these settings, the asymptotic distribution of the plug-in estimator $\phi(\hat\theta_n)$ can be readily derived employing existing extensions to the Delta method. We show, however, that the "standard" bootstrap is only consistent under overly stringent conditions -- in particular we establish that differentiability of $\phi$ is a necessary and sufficient condition for bootstrap consistency whenever the limiting distribution of $\sqrt{n}(\hat\theta_n - \theta_0)$ is Gaussian. An alternative resampling scheme is proposed which remains consistent when the bootstrap fails, and is shown to provide local size control under restrictions on the directional derivative of $\phi$. We illustrate the utility of our results by developing a test of whether a Hilbert space valued parameter belongs to a convex set -- a setting that includes moment inequality problems and certain tests of shape restrictions as special cases.
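The bootstrap failure described above can be illustrated with a minimal simulation. The sketch below (an illustration, not the paper's exact construction) takes the canonical non-differentiable map $\phi(t) = |t|$ at $\theta_0 = 0$: the standard bootstrap statistic $\sqrt{n}(\phi(\hat\theta_n^*) - \phi(\hat\theta_n))$ fails to mimic the true limit $|Z|$, while a derivative-based resampling scheme -- here implemented via numerical differentiation with a step $\kappa_n$ satisfying $\kappa_n \to 0$ and $\sqrt{n}\,\kappa_n \to \infty$, one common way to estimate the directional derivative -- is applied to the centered bootstrap draws instead. The step choice $\kappa_n = n^{-1/3}$ is an assumption made for this example.

```python
import numpy as np

rng = np.random.default_rng(0)

n, B = 200, 2000
theta0 = 0.0
x = rng.normal(theta0, 1.0, size=n)   # sample with unknown mean theta0
theta_hat = x.mean()

phi = np.abs                          # phi(t) = |t|, kinked at t = 0

# Standard bootstrap: sqrt(n) * (phi(theta*_b) - phi(theta_hat)).
boot = np.empty(B)
for b in range(B):
    xb = rng.choice(x, size=n, replace=True)
    boot[b] = np.sqrt(n) * (phi(xb.mean()) - phi(theta_hat))

# Derivative-based scheme: estimate the directional derivative of phi at
# theta_hat numerically with step kappa_n = n**(-1/3) (so kappa_n -> 0 and
# sqrt(n)*kappa_n -> infinity), applied to centered bootstrap draws.
kappa = n ** (-1.0 / 3.0)
deriv = np.empty(B)
for b in range(B):
    xb = rng.choice(x, size=n, replace=True)
    h = np.sqrt(n) * (xb.mean() - theta_hat)   # centered bootstrap draw
    deriv[b] = (phi(theta_hat + kappa * h) - phi(theta_hat)) / kappa

# True limit of sqrt(n)(phi(theta_hat) - phi(theta0)) at theta0 = 0 is |Z|.
z = np.abs(rng.normal(size=100_000))
print("95% quantiles  bootstrap:", np.quantile(boot, 0.95),
      " derivative-based:", np.quantile(deriv, 0.95),
      " true |Z|:", np.quantile(z, 0.95))
```

Comparing the printed quantiles for several seeds shows the standard bootstrap quantile drifting with the realized sample (its conditional limit depends on the random localization of $\hat\theta_n$ around the kink), while the derivative-based quantile tracks that of $|Z|$.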