This paper studies an asymptotic framework for conducting inference on parameters of the form φ(θ), where φ is a known directionally differentiable function and θ is estimated by θn. In these settings, the asymptotic distribution of the plug-in estimator φ(θn) can readily be derived by employing existing extensions of the Delta method. We show, however, that (full) differentiability of φ is a necessary and sufficient condition for bootstrap consistency whenever the limiting distribution of θn is Gaussian. An alternative resampling scheme is proposed which remains consistent when the bootstrap fails, and is shown to provide local size control under restrictions on the directional derivative of φ. We illustrate the utility of our results by developing a test of whether a Hilbert space-valued parameter belongs to a convex set – a setting that includes moment inequality problems, tests of random utility models, and certain tests of shape restrictions as special cases (e.g. tests of monotonicity of the pricing kernel or of parametric conditional quantile model specifications).
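A minimal numerical sketch (not from the paper, all names our own) of the canonical phenomenon the abstract describes: for the directionally but not fully differentiable map φ(θ) = max(θ, 0) at the kink θ0 = 0, the root √n(φ(θn) − φ(θ0)) converges to max(Z, 0) with Z standard normal, which places mass one half at zero; the standard nonparametric bootstrap root, computed conditional on a single sample, need not reproduce this limit.

```python
import numpy as np

rng = np.random.default_rng(0)


def phi(t):
    # Directionally differentiable everywhere, but not (fully)
    # differentiable at t = 0 -- the kink driving bootstrap failure.
    return np.maximum(t, 0.0)


n, B = 200, 2000  # sample size and number of Monte Carlo / bootstrap draws

# Sampling distribution of the root sqrt(n) * (phi(theta_n) - phi(theta0))
# with theta0 = 0 and theta_n the sample mean of N(0, 1) data.
reps = rng.standard_normal((B, n))
roots = np.sqrt(n) * phi(reps.mean(axis=1))

# Standard nonparametric bootstrap root, conditional on one fixed sample.
x = rng.standard_normal(n)
theta_n = x.mean()
idx = rng.integers(0, n, size=(B, n))
boot = np.sqrt(n) * (phi(x[idx].mean(axis=1)) - phi(theta_n))

# The limit max(Z, 0) puts probability 1/2 on exactly zero; the bootstrap
# distribution of the root generally does not match this.
print("mass at zero (true root):     ", np.mean(roots == 0.0))
print("mass at zero (bootstrap root):", np.mean(boot == 0.0))
```

The point of the comparison is that the first printed fraction settles near 1/2 as predicted by the max(Z, 0) limit, while the bootstrap fraction depends on the realized value of θn and therefore fails to converge to the correct limit, consistent with the paper's necessity result.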