TALK POSTPONED! (The new date will be announced later.)
We derive optimal rates of convergence in the supremum norm for estimating the Hölder-smooth mean function as well as the covariance kernel of a stochastic process that is observed repeatedly and discretely, with additional errors, at fixed, synchronous design points, the typical scenario for machine-recorded functional data.
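For orientation, the sampling scheme for such data is commonly written as follows (a sketch in our own notation, with n curves, p fixed design points x_1, ..., x_p, latent processes Z_i and centered measurement errors ε_{ij}; the paper's exact assumptions may differ):

\[
    Y_{ij} = \mu(x_j) + Z_i(x_j) + \varepsilon_{ij},
    \qquad i = 1, \dots, n, \quad j = 1, \dots, p,
\]

where \mu is the mean function and \Gamma(s,t) = \mathrm{Cov}\big(Z_i(s), Z_i(t)\big) the covariance kernel to be estimated.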
As with the optimal L_2 rates for the mean function obtained by Cai and Yuan (2011), a discretization term dominates for sparse designs, while in the dense case the parametric root-n rate can be achieved, as if the n processes were observed continuously and without errors.
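Concretely, for a common design with p points per curve and smoothness parameter \beta, the L_2 result of Cai and Yuan (2011) can be summarized schematically as (our paraphrase, with constants and regularity conditions suppressed):

\[
    \mathbb{E}\,\| \hat\mu - \mu \|_{L_2}^2 \;\asymp\; p^{-2\beta} + n^{-1},
\]

so that the discretization term p^{-2\beta} dominates for sparse designs, while p of order n^{1/(2\beta)} or larger already yields the parametric rate.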
The supremum norm is of practical interest since it corresponds to the visualization of the estimation error and forms the basis for the construction of uniform confidence bands. We show that, in contrast to the analysis in L_2, there is an intermediate regime between the sparse and dense cases that is dominated by the contribution of the observation errors. Furthermore, in the supremum norm, interpolation estimators for the mean, which suffice in L_2, turn out to be sub-optimal in the dense setting, which helps to explain their poor empirical performance.
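Schematically, such a trichotomy emerges when the sup-norm risk combines a discretization term, a smoothing term driven by the observation errors, and a parametric term, e.g. of the form (purely illustrative, in our notation; the precise rates and regime boundaries are stated in the paper):

\[
    p^{-\beta} \;+\; \Big( \tfrac{\log n}{n p} \Big)^{\beta/(2\beta+1)} \;+\; n^{-1/2},
\]

with the middle, error-driven term dominating in the intermediate regime.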
For the covariance kernel, we devise estimators that exploit higher-order smoothness away from the diagonal without requiring the same smoothness on the diagonal, and which are therefore able to cover processes with relatively rough sample paths.
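To make the off-diagonal idea concrete, the following minimal Python sketch implements the classical device of discarding the diagonal raw covariances, which are contaminated by the error variance, before smoothing (a simple Nadaraya-Watson smoother is used here for illustration only; the estimators in the paper are more refined):

import numpy as np

def raw_covariances(Y, mu_hat):
    """Empirical covariances C[j, k] = mean_i (Y_ij - mu_j)(Y_ik - mu_k)."""
    R = Y - mu_hat                # residual curves, shape (n, p)
    return R.T @ R / Y.shape[0]

def smooth_off_diagonal(C, x, s, t, h):
    """Nadaraya-Watson estimate of Gamma(s, t) from off-diagonal raw
    covariances C[j, k], j != k, with Gaussian kernel and bandwidth h."""
    p = len(x)
    num, den = 0.0, 0.0
    for j in range(p):
        for k in range(p):
            if j == k:
                continue          # skip diagonal: contaminated by Var(eps)
            w = np.exp(-((x[j] - s) ** 2 + (x[k] - t) ** 2) / (2 * h ** 2))
            num += w * C[j, k]
            den += w
    return num / den

Because the diagonal entries C[j, j] include the error variance, including them would bias the fitted kernel upward near the diagonal; dropping them avoids this without assuming extra smoothness of the kernel on the diagonal.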
We also obtain central limit theorems in the supremum norm, and provide simulations and real-data applications to illustrate our results.
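A toy simulation in this spirit is easy to set up (all parameter choices below are hypothetical and only illustrate the sampling scheme; Brownian motion serves as an example of a process with relatively rough sample paths):

import numpy as np

rng = np.random.default_rng(0)
n, p, sigma = 200, 50, 0.5          # curves, design points, error sd
x = np.linspace(0, 1, p)            # fixed synchronous design
mu = np.sin(2 * np.pi * x)          # smooth mean function

# Brownian motion: sample paths of Hoelder smoothness just below 1/2
Z = np.cumsum(rng.normal(0, 1 / np.sqrt(p), (n, p)), axis=1)
Y = mu + Z + rng.normal(0, sigma, (n, p))   # observations Y_ij

mu_bar = Y.mean(axis=0)             # pointwise average over curves
print("sup-norm error of pointwise average:", np.abs(mu_bar - mu).max())

Varying n and p in such a setup lets one explore the different regimes empirically.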
Underlying paper: https://arxiv.org/abs/2306.04550
Personal website of Hajo Holzmann