Variational Estimators in Statistical Multiscale Analysis
by Housen Li
Date of Examination: 2016-02-17
Date of issue: 2016-05-24
Advisor: Prof. Dr. Axel Munk
Referee: Prof. Dr. Axel Munk
Referee: Prof. Dr. Markus Haltmeier
Referee: Dr. Timo Aspelmeier
Referee: Prof. Dr. Dorothea Bahns
Referee: Prof. Dr. Tatyana Krivobokova
Referee: Prof. Dr. Max Wardetzky
Abstract
In recent years, a novel type of multiscale variational statistical approach, based on so-called multiscale statistics, has gained increasing popularity in various applications, such as signal recovery, imaging, and image processing, mainly because such methods in general perform uniformly well over a range of different scales (i.e. sizes of features). In contrast, the underlying statistical theory for these methods is still incomplete, in particular with regard to their asymptotic convergence behavior. To narrow this gap, we propose and analyze a constrained variational approach, which we call the MultIscale Nemirovski-Dantzig (MIND) estimator, for recovering smooth functions in the settings of nonparametric regression and statistical inverse problems. It can be viewed as a multiscale extension of the Dantzig selector (Ann. Statist., 35(6):2313–2351, 2007) based on early ideas of Nemirovski (J. Comput. System Sci., 23:1–11, 1986). To be precise, MIND minimizes a homogeneous Sobolev norm under the constraint that the multiresolution norm of the residual is bounded by a universal threshold. The main contribution of this work is the derivation of convergence rates of MIND, both almost surely and in expectation, for nonparametric regression and linear statistical inverse problems. To this end, we generalize Nemirovski's interpolation inequality between the multiresolution norm and Sobolev norms, and introduce the method of approximate source conditions into our statistical setting. Based on these tools, we obtain convergence rates under abstract smoothness assumptions on the truth. For a one-dimensional signal, such assumptions can be translated into classical smoothness classes and source sets by means of the approximation properties of B-splines. As a consequence, MIND attains almost minimax optimal rates simultaneously over a large range of Sobolev and Besov classes, for nonparametric regression of functions and their derivatives. Analogous results have also been obtained for certain linear statistical inverse problems, such as deconvolution when the Fourier coefficients of the convolution kernel decay polynomially. Put differently, these results reveal that MIND adapts, to a certain extent, to the smoothness of the underlying true signal. In parallel, we present a similar analysis for a penalized version of MIND, together with a parameter choice via the Lepskii balancing principle. Finally, complementary to the asymptotic analysis, we examine the finite sample performance of MIND in various numerical simulations.
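Schematically, and in generic notation not fixed in the abstract itself (k for the smoothness order, K for the forward operator, which reduces to the identity in the regression setting, ‖·‖_B for the multiresolution norm, and γ for the universal threshold), the constrained problem described above can be sketched as
\[
\hat{f} \;\in\; \operatorname*{arg\,min}_{f}\; \lVert D^{k} f \rVert_{L^{2}}
\quad\text{subject to}\quad
\lVert y - K f \rVert_{\mathcal{B}} \;\le\; \gamma,
\]
where y denotes the observed data; the precise definitions of these quantities are given in the thesis.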
Keywords: nonparametric regression; statistical inverse problems; adaptation; multiresolution norm; convergence rates; approximate source conditions