
Variational Estimators in Statistical Multiscale Analysis

dc.contributor.advisor       Munk, Axel Prof. Dr.
dc.contributor.author        Li, Housen
dc.date.accessioned          2016-05-24T08:51:41Z
dc.date.available            2016-05-24T08:51:41Z
dc.date.issued               2016-05-24
dc.identifier.uri            http://hdl.handle.net/11858/00-1735-0000-0028-875F-F
dc.identifier.uri            http://dx.doi.org/10.53846/goediss-5657
dc.language.iso              eng  de
dc.rights.uri                http://creativecommons.org/licenses/by-nc-nd/4.0/
dc.subject.ddc               510  de
dc.title                     Variational Estimators in Statistical Multiscale Analysis  de
dc.type                      doctoralThesis  de
dc.contributor.referee       Munk, Axel Prof. Dr.
dc.date.examination          2016-02-17
dc.description.abstracteng   In recent years, a novel class of variational multiscale statistical approaches, based on so-called multiscale statistics, has gained increasing popularity in applications such as signal recovery, imaging and image processing, mainly because these methods generally perform uniformly well over a range of different scales (i.e. sizes of features). The underlying statistical theory for these methods, however, is still incomplete, in particular with regard to their asymptotic convergence behavior. To narrow this gap, we propose and analyze a constrained variational approach, the MultIscale Nemirovski-Dantzig (MIND) estimator, for recovering smooth functions in nonparametric regression and statistical inverse problems. It can be viewed as a multiscale extension of the Dantzig selector (Ann. Statist., 35(6):2313–2351, 2007), building on early ideas of Nemirovski (J. Comput. System Sci., 23:1–11, 1986). To be precise, MIND minimizes a homogeneous Sobolev norm under the constraint that the multiresolution norm of the residual is bounded by a universal threshold (a schematic formulation is sketched below, after this record). The main contribution of this work is the derivation of convergence rates for MIND, both almost surely and in expectation, for nonparametric regression and linear statistical inverse problems. To this end, we generalize Nemirovski's interpolation inequality between the multiresolution norm and Sobolev norms, and introduce the method of approximate source conditions into our statistical setting. Based on these tools, we obtain convergence rates under abstract smoothness assumptions on the truth. For a one-dimensional signal, such assumptions can be translated into classical smoothness classes and source sets by means of the approximation properties of B-splines. As a consequence, MIND attains nearly minimax optimal rates simultaneously over a large range of Sobolev and Besov classes, for nonparametric regression of functions and of their derivatives. Analogous results are obtained for certain linear statistical inverse problems, such as deconvolution when the Fourier coefficients of the convolution kernel decay polynomially. Put differently, these results reveal that MIND adapts, to a certain extent, to the smoothness of the underlying true signal. In parallel, we present a similar analysis for a penalized version of MIND, together with a parameter choice via the Lepskii balancing principle. Finally, complementary to the asymptotic analysis, we examine the finite-sample performance of MIND in various numerical simulations.  de
dc.contributor.coReferee     Haltmeier, Markus Prof. Dr.
dc.contributor.thirdReferee  Aspelmeier, Timo Dr.
dc.contributor.thirdReferee  Bahns, Dorothea Prof. Dr.
dc.contributor.thirdReferee  Krivobokova, Tatyana Prof. Dr.
dc.contributor.thirdReferee  Wardetzky, Max Prof. Dr.
dc.subject.eng               nonparametric regression  de
dc.subject.eng               statistical inverse problems  de
dc.subject.eng               adaptation  de
dc.subject.eng               multiresolution norm  de
dc.subject.eng               convergence rates  de
dc.subject.eng               approximate source conditions  de
dc.identifier.urn            urn:nbn:de:gbv:7-11858/00-1735-0000-0028-875F-F-4
dc.affiliation.institute     Fakultät für Mathematik und Informatik  de
dc.subject.gokfull           Mathematik (PPN61756535X)  de
dc.identifier.ppn            859767930
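
As referenced in the abstract, the following is a schematic formulation of the MIND estimator, included here only to fix ideas; the notation (in particular the forward operator K, the multiresolution norm and the threshold gamma_n) is chosen for illustration and need not match the symbols used in the thesis.

\[
  \hat{f}_{\mathrm{MIND}} \in \operatorname*{arg\,min}_{f} \; \| f \|_{\dot{H}^{k}}
  \quad \text{subject to} \quad
  \| Y - K f \|_{\mathrm{MR}} \le \gamma_n ,
\]

where \(Y\) denotes the observed data, \(K\) the forward operator of the statistical inverse problem (the identity in nonparametric regression), \(\|\cdot\|_{\dot{H}^{k}}\) a homogeneous Sobolev norm, \(\|\cdot\|_{\mathrm{MR}}\) the multiresolution norm of the residual, and \(\gamma_n\) a universal threshold.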

