Convergence rates for variational regularization of inverse problems in exponential families
by Simayi Yusufu
Date of Examination: 2019-09-12
Date of issue: 2020-07-16
Advisor: Prof. Dr. Thorsten Hohage
Referee: Prof. Dr. Thorsten Hohage
Referee: Prof. Dr. Axel Munk
Abstract
We consider inverse problems with statistical noise. Using regularization methods, one can approximate the true solution of the inverse problem by a regularized solution. Previously established convergence rates for variational regularization with Poisson and empirical process data are shown to be suboptimal. In this thesis we obtain improved convergence rates for variational regularization of nonlinear ill-posed inverse problems under stochastic noise models described by exponential families, and we derive sharper reconstruction error bounds by applying deviation inequalities for stochastic processes in suitable function spaces. Furthermore, we consider the iteratively regularized Newton method as an alternative when the operator is nonlinear. Due to the difficulty of deriving suitable deviation inequalities for stochastic processes in certain function spaces, we are currently unable to prove optimal convergence rates for variational regularization; we therefore state the desired result as a conjecture, from which the optimal rates would follow immediately.
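To illustrate the basic idea of variational regularization referred to above, here is a minimal numerical sketch. It uses a linear smoothing operator, additive Gaussian noise, and a classical Tikhonov penalty; these are illustrative assumptions only, not the thesis's actual setting (which treats nonlinear operators and exponential-family noise such as Poisson data). The operator `A`, the true solution, and the parameter `alpha` are all hypothetical choices for the sketch.

```python
import numpy as np

# Ill-posed linear problem A u = g: A is a discrete Gaussian convolution
# (smoothing), so inverting it amplifies high-frequency noise.
rng = np.random.default_rng(0)
n = 50
x = np.linspace(0.0, 1.0, n)
A = np.exp(-((x[:, None] - x[None, :]) ** 2) / 0.005)
A /= A.sum(axis=1, keepdims=True)  # normalize rows

u_true = np.sin(2 * np.pi * x)                        # true solution (assumed)
g_obs = A @ u_true + 0.01 * rng.standard_normal(n)    # noisy observation

def tikhonov(A, g, alpha):
    """Variational (Tikhonov) regularization:
    minimize ||A u - g||^2 + alpha * ||u||^2, solved via the normal
    equations (A^T A + alpha I) u = A^T g."""
    m = A.shape[1]
    return np.linalg.solve(A.T @ A + alpha * np.eye(m), A.T @ g)

u_alpha = tikhonov(A, g_obs, alpha=1e-3)   # regularized solution
u_naive = np.linalg.solve(A, g_obs)        # unregularized inversion

err_reg = np.linalg.norm(u_alpha - u_true)
err_naive = np.linalg.norm(u_naive - u_true)
```

With a well-chosen `alpha`, the regularized reconstruction error `err_reg` is far smaller than the naive inversion error `err_naive`, which blows up due to the ill-conditioning of `A`; the convergence-rate results of the thesis quantify how fast such errors decay as the noise level decreases.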
Keywords: Convergence rates; Variational regularization; Empirical process data; Poisson data; Gaussian white noise; Iteratively regularized Newton-type method; Inverse problems