
Regularization Theory for Ill-posed Problems : Selected Topics.
Title:
Regularization Theory for Ill-posed Problems : Selected Topics.
Author:
Lu, Shuai.
ISBN:
9783110286496
Physical Description:
1 online resource (289 pages)
Series:
Inverse and Ill-Posed Problems Series ; v.58
Inverse and Ill-Posed Problems Series
Contents:
Preface -- 1 An introduction using classical examples -- 1.1 Numerical differentiation. First look at the problem of regularization. The balancing principle -- 1.1.1 Finite-difference formulae -- 1.1.2 Finite-difference formulae for nonexact data. A priori choice of the stepsize -- 1.1.3 A posteriori choice of the stepsize -- 1.1.4 Numerical illustration -- 1.1.5 The balancing principle in a general framework -- 1.2 Stable summation of orthogonal series with noisy coefficients. Deterministic and stochastic noise models. Description of smoothness properties -- 1.2.1 Summation methods -- 1.2.2 Deterministic noise model -- 1.2.3 Stochastic noise model -- 1.2.4 Smoothness associated with a basis -- 1.2.5 Approximation and stability properties of -methods -- 1.2.6 Error bounds -- 1.3 The elliptic Cauchy problem and regularization by discretization -- 1.3.1 Natural linearization of the elliptic Cauchy problem -- 1.3.2 Regularization by discretization -- 1.3.3 Application in detecting corrosion -- 2 Basics of single parameter regularization schemes -- 2.1 Simple example for motivation -- 2.2 Essentially ill-posed linear operator equations. Least-squares solution. General view on regularization -- 2.3 Smoothness in the context of the problem. Benchmark accuracy levels for deterministic and stochastic data noise models -- 2.3.1 The best possible accuracy for the deterministic noise model -- 2.3.2 The best possible accuracy for the Gaussian white noise model -- 2.4 Optimal order and the saturation of regularization methods in Hilbert spaces -- 2.5 Changing the penalty term for variance reduction. Regularization in Hilbert scales -- 2.6 Estimation of linear functionals from indirect noisy observations -- 2.7 Regularization by finite-dimensional approximation.
2.8 Model selection based on indirect observation in Gaussian white noise -- 2.8.1 Linear models given by least-squares methods -- 2.8.2 Operator monotone functions -- 2.8.3 The problem of model selection (continuation) -- 2.9 A warning example: an operator equation formulation is not always adequate (numerical differentiation revisited) -- 2.9.1 Numerical differentiation in variable Hilbert scales associated with designs -- 2.9.2 Error bounds in L2 -- 2.9.3 Adaptation to the unknown bound of the approximation error -- 2.9.4 Numerical differentiation in the space of continuous functions -- 2.9.5 Relation to the Savitzky-Golay method. Numerical examples -- 3 Multiparameter regularization -- 3.1 When do we really need multiparameter regularization? -- 3.2 Multiparameter discrepancy principle -- 3.2.1 Model function based on the multiparameter discrepancy principle -- 3.2.2 A use of the model function to approximate one set of parameters satisfying the discrepancy principle -- 3.2.3 Properties of the model function approximation -- 3.2.4 Discrepancy curve and the convergence analysis -- 3.2.5 Heuristic algorithm for the model function approximation of the multiparameter discrepancy principle -- 3.2.6 Generalization in the case of more than two regularization parameters -- 3.3 Numerical realization and testing -- 3.3.1 Numerical examples and comparison -- 3.3.2 Two-parameter discrepancy curve -- 3.3.3 A numerical check of Proposition 3.1 and use of a discrepancy curve -- 3.3.4 Experiments with three-parameter regularization -- 3.4 Two-parameter regularization with one negative parameter for problems with noisy operators and right-hand side -- 3.4.1 Computational aspects for regularized total least squares -- 3.4.2 Computational aspects for dual regularized total least squares.
3.4.3 Error bounds in the case B = I -- 3.4.4 Error bounds for B ≠ I -- 3.4.5 Numerical illustrations. Model function approximation in dual regularized total least squares -- 4 Regularization algorithms in learning theory -- 4.1 Supervised learning problem as an operator equation in a reproducing kernel Hilbert space (RKHS) -- 4.1.1 Reproducing kernel Hilbert spaces and related operators -- 4.1.2 A priori assumption on the problem: general source conditions -- 4.2 Kernel independent learning rates -- 4.2.1 Regularization for binary classification: risk bounds and Bayes consistency -- 4.3 Adaptive kernel methods using the balancing principle -- 4.3.1 Adaptive learning when the error measure is known -- 4.3.2 Adaptive learning when the error measure is unknown -- 4.3.3 Proofs of Propositions 4.6 and 4.7 -- 4.3.4 Numerical experiments. Quasibalancing principle -- 4.4 Kernel adaptive regularization with application to blood glucose reading -- 4.4.1 Reading the blood glucose level from subcutaneous electric current measurements -- 4.5 Multiparameter regularization in learning theory -- 5 Meta-learning approach to regularization - case study: blood glucose prediction -- 5.1 A brief introduction to meta-learning and blood glucose prediction -- 5.2 A traditional learning theory approach: issues and concerns -- 5.3 Meta-learning approach to choosing a kernel and a regularization parameter -- 5.3.1 Optimization operation -- 5.3.2 Heuristic operation -- 5.3.3 Learning at metalevel -- 5.4 Case-study: blood glucose prediction -- Bibliography -- Index.
Abstract:
This monograph is a valuable contribution to the highly topical and extremely productive field of regularization methods for inverse and ill-posed problems. The author is an internationally outstanding and accepted mathematician in this field. In his book he offers a well-balanced mixture of basic and innovative aspects. He demonstrates new, differentiated viewpoints, and important examples for applications. The book demonstrates the current developments in the field of regularization theory, such as multiparameter regularization and regularization in learning theory. The book is written for graduate and PhD students and researchers in mathematics, natural sciences, engineering, and medicine.
Local Note:
Electronic reproduction. Ann Arbor, Michigan : ProQuest Ebook Central, 2017. Available via World Wide Web. Access may be limited to ProQuest Ebook Central affiliated libraries.