Book contents
- Frontmatter
- Dedication
- Contents
- Preface
- Notation
- Part One Machine Learning
- Part Two Optimal Recovery
- Executive Summary
- 9 Foundational Results of Optimal Recovery
- 10 Approximability Models
- 11 Ideal Selection of Observation Schemes
- 12 Curse of Dimensionality
- 13 Quasi-Monte Carlo Integration
- Part Three Compressive Sensing
- Part Four Optimization
- Part Five Neural Networks
- Appendices
- References
- Index
9 - Foundational Results of Optimal Recovery
from Part Two - Optimal Recovery
Published online by Cambridge University Press: 21 April 2022
Summary
This chapter introduces the key concepts of optimal recovery, such as model sets, quantities of interest, intrinsic errors, and optimal recovery maps. It emphasizes two situations where recovery maps that are optimal over symmetric and convex model sets can be chosen as linear maps: the estimation of linear functionals and the Hilbert setting. Natural splines are introduced as a concrete example of the latter situation.
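As a small illustration of the Hilbert setting mentioned above: a natural cubic spline interpolant is the minimizer of the second-derivative energy $\int (f'')^2$ among all twice-differentiable functions matching the observed data, which is why it arises as an optimal recovery map. The sketch below, with hypothetical sample points and using SciPy's `CubicSpline` with `bc_type='natural'`, checks the two defining properties (interpolation and zero second derivative at the endpoints); it is an assumption-laden illustration, not the chapter's own construction.

```python
import numpy as np
from scipy.interpolate import CubicSpline

# Hypothetical observations y_i = f(x_i) of an unknown function f.
x = np.array([0.0, 1.0, 2.5, 4.0])
y = np.array([1.0, 2.0, 0.5, 1.5])

# Natural cubic spline: among all twice-differentiable interpolants of the
# data, it minimizes the energy  integral((f'')^2),  making it an optimal
# recovery map in the Hilbert-space situation sketched in this chapter.
spline = CubicSpline(x, y, bc_type='natural')

# It interpolates the data exactly...
assert np.allclose(spline(x), y)
# ...and the "natural" boundary conditions force a vanishing second
# derivative at both endpoints (second argument = derivative order).
assert abs(spline(x[0], 2)) < 1e-9 and abs(spline(x[-1], 2)) < 1e-9
```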
- Type: Chapter
- Information: Mathematical Pictures at a Data Science Exhibition, pp. 68–75
- Publisher: Cambridge University Press
- Print publication year: 2022