Book contents
- Frontmatter
- Contents
- Preface
- 1 Applications and motivations
- 2 Haar spaces and multivariate polynomials
- 3 Local polynomial reproduction
- 4 Moving least squares
- 5 Auxiliary tools from analysis and measure theory
- 6 Positive definite functions
- 7 Completely monotone functions
- 8 Conditionally positive definite functions
- 9 Compactly supported functions
- 10 Native spaces
- 11 Error estimates for radial basis function interpolation
- 12 Stability
- 13 Optimal recovery
- 14 Data structures
- 15 Numerical methods
- 16 Generalized interpolation
- 17 Interpolation on spheres and other manifolds
- References
- Index
16 - Generalized interpolation
Published online by Cambridge University Press: 22 February 2010
Summary
Up to now we have dealt only with the problem of recovering an unknown function from certain known function values. But sometimes it is desirable to recover the function from other types of data as well. For example, the function's derivatives might be known at certain points, but not the function values themselves. Such problems arise naturally when partial differential equations are considered.
In this chapter we deal with a more general problem than those we have discussed so far. Our approach includes, in particular, collocation and Galerkin methods for solving partial differential equations. At present, however, the methods we will derive can compete with classical methods only to a limited extent. In any case, whenever large data sets are involved, the methods introduced below have to be combined with the fast-evaluation ideas of Chapter 15.
In this sense, this chapter should be seen as a unified introduction to a general class of recovery problems.
Optimal recovery in Hilbert spaces
We start by generalizing results from Chapters 11 and 13. In this section we restrict ourselves, for simplicity, to the Hilbert space setting, even though everything works also in the case of semi-Hilbert function spaces.
Let H be a Hilbert space and denote its dual by H*. Suppose that Λ = {λ1, …, λN} ⊆ H* is a set of linearly independent functionals and that f1, …, fN ∈ ℝ are certain given values. Then a generalized recovery problem would seek to find a function s ∈ H such that λj(s) = fj, 1 ≤ j ≤ N. We will call s a generalized interpolant.
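To make the setting concrete, here is a minimal sketch, not from the book, of a generalized (Hermite-type) interpolation problem with a Gaussian kernel in one dimension. The functionals are point evaluations λj(f) = f(xj) plus one derivative evaluation λ(f) = f′(xd); the Gram matrix applies each functional to each argument of the kernel, and the generalized interpolant is a linear combination of the functionals applied to one kernel argument. All names, the kernel choice, and the test data are illustrative assumptions.

```python
import numpy as np

def K(x, y):            # Gaussian kernel K(x, y) = exp(-(x - y)^2)
    return np.exp(-(x - y) ** 2)

def dK_dy(x, y):        # ∂K/∂y, needed when a functional differentiates
    return 2.0 * (x - y) * K(x, y)

def dK_dx(x, y):        # ∂K/∂x
    return -2.0 * (x - y) * K(x, y)

def d2K_dxdy(x, y):     # ∂²K/∂x∂y, for the derivative-derivative Gram entry
    return (2.0 - 4.0 * (x - y) ** 2) * K(x, y)

# Illustrative data: f(x) = sin(x) known at three points,
# f'(x) = cos(x) known at one further point.
x_eval = np.array([0.0, 1.0, 2.0])
x_der = 0.5
f_vals = np.sin(x_eval)
f_der = np.cos(x_der)

# Gram matrix A[j, k] = λ_j^x λ_k^y K(x, y): each functional acts on
# one argument of the kernel.
n = len(x_eval)
A = np.zeros((n + 1, n + 1))
A[:n, :n] = K(x_eval[:, None], x_eval[None, :])
A[:n, n] = dK_dy(x_eval, x_der)
A[n, :n] = dK_dx(x_der, x_eval)
A[n, n] = d2K_dxdy(x_der, x_der)

rhs = np.append(f_vals, f_der)
alpha = np.linalg.solve(A, rhs)

# Generalized interpolant s(x) = Σ_k α_k λ_k^y K(x, y): the derivative
# functional contributes a derivative of the kernel as a basis function.
def s(x):
    return alpha[:n] @ K(x_eval, x) + alpha[n] * dK_dy(x, x_der)
```

Because the functionals are linearly independent and the Gaussian kernel is positive definite, the Gram matrix is nonsingular and the conditions λj(s) = fj are reproduced exactly (up to round-off): s matches the point values at x_eval and the derivative value at x_der.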
- Type: Chapter
- Information: Scattered Data Approximation, pp. 289–307
- Publisher: Cambridge University Press
- Print publication year: 2004