7 - Numerical optimization
Published online by Cambridge University Press: 05 June 2012
Summary
In many areas of statistics and applied mathematics one has to solve the following problem: given a function f(·), which value of x makes f(x) as large or as small as possible?
For example, in financial modeling f(x) might be the expected return from a portfolio, with x being a vector holding the amounts invested in each of a number of possible securities. There might be constraints on x (e.g. each amount invested must be positive, the total amount invested must be fixed, etc.).
In statistical modeling, we may want to find a set of parameters for a model which minimizes the expected prediction errors for the model. Here x would be the parameters and f(·) would be a measure of the prediction error.
Knowing how to do minimization is sufficient: if we want to maximize f(x), we simply change the sign and minimize −f(x). We call both operations “numerical optimization.” Use of derivatives and simple algebra often leads to the solution of such problems, but by no means always. Because of the wide range of possibilities for functions f(·) and parameters x, this is a rich area of computing.
The golden section search method
The golden section search method is a simple way of finding the minimizer of a function of a single variable that has a single minimum (i.e. is unimodal) on the interval [a, b].
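The idea can be sketched as follows (a minimal illustration, not necessarily the book's own implementation): maintain two interior points placed symmetrically at the golden ratio of the current interval, discard the end whose interior point has the larger function value, and reuse the surviving evaluation so each iteration needs only one new call to f.

```r
# Golden section search: minimize f on [a, b], assuming f is
# unimodal (has a single minimum) on that interval.
# Illustrative sketch; function name and details are our own choices.
golden <- function(f, a, b, tol = 1e-7) {
  ratio <- 2 / (sqrt(5) + 1)      # reciprocal of the golden ratio, ~0.618
  x1 <- b - ratio * (b - a)        # left interior point
  x2 <- a + ratio * (b - a)        # right interior point
  f1 <- f(x1)
  f2 <- f(x2)
  while (abs(b - a) > tol) {
    if (f2 > f1) {                 # minimum must lie in [a, x2]
      b <- x2
      x2 <- x1; f2 <- f1           # reuse the old evaluation
      x1 <- b - ratio * (b - a)
      f1 <- f(x1)
    } else {                       # minimum must lie in [x1, b]
      a <- x1
      x1 <- x2; f1 <- f2
      x2 <- a + ratio * (b - a)
      f2 <- f(x2)
    }
  }
  (a + b) / 2                      # midpoint of the final interval
}

golden(function(x) (x - 2)^2, 0, 5)   # close to 2
```

Because each step shrinks the interval by the same fixed factor (~0.618) while reusing one function value, the method converges linearly and requires no derivatives of f.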
A First Course in Statistical Programming with R, pp. 132–157. Publisher: Cambridge University Press. Print publication year: 2007.