Book contents
- Frontmatter
- Contents
- Preface to the Second Edition
- Preface to the First Edition
- 1 Algorithms and Computers
- 2 Computer Arithmetic
- 3 Matrices and Linear Equations
- 4 More Methods for Solving Linear Equations
- 5 Regression Computations
- 6 Eigenproblems
- 7 Functions: Interpolation, Smoothing, and Approximation
- 8 Introduction to Optimization and Nonlinear Equations
- 9 Maximum Likelihood and Nonlinear Regression
- 10 Numerical Integration and Monte Carlo Methods
- 11 Generating Random Variables from Other Distributions
- 12 Statistical Methods for Integration and Monte Carlo
- 13 Markov Chain Monte Carlo Methods
- 14 Sorting and Fast Algorithms
- Author Index
- Subject Index
- References
14 - Sorting and Fast Algorithms
Published online by Cambridge University Press: 01 June 2011
Summary
Introduction
The theme of this chapter is a simple one: there may be better, faster ways of computing something than you ever imagined. One of the maxims of computer science is that a few programs consume most of the resources. Early in the history of computing, people recognized that much of the available computing time went into the common task of sorting a list of numbers. If this task could be done faster, then everything could be done better. A concentrated effort on improving sorting algorithms led to several breakthroughs, all following the principle known as "divide and conquer": to solve a large task, break it into smaller tasks of the same kind. Cleverly done, the resulting algorithm can be more efficient than anyone would have expected, earning the jargon adjective "fast." The principle of divide and conquer will be discussed in the next section, followed by a discussion of fast algorithms for sorting. Section 14.4 covers statistical applications of divide and conquer. Another great breakthrough, the fast Fourier transform (FFT), will be discussed in Section 14.5. Using the FFT to compute convolutions will be discussed in Section 14.6, followed by some interesting applications of the FFT to statistics in Section 14.7. The chapter closes with some topics that are important but don't fit well elsewhere: algorithms for constructing permutations and combinations.
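The divide-and-conquer recipe described above (split the problem, solve the smaller pieces, combine the results) is exactly how merge sort brings sorting down from the quadratic cost of naive methods to O(n log n) comparisons. The following is a minimal illustrative sketch in Python, not the chapter's own presentation of the algorithm:

```python
def merge_sort(a):
    """Sort a list by divide and conquer: split, sort each half, merge."""
    if len(a) <= 1:          # a list of 0 or 1 elements is already sorted
        return a
    mid = len(a) // 2
    left = merge_sort(a[:mid])    # conquer the two halves recursively
    right = merge_sort(a[mid:])
    # Merge the two sorted halves in a single linear pass.
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    merged.extend(left[i:])   # at most one of these two is nonempty
    merged.extend(right[j:])
    return merged

# Example: merge_sort([3, 1, 4, 1, 5]) returns [1, 1, 3, 4, 5]
```

Each level of recursion does O(n) work in the merges, and there are O(log n) levels, which is the source of the "fast" behavior the chapter develops.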
- Numerical Methods of Statistics, pp. 403-438. Publisher: Cambridge University Press. Print publication year: 2011.