Book contents
- Frontmatter
- Contents
- Preface
- 1 Introduction
- 2 Fundamentals of Quantile Regression
- 3 Inference for Quantile Regression
- 4 Asymptotic Theory of Quantile Regression
- 5 L-Statistics and Weighted Quantile Regression
- 6 Computational Aspects of Quantile Regression
- 7 Nonparametric Quantile Regression
- 8 Twilight Zone of Quantile Regression
- 9 Conclusion
- A Quantile Regression in R: A Vignette
- B Asymptotic Critical Values
- References
- Name Index
- Subject Index
1 - Introduction
Published online by Cambridge University Press: 06 July 2010
Summary
MEANS AND ENDS
Much of applied statistics may be viewed as an elaboration of the linear regression model and associated estimation methods of least squares. In beginning to describe these techniques, Mosteller and Tukey (1977), in their influential text, remark:
What the regression curve does is give a grand summary for the averages of the distributions corresponding to the set of xs. We could go further and compute several different regression curves corresponding to the various percentage points of the distributions and thus get a more complete picture of the set. Ordinarily this is not done, and so regression often gives a rather incomplete picture. Just as the mean gives an incomplete picture of a single distribution, so the regression curve gives a correspondingly incomplete picture for a set of distributions.
My objective in the following pages is to describe explicitly how to “go further.” Quantile regression is intended to offer a comprehensive strategy for completing the regression picture.
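The idea of "going further" than the mean rests on a fact developed in this chapter: just as the sample mean minimizes the sum of squared residuals, the τth sample quantile minimizes a sum of asymmetrically weighted absolute residuals (the "check" loss), and quantile regression generalizes this optimization problem to conditional quantiles. As an illustration only (Python rather than the R code of Appendix A; function names here are invented for the sketch), the single-sample case can be written as:

```python
# Sketch of the optimization view of quantiles: the tau-th sample
# quantile minimizes the sum of "check" (pinball) losses. Quantile
# regression extends this idea to linear conditional quantile functions.

def check_loss(u, tau):
    """rho_tau(u) = u * (tau - I(u < 0)): weights positive residuals
    by tau and negative residuals by (1 - tau)."""
    return u * (tau - (1.0 if u < 0 else 0.0))

def sample_quantile(y, tau):
    """Return the candidate xi (searched over the data points, where a
    minimizer is guaranteed to lie) minimizing total check loss."""
    return min(y, key=lambda xi: sum(check_loss(yi - xi, tau) for yi in y))

y = [3, 1, 4, 1, 5, 9, 2, 6, 5]
print(sample_quantile(y, 0.5))  # median: 4
print(sample_quantile(y, 0.9))  # upper-tail quantile: 9
```

With tau = 0.5 the check loss reduces to (half) the absolute deviation, whose minimizer is the median; varying tau across (0, 1) traces out the whole distribution, which is the sense in which quantile regression "completes the regression picture."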
Why does least-squares estimation of the linear regression model so pervade applied statistics? What makes it such a successful tool? Three possible answers suggest themselves. One should not discount the obvious fact that the computational tractability of linear estimators is extremely appealing. Surely this was the initial impetus for their success. Second, if observational noise is normally distributed (i.e., Gaussian), least-squares methods are known to enjoy a certain optimality. But, as it was for Gauss himself, this answer often appears to be an ex post rationalization designed to replace the first response.
*Quantile Regression*, pp. 1–25. Publisher: Cambridge University Press. Print publication year: 2005.