Book contents
- Frontmatter
- Dedication
- Contents
- 1 Introduction
- 2 Univariate density estimation
- 3 Multivariate density estimation
- 4 Inference about the density
- 5 Regression
- 6 Testing in regression
- 7 Smoothing discrete variables
- 8 Regression with discrete covariates
- 9 Semiparametric methods
- 10 Instrumental variables
- 11 Panel data
- 12 Constrained estimation and inference
- Bibliography
- Index
5 - Regression
Published online by Cambridge University Press: 05 February 2015
Summary
Regression is the backbone of applied econometric research. Although regression is widespread, the vast majority of economic research assumes, without theoretical justification, that regressors enter the conditional mean linearly and that each regressor is separable. Here we discuss how to estimate regression functions when we are unsure of the underlying functional form.
The nonparametric regression estimators described in this chapter construct an estimate of the unknown function in much the same way that we constructed the unknown density: by using a local sample for each point. Whereas parametric estimators are global estimators (using all data points), nonparametric kernel regression estimators are local estimators: they use a local sample of nearby data points to fit a simple parametric model (typically a constant or a line) and then “smooth” these local fits together to construct the global function estimate. This allows you to capture the local peculiarities inherent in your data set while estimating the unknown function without having to commit to a particular parametric functional form.
We first motivate regression through the conditional mean and its connection to conditional and joint densities. Then, paralleling our construction of kernel density estimators, we consider a simplistic (nonparametric) estimator of the conditional mean. This method uses indicator functions to evaluate the conditional mean at various points: we calculate the average value of the dependent variable at specific values of the covariates. The method is crude, but it does not require us to specify the functional form a priori. As with density estimation, this crude estimator builds intuition and motivates the kernel estimators that follow.
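The indicator-function idea can be sketched in a few lines. The following is a minimal illustration (not the book's notation): it averages the dependent variable over observations whose covariate falls within a small window around each evaluation point; the data, the window half-width `h`, and the function names are all assumptions for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500
x = rng.uniform(0, 1, n)
y = np.sin(2 * np.pi * x) + rng.normal(0, 0.2, n)  # unknown true mean: sin(2*pi*x)

def indicator_mean(x0, x, y, h=0.05):
    """Crude conditional-mean estimate at x0: the average of y over
    observations whose x lies within a half-width h of x0."""
    mask = np.abs(x - x0) <= h          # indicator function 1{|x - x0| <= h}
    return y[mask].mean() if mask.any() else np.nan

grid = np.linspace(0.1, 0.9, 9)
m_hat = np.array([indicator_mean(g, x, y) for g in grid])
```

Because every observation inside the window receives equal weight and observations just outside receive none, the resulting estimate is a step function; replacing the indicator with a smooth kernel is the natural refinement.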
Within kernel regression, several estimators seek to estimate the unknown smooth function of interest. Here we primarily discuss three of the most popular methods. The oldest, local-constant least-squares (LCLS), has seen less use in recent years, but it remains in use despite its shortcomings.
Applied Nonparametric Econometrics, pp. 113–158. Publisher: Cambridge University Press. Print publication year: 2015.