Book contents
- Frontmatter
- Contents
- Preface
- PART I INTRODUCTION
- PART II ASSIGNING PROBABILITIES
- PART III PARAMETER ESTIMATION
- PART IV TESTING HYPOTHESES
- PART V REAL-WORLD APPLICATIONS
- 21 Regression
- 22 Consistent inference on inconsistent data
- 23 Unrecognized signal contributions
- 24 Change point problems
- 25 Function estimation
- 26 Integral equations
- 27 Model selection
- 28 Bayesian experimental design
- PART VI PROBABILISTIC NUMERICAL TECHNIQUES
- Appendix A Mathematical compendium
- Appendix B Selected proofs and derivations
- Appendix C Symbols and notation
- References
- Index
21 - Regression
from PART V - REAL-WORLD APPLICATIONS
Published online by Cambridge University Press: 05 July 2014
Summary
Regression is a technique for describing how a response variable y varies with the values of so-called input variables x. There is a distinction between ‘simple regression’, where we have only one input variable x, and ‘multiple regression’, with many input variables x. Predictions are based on a model function y = f(x∣a) that depends on the model parameters a. At the heart of the regression analysis lies the determination of the parameters a, either because they bear a direct (physical) meaning or because they are used along with the model function to make predictions. The reader not familiar with the general ideas of parameter estimation may want to read Part III [p. 227] first.

In the literature on frequentist statistics, regression analysis is generally based on the assumption that the measured values of the response variables are independently and normally distributed with equal noise levels. Regression analysis in frequentist statistics then boils down to fitting the model parameters such that the sum of the squared deviations between model and data is minimized. A widespread application is the linear regression model, where the function f is linear in the variables x and the parameters a.
In the Bayesian framework, there is no need for any of these restrictions. Here we will deal with the general problem of inferring the parameters a of an arbitrary model function f(x∣a). In order to cover the bulk of applications, we will restrict the following studies to Gaussian errors.
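As a minimal illustration of the connection sketched above — under a Gaussian error model with known, equal noise level and a flat prior, the Bayesian posterior mean for the parameters of a linear model coincides with the frequentist least-squares fit — the following NumPy sketch fits y = a0 + a1·x to simulated data. The model, the parameter values, and the noise level are all hypothetical choices for the example, not taken from the chapter:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated data: linear model y = a0 + a1*x with Gaussian noise.
# a_true and sigma are hypothetical values chosen for this sketch.
a_true = np.array([1.0, 2.0])
sigma = 0.1                                  # known, equal noise level
x = np.linspace(0.0, 1.0, 50)
X = np.column_stack([np.ones_like(x), x])    # design matrix for f(x|a) = a0 + a1*x
y = X @ a_true + rng.normal(0.0, sigma, size=x.size)

# With a flat prior and Gaussian errors, the posterior for a is Gaussian:
# its mean is the least-squares estimate, its covariance is sigma^2 (X^T X)^{-1}.
a_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
cov = sigma**2 * np.linalg.inv(X.T @ X)

print("posterior mean:", a_hat)
print("posterior std: ", np.sqrt(np.diag(cov)))
```

For nonlinear model functions f(x∣a), or for unequal or unknown noise levels, this closed form no longer applies and the full posterior treatment developed in the chapter is needed.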
- Bayesian Probability Theory: Applications in the Physical Sciences, pp. 333–363. Publisher: Cambridge University Press. Print publication year: 2014.