Book contents
- Frontmatter
- Contents
- Preface
- Introduction
- PART ONE PROBABILITY IN ACTION
- PART TWO ESSENTIALS OF PROBABILITY
- 7 Foundations of probability theory
- 8 Conditional probability and Bayes
- 9 Basic rules for discrete random variables
- 10 Continuous random variables
- 11 Jointly distributed random variables
- 12 Multivariate normal distribution
- 13 Conditioning by random variables
- 14 Generating functions
- 15 Discrete-time Markov chains
- 16 Continuous-time Markov chains
- Appendix: Counting methods and ex
- Recommended reading
- Answers to odd-numbered problems
- Bibliography
- Index
13 - Conditioning by random variables
Published online by Cambridge University Press: 05 August 2012
Summary
In Chapter 8, conditional probabilities are introduced by conditioning upon the occurrence of an event B of nonzero probability. In applications, this event B is often of the form Y = b for a discrete random variable Y. However, when the random variable Y is continuous, the condition Y = b has probability zero for any number b. The purpose of this chapter is to develop techniques for handling a condition provided by the observed value of a continuous random variable. We will see that the conditional probability density function of X given Y = b for continuous random variables is analogous to the conditional probability mass function of X given Y = b for discrete random variables. The conditional distribution of X given Y = b enables us to define the natural concept of the conditional expectation of X given Y = b. This concept has an intuitive interpretation and is of utmost importance. In statistical applications, it is often more convenient to work with conditional expectations than with the correlation coefficient when measuring the strength of the relationship between two dependent random variables. In applied probability problems, the computation of the expected value of a random variable X is often greatly simplified by conditioning on an appropriately chosen random variable Y. Learning the value of Y provides additional information about the random variable X, and for that reason the computation of the conditional expectation of X given Y = b is often simple.
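The concepts named in this summary can be sketched with the standard definitions for jointly continuous random variables X and Y with joint density f(x, y) and marginal density f_Y(b) > 0 (notation here is the conventional one; the chapter itself develops these relations in detail):

```latex
% Conditional density of X given Y = b:
f_{X\mid Y}(x \mid b) = \frac{f(x, b)}{f_Y(b)}

% Conditional expectation of X given Y = b:
E(X \mid Y = b) = \int_{-\infty}^{\infty} x \, f_{X\mid Y}(x \mid b) \, dx

% Law of total expectation, the basis of the "conditioning" computation
% of E(X) mentioned in the summary:
E(X) = \int_{-\infty}^{\infty} E(X \mid Y = b) \, f_Y(b) \, db
```

The last relation is what makes conditioning useful in applied problems: when E(X | Y = b) is easy to find for each fixed b, the unconditional expectation E(X) follows by averaging over the distribution of Y.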
Type: Chapter
Understanding Probability, pp. 404–434
Publisher: Cambridge University Press
Print publication year: 2012