Book contents
- Frontmatter
- Contents
- Preface
- 1 Introduction
- 2 Events and Probabilities
- 3 Random Variables, Means and Variances
- 4 Conditioning and Independence
- 5 Generating Functions; and the Central Limit Theorem
- 6 Confidence Intervals for one-parameter models
- 7 Conditional pdfs and multi-parameter Bayesian Statistics
- 8 Linear Models, ANOVA, etc
- 9 Some further Probability
- 10 Quantum Probability and Quantum Computing
- Appendix A Some Prerequisites and Addenda
- Appendix B Discussion of some Selected Exercises
- Appendix C Tables
- Appendix D A small Sample of the Literature
- Bibliography
- Index
9 - Some further Probability
Published online by Cambridge University Press: 05 June 2012
Summary
In this chapter, we look briefly at three further topics from Probability: Conditional Expectations (CEs); Martingales; Poisson Processes. The first of these leads directly on to the second. The third is somewhat different.
First, we look at the conditional expectation E(X|A) of a Random Variable X given that event A occurs, and at the decomposition

E(X) = Σ_k E(X|A_k) P(A_k),

where the events A_1, A_2, … form a partition of the sample space.
We used these ideas in solving the ‘average wait for patterns’ problems in Chapter 1.
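As a quick worked illustration (our example, not the book's): let N be the number of tosses of a fair coin needed to see the first head. Conditioning on the outcome of the first toss and applying the decomposition above,

E(N) = E(N|first toss H)·½ + E(N|first toss T)·½ = 1·½ + (1 + E(N))·½,

so that E(N) = 2. The 'average wait for patterns' calculations work the same way, with a richer partition of first outcomes.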
We then examine one of the central breakthroughs in Probability, the definition (due in its most general form largely to Kolmogorov) of E(X|Y), the Random Variable which is the conditional expectation of X given the Random Variable Y. This is a rather subtle concept, for which we shall later see several motivating examples. To get the idea: if X and Y are the scores on two successive throws of a fair die, then E(X + Y|Y) = 3½ + Y. Here, E(X|Y) = E(X) = 3½ because, since X is independent of Y, knowing the value of Y does not change our view of the expectation of X. Of course, E(Y|Y) = Y because ‘if we are given Y, then we know what Y is’, something which needs considerable clarification!
9a. Exercise. What is E(Y|X + Y) for this example? (We return to this shortly, but common sense should allow you to write down the answer even without knowing how things are defined.)
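A minimal simulation sketch (ours, not the book's; the variable names are of our choosing) that estimates these conditional expectations empirically: E(X + Y|Y = y) should come out near 3½ + y, and the estimates of E(Y|X + Y = s) let you check your answer to Exercise 9a.

```python
import random

# Two independent fair-die throws X and Y, sampled many times.
random.seed(0)
N = 200_000
samples = [(random.randint(1, 6), random.randint(1, 6)) for _ in range(N)]

# E(X + Y | Y = y): average of x + y over the samples with that value of y.
for y in range(1, 7):
    vals = [x + yy for (x, yy) in samples if yy == y]
    print(f"E(X+Y | Y={y}) ~ {sum(vals) / len(vals):.3f}   (theory: {3.5 + y})")

# E(Y | X + Y = s): average of y over the samples with that sum (Exercise 9a).
for s in range(2, 13):
    vals = [yy for (x, yy) in samples if x + yy == s]
    print(f"E(Y | X+Y={s}) ~ {sum(vals) / len(vals):.3f}")
```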
From: Weighing the Odds: A Course in Probability and Statistics, pp. 383–439. Cambridge University Press, 2001.