Book contents
- Frontmatter
- Contents
- Preface
- Acknowledgements
- Glossary
- 1 Introduction
- 2 Probability
- 3 Random variables, vectors, and processes
- 4 Expectation and averages
- 5 Second-order theory
- 6 A menagerie of processes
- Appendix A Preliminaries
- Appendix B Sums and integrals
- Appendix C Common univariate distributions
- Appendix D Supplementary reading
- References
- Index
1 - Introduction
Published online by Cambridge University Press: 05 June 2012
Summary
A random or stochastic process is a mathematical model for a phenomenon that evolves in time in an unpredictable manner from the viewpoint of the observer. The phenomenon may be a sequence of real-valued measurements of voltage or temperature, a binary data stream from a computer, a modulated binary data stream from a modem, a sequence of coin tosses, the daily Dow Jones average, radiometer data or photographs from deep-space probes, a sequence of images from cable television, or any of an infinite number of possible sequences, waveforms, or signals of any imaginable type. It may be unpredictable because of effects such as interference or noise in a communication link or storage medium, or it may be an information-bearing signal that is deterministic from the viewpoint of an observer at the transmitter but random to an observer at the receiver.
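The coin-toss sequence mentioned above is the simplest such model, and a brief sketch makes the idea concrete. The code below (an illustrative example, not from the book; the function names are invented for this sketch) generates one realization of a fair coin-toss process and computes its time average, which settles near 1/2 as the number of tosses grows:

```python
# Illustrative sketch: a fair coin-toss sequence as a simple
# discrete-time random process. Each sample X_n is 1 (heads) or
# 0 (tails) with probability 1/2, independently of the others.
import random

def coin_toss_process(n, seed=None):
    """Generate one realization of the process: n fair coin tosses."""
    rng = random.Random(seed)
    return [rng.randint(0, 1) for _ in range(n)]

def sample_average(xs):
    """Time average of a single realization."""
    return sum(xs) / len(xs)

tosses = coin_toss_process(10_000, seed=42)
print(sample_average(tosses))  # close to 0.5 for a fair coin
```

Each sample is unpredictable, yet the long-run behavior of the realization is highly regular; reconciling these two facts is exactly what the theory of random processes is about.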
The theory of random processes quantifies the above notions so that one can construct mathematical models of real phenomena that are both tractable and meaningful in the sense of yielding useful predictions of future behavior. Tractability is required in order for the engineer (or anyone else) to be able to perform analyses and syntheses of random processes, perhaps with the aid of computers. The “meaningful” requirement is that the models must provide a reasonably good approximation of the actual phenomena. An oversimplified model may provide results and conclusions that do not apply to the real phenomenon being modeled.
- Type: Chapter
- Information: An Introduction to Statistical Signal Processing, pp. 1–9. Publisher: Cambridge University Press. Print publication year: 2004