Book contents
- Frontmatter
- Contents
- List of abbreviations and acronyms
- Preface
- Acknowledgments
- 1 Introduction
- Part I Probability, random variables, and statistics
- Part II Transform methods, bounds, and limits
- Part III Random processes
- 12 Random processes
- 13 Spectral representation of random processes and time series
- 14 Poisson process, birth–death process, and renewal process
- 15 Discrete-time Markov chains
- 16 Semi-Markov processes and continuous-time Markov chains
- 17 Random walk, Brownian motion, diffusion, and Itô processes
- Part IV Statistical inference
- Part V Applications and advanced topics
- References
- Index
13 - Spectral representation of random processes and time series
from Part III - Random processes
Published online by Cambridge University Press: 05 June 2012
Summary
In this chapter we discuss spectral representations and eigenvector-based time-series analysis. We begin with a review of the Fourier series and Fourier transform of nonrandom functions, followed by the Fourier analysis of periodic WSS processes. We then introduce the power spectra of nonperiodic WSS random processes, the Wiener–Khinchin formula, and the periodogram analysis of time-series data. The eigenvector-based orthogonal expansion of random vectors and its continuous-time analog, known as the Karhunen–Loève expansion, are discussed in detail. Principal component analysis (PCA) and singular value decomposition (SVD) are two commonly used statistical techniques applicable to any data presentable in matrix form in which correlation exists across the rows and/or columns. We also briefly discuss algorithms developed for Web information retrieval, which can be viewed as instances of general spectral expansion, the common theme of the present chapter.
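Purely as an illustration of the periodogram analysis mentioned above, here is a minimal numpy sketch that estimates the power spectrum of a noisy sinusoid; the signal, sample size, and variable names are our own choices for illustration, not taken from the book.

```python
# Minimal periodogram sketch (illustrative only; signal and parameters are not from the book).
# The raw periodogram estimate is I(f_k) = |X(f_k)|^2 / N, where X is the DFT of the sample x.
import numpy as np

rng = np.random.default_rng(0)
N, f0 = 1024, 0.1                        # sample size and sinusoid frequency (cycles/sample)
n = np.arange(N)
x = np.cos(2 * np.pi * f0 * n) + rng.standard_normal(N)   # sinusoid in white noise

X = np.fft.rfft(x)                       # DFT over the non-negative frequencies
periodogram = np.abs(X) ** 2 / N         # raw periodogram estimate of the power spectrum
freqs = np.fft.rfftfreq(N)               # corresponding frequencies (cycles per sample)

peak = freqs[np.argmax(periodogram)]
print(f"spectral peak near f = {peak:.3f} (true f0 = {f0})")
```

The peak of the periodogram lands near the sinusoid frequency, while the noise contributes a roughly flat background, which is the basic picture the Wiener–Khinchin framework formalizes.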
The chapter ends with a discussion of an important class of time series known as autoregressive moving average (ARMA) models, which are widely used in statistics and econometrics. Their spectral representation and state-space formulation are also discussed.
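As a hedged sketch of the ARMA idea summarized above, the following snippet simulates an ARMA(1,1) process as a linear filter driven by white noise and evaluates its theoretical power spectral density; the parameter values `phi` and `theta` are arbitrary and chosen only for illustration.

```python
# Sketch of an ARMA(1,1) time series (parameters are illustrative, not from the book):
#   x[n] = phi * x[n-1] + w[n] + theta * w[n-1],  with unit-variance white noise w.
# Its power spectral density is |1 + theta e^{-j2πf}|^2 / |1 - phi e^{-j2πf}|^2.
import numpy as np
from scipy.signal import lfilter

phi, theta, N = 0.7, 0.4, 4096
rng = np.random.default_rng(1)
w = rng.standard_normal(N)

# ARMA(1,1) as a rational linear filter: MA coefficients b = [1, theta], AR coefficients a = [1, -phi]
x = lfilter([1.0, theta], [1.0, -phi], w)

# Theoretical spectral density on a frequency grid (cycles per sample)
f = np.linspace(0.0, 0.5, 256)
z = np.exp(-2j * np.pi * f)
S_theory = np.abs(1 + theta * z) ** 2 / np.abs(1 - phi * z) ** 2
print("PSD at f = 0:", S_theory[0], "expected:", (1 + theta) ** 2 / (1 - phi) ** 2)
```

The same filter representation is what makes the state-space formulation of ARMA models natural: the AR and MA coefficients become entries of the state transition and observation equations.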
Spectral representation of random processes and time series
In this section we consider the problem of representing a random process in terms of a series or integral with respect to some system of deterministic functions, such that the coefficients in this expansion are uncorrelated RVs. Such a representation is referred to as a spectral representation or spectral expansion. Before we pursue this subject, let us briefly review the Fourier series expansion.
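For reference, the classical Fourier series of a deterministic function with period T takes the standard form below; the notation may differ slightly from that used later in the chapter.

```latex
% Fourier series of a periodic deterministic function f(t) with period T
% (standard form, included only as a reminder).
f(t) = \sum_{n=-\infty}^{\infty} c_n \, e^{\mathrm{i} 2\pi n t / T},
\qquad
c_n = \frac{1}{T} \int_{0}^{T} f(t)\, e^{-\mathrm{i} 2\pi n t / T}\, \mathrm{d}t .
```

The spectral representation of a random process replaces the deterministic coefficients c_n by random variables that are mutually uncorrelated, which is what makes the expansion useful for second-order (WSS) analysis.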
- Type: Chapter
- Information: Probability, Random Processes, and Statistical Analysis: Applications to Communications, Signal Processing, Queueing Theory and Mathematical Finance, pp. 343–399
- Publisher: Cambridge University Press
- Print publication year: 2011