Book contents
- Frontmatter
- Contents
- Preface
- Acknowledgements
- 1 Introduction
- 2 Modulation
- 3 Demodulation
- 4 Synchronization and noncoherent communication
- 5 Channel equalization
- 6 Information-theoretic limits and their computation
- 7 Channel coding
- 8 Wireless communication
- Appendix A Probability, random variables, and random processes
- Appendix B The Chernoff bound
- Appendix C Jensen's inequality
- References
- Index
7 - Channel coding
Published online by Cambridge University Press: 05 June 2012
Summary
In this chapter, we provide an introduction to some commonly used channel coding techniques. The key idea of channel coding is to introduce redundancy in the transmitted signal so as to enable recovery from channel impairments such as errors and erasures. We know from the previous chapter that, for any given set of channel conditions, there exists a Shannon capacity, or maximum rate of reliable transmission. Such Shannon-theoretic limits provide the ultimate benchmark for channel code design. A large number of error control techniques are available to the modern communication system designer, and in this chapter, we provide a glimpse of a small subset of these. Our emphasis is on convolutional codes, which have been a workhorse of communication link design for many decades, and turbo-like codes, which have revolutionized communication systems by enabling implementable designs that approach Shannon capacity for a variety of channel models.
Map of this chapter: We begin in Section 7.1 with binary convolutional codes. We introduce the trellis representation and the Viterbi algorithm for ML decoding, and develop performance analysis techniques. The structure of the memory introduced by a convolutional code is similar to that introduced by a dispersive channel. Thus, the techniques are similar to (but simpler than) those developed for MLSE for channel equalization in Chapter 5. Concatenation of convolutional codes leads to turbo codes, which are iteratively decoded by exchanging soft information between the component convolutional decoders.
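To make the trellis/Viterbi ideas concrete, here is a minimal sketch of hard-decision Viterbi decoding for a rate-1/2 binary convolutional code. The generator polynomials (7, 5) in octal with constraint length 3 are a standard textbook example chosen here for illustration; they, and all function names below, are assumptions, not taken from this chapter. Hard decisions with a Hamming branch metric implement ML decoding over a binary symmetric channel; the soft-decision variant the chapter discusses would replace the Hamming metric with Euclidean or log-likelihood metrics.

```python
# Hard-decision Viterbi decoding for a rate-1/2 convolutional code.
# Code parameters (illustrative): constraint length K = 3, generators
# g = (0b111, 0b101), i.e., (7, 5) in octal, free distance 5.

def conv_encode(bits, g=(0b111, 0b101), K=3):
    """Encode a bit list; two output bits per input, zero-terminated."""
    state = 0
    out = []
    for b in list(bits) + [0] * (K - 1):          # flush the shift register
        state = ((state << 1) | b) & ((1 << K) - 1)
        for gen in g:
            out.append(bin(state & gen).count("1") % 2)
    return out

def viterbi_decode(received, g=(0b111, 0b101), K=3):
    """ML decoding over a BSC: minimum Hamming distance through the trellis."""
    n_states = 1 << (K - 1)                        # trellis states = memory contents
    INF = float("inf")
    metric = [0.0] + [INF] * (n_states - 1)        # encoder starts in all-zero state
    paths = [[] for _ in range(n_states)]
    for t in range(0, len(received), 2):
        r = received[t:t + 2]                      # received pair for this stage
        new_metric = [INF] * n_states
        new_paths = [None] * n_states
        for s in range(n_states):
            if metric[s] == INF:
                continue                           # state not yet reachable
            for b in (0, 1):                       # hypothesized input bit
                full = ((s << 1) | b) & ((1 << K) - 1)
                ns = full & (n_states - 1)         # next state (memory bits)
                expected = [bin(full & gen).count("1") % 2 for gen in g]
                branch = sum(e != x for e, x in zip(expected, r))
                m = metric[s] + branch             # add-compare-select
                if m < new_metric[ns]:
                    new_metric[ns] = m
                    new_paths[ns] = paths[s] + [b]
        metric, paths = new_metric, new_paths
    best = min(range(n_states), key=lambda s: metric[s])
    return paths[best][:-(K - 1)]                  # drop the flush bits
```

With zero termination and free distance 5, ML decoding corrects, for example, any single channel bit flip: encoding `[1, 0, 1, 1, 0, 0, 1]`, flipping one coded bit, and decoding recovers the original message. The path-list copying keeps the sketch short; a practical decoder would store survivor pointers and trace back instead.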
- Type: Chapter
- Information: Fundamentals of Digital Communication, pp. 293-378
- Publisher: Cambridge University Press
- Print publication year: 2008