Book contents
- Frontmatter
- Contents
- List of contributors
- Preface
- 1 Introduction
- 2 Error-detecting codes
- 3 Repetition and Hamming codes
- 4 Data compression: efficient coding of a random message
- 5 Entropy and Shannon's Source Coding Theorem
- 6 Mutual information and channel capacity
- 7 Approaching the Shannon limit by turbo coding
- 8 Other aspects of coding theory
- References
- Index
1 - Introduction
Published online by Cambridge University Press: 05 June 2012
Summary
Systems dedicated to the communication or storage of information are commonplace in everyday life. Generally speaking, a communication system sends information from one place to another; examples include telephone networks, computer networks, and audio/video broadcasting. Storage systems, e.g. magnetic and optical disk drives, store information for later retrieval. In a sense, such systems may be regarded as communication systems that transmit information from now (the present) to then (the future). Wherever problems of information processing arise, there is a need to know how to compress the material and how to protect it against possible corruption. This book covers the fundamentals of information theory and coding theory needed to address these two problems, and gives related examples from practice. The amount of background mathematics and electrical engineering is kept to a minimum: at most, simple results from calculus and probability theory are used, and anything beyond that is developed as needed.
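As a small foretaste of the error-protection problem (the codes themselves are developed in Chapter 3), the following sketch, not taken from the text, shows the simplest possible error-correcting scheme: a three-fold repetition code, which sends every bit three times and decodes by majority vote, so any single flipped bit per block is corrected.

```python
def encode(bits):
    # Three-fold repetition code: transmit each bit three times.
    return [b for bit in bits for b in (bit, bit, bit)]

def decode(received):
    # Majority vote over each block of three corrects any single bit flip.
    return [1 if sum(received[i:i + 3]) >= 2 else 0
            for i in range(0, len(received), 3)]

message = [1, 0, 1, 1]
codeword = encode(message)        # 12 bits on the channel for 4 bits of data
codeword[4] ^= 1                  # the channel corrupts one bit
assert decode(codeword) == message  # the single error is corrected
```

The price of this protection is a three-fold increase in transmitted data, which motivates the book's central question: how much redundancy is actually necessary for reliable communication?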
Information theory versus coding theory
Information theory is a branch of probability theory with extensive applications to communication systems. Like several other branches of mathematics, information theory has a physical origin. It was initiated by communication scientists studying the statistical structure of electrical communication equipment, and was principally founded by Claude E. Shannon through the landmark contribution [Sha48] on the mathematical theory of communication. In this paper, Shannon established the fundamental limits on data compression and on reliable transmission over noisy channels.
A Student's Guide to Coding and Information Theory, pp. 1–12. Publisher: Cambridge University Press. Print publication year: 2012.