Book contents
- Frontmatter
- Contents
- PREFACE
- CHAPTER 1 SHIFT SPACES
- CHAPTER 2 SHIFTS OF FINITE TYPE
- CHAPTER 3 SOFIC SHIFTS
- CHAPTER 4 ENTROPY
- CHAPTER 5 FINITE-STATE CODES
- CHAPTER 6 SHIFTS AS DYNAMICAL SYSTEMS
- CHAPTER 7 CONJUGACY
- CHAPTER 8 FINITE-TO-ONE CODES AND FINITE EQUIVALENCE
- CHAPTER 9 DEGREES OF CODES AND ALMOST CONJUGACY
- CHAPTER 10 EMBEDDINGS AND FACTOR CODES
- CHAPTER 11 REALIZATION
- CHAPTER 12 EQUAL ENTROPY FACTORS
- CHAPTER 13 GUIDE TO ADVANCED TOPICS
- BIBLIOGRAPHY
- NOTATION INDEX
- INDEX
CHAPTER 4 - ENTROPY
Published online by Cambridge University Press: 30 November 2009
Summary
Entropy measures the complexity of mappings. For shifts, it also measures their “information capacity,” or ability to transmit messages. The entropy of a shift is an important number, for it is invariant under conjugacy, can be computed for a wide class of shifts, and behaves well under standard operations like factor codes and products. In this chapter we first introduce entropy and develop its basic properties. In order to compute entropy for irreducible shifts of finite type and sofic shifts in §4.3, we describe the Perron–Frobenius theory of nonnegative matrices in §4.2. In §4.4 we show how general shifts of finite type can be decomposed into irreducible pieces and compute entropy for general shifts of finite type and sofic shifts. In §4.5 we describe the structure of the irreducible pieces in terms of cyclically moving states.
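The computation sketched for §4.3 can be illustrated concretely: for an irreducible shift of finite type, entropy equals the logarithm of the largest eigenvalue (the Perron eigenvalue) of the adjacency matrix of its defining graph. The snippet below is a minimal sketch, not from the book, using the golden mean shift (no two consecutive 1s) as the example; the helper names and the use of NumPy are assumptions for illustration.

```python
import numpy as np

# Adjacency matrix of the golden mean shift: from state 0 one may go to
# state 0 or state 1; from state 1 only back to state 0 (so the symbol 1
# is never followed by another 1).
A = np.array([[1, 1],
              [1, 0]])

# Entropy (base 2) is log2 of the Perron eigenvalue, i.e. the largest
# eigenvalue of A.
perron = max(np.linalg.eigvals(A).real)
entropy = np.log2(perron)

print(perron)   # the golden ratio (1 + sqrt(5)) / 2 ≈ 1.618
print(entropy)  # ≈ 0.694
```

Here the Perron eigenvalue is the golden ratio, so the golden mean shift has entropy log₂((1 + √5)/2) ≈ 0.694, matching the value derived in the text.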
Definition and Basic Properties
Before we get under way, we review some terminology and notation from linear algebra.
Recall that the characteristic polynomial of a matrix A is defined to be χA(t) = det(tId – A), where Id is the identity matrix. The eigenvalues of A are the roots of χA(t). An eigenvector of A corresponding to the eigenvalue λ is a nonzero vector v such that Av = λv.
We say that a (possibly rectangular) matrix A is (strictly) positive if each of its entries is positive. In this case we write A > 0.
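For a strictly positive square matrix, the Perron–Frobenius theory developed in §4.2 guarantees a real largest eigenvalue with a positive eigenvector, and that eigenpair can be approximated by repeated multiplication (power iteration). The sketch below is a hypothetical illustration, not an algorithm from the book; `power_iterate` and the sample matrix are assumptions.

```python
import numpy as np

def power_iterate(A, steps=100):
    """Approximate the Perron eigenvalue and a positive eigenvector of a
    strictly positive matrix A by power iteration: repeatedly apply A to
    a positive starting vector and renormalize."""
    v = np.ones(A.shape[0])
    for _ in range(steps):
        v = A @ v
        v = v / np.linalg.norm(v)
    lam = v @ A @ v  # Rayleigh quotient; v is a unit vector
    return lam, v

# A strictly positive matrix, so A > 0 in the notation above.
A = np.array([[1.0, 2.0],
              [3.0, 1.0]])
lam, v = power_iterate(A)
# Perron-Frobenius guarantees lam is real and v may be chosen positive.
print(lam)  # largest root of (t - 1)^2 - 6, i.e. 1 + sqrt(6) ≈ 3.449
print(v)    # all entries positive
```

Starting from a positive vector keeps every iterate positive, so the limit eigenvector is positive as well, in line with the theory quoted above.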
- An Introduction to Symbolic Dynamics and Coding, pp. 99–135. Publisher: Cambridge University Press. Print publication year: 1995.