Book contents
- Frontmatter
- Dedication
- Contents
- Preface
- Acknowledgements
- List of notation
- 1 Introduction
- 2 Lattices
- 3 Figures of merit
- 4 Dithering and estimation
- 5 Entropy-coded quantization
- 6 Infinite constellation for modulation
- 7 Asymptotic goodness
- 8 Nested lattices
- 9 Lattice shaping
- 10 Side-information problems
- 11 Modulo-lattice modulation
- 12 Gaussian networks
- 13 Error exponents
- Appendix
- References
- Index
12 - Gaussian networks
Published online by Cambridge University Press: 05 August 2014
Summary
There are many ways in which we can use side-information paradigms as building blocks in general multi-terminal networks. Two such cases were discussed in Chapter 10: the broadcast channel (Section 10.1.4) and distributed compression of correlated sources (Section 10.2).
In these simple settings, the side information is concentrated in the “relevant” terminal in the network. In the broadcast channel, for example, the joint encoder may view the transmission to one terminal as side information for the transmission to another terminal. In multi-terminal source coding, the joint decoder may view the reconstruction of one source as side information for the reconstruction of another source. Lattice coding schemes, such as the lattice dirty-paper and Wyner–Ziv coding schemes of Chapter 10, can reduce the complexity (and perhaps offer some intuition) compared to random coding solutions. But they do not give us any performance advantage over the random i.i.d. coding solutions, which are known to be optimal for these settings.
A more interesting situation, however, occurs when side information is distributed among more than one terminal. Surprisingly, it turns out that in some distributed linear network topologies, structured solutions outperform random coding solutions. Moreover, in some cases they are, in fact, asymptotically optimal.
A common theme in the settings we discuss in this chapter is the distributed computation of a many-to-one function.
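To make the idea of distributed computation of a many-to-one function concrete, here is a minimal illustrative sketch (not taken from the book) using the scalar integer lattice. Two separate encoders map messages to lattice points, the channel adds the transmitted points, and the receiver recovers the modulo sum of the messages directly, without decoding either message individually. The modulus `q` and the encoding map are hypothetical choices for illustration; the point is that the linear structure of the lattice commutes with the channel's addition.

```python
# Illustrative sketch, assuming a one-dimensional "fine lattice" Z and
# "coarse lattice" qZ: messages in {0, ..., q-1} label the cosets of qZ in Z.
q = 11  # hypothetical modulus (coarse-lattice scaling)

def encode(m: int) -> int:
    # Map message m to a coset representative of qZ in Z.
    return m % q

m1, m2 = 7, 9
y = encode(m1) + encode(m2)  # the channel computes the many-to-one sum
decoded = y % q              # modulo-lattice reduction at the receiver

# The sum of messages mod q is recovered from the sum of codewords alone:
assert decoded == (m1 + m2) % q
```

Because addition of lattice points followed by reduction modulo the coarse lattice equals addition of the messages modulo `q`, the receiver learns the function value without learning `m1` or `m2` separately; this is the structural advantage that random i.i.d. codebooks lack.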
- Type: Chapter
- Information: *Lattice Coding for Signals and Networks: A Structured Coding Approach to Quantization, Modulation and Multiuser Information Theory*, pp. 313–371
- Publisher: Cambridge University Press
- Print publication year: 2014