Book contents
- Frontmatter
- Contents
- About this book
- Acknowledgments
- Introduction
- 0 Notational conventions
- PART ONE BASIC COMPLEXITY CLASSES
- PART TWO LOWER BOUNDS FOR CONCRETE COMPUTATIONAL MODELS
- PART THREE ADVANCED TOPICS
- 17 Complexity of counting
- 18 Average case complexity: Levin's theory
- 19 Hardness amplification and error-correcting codes
- 20 Derandomization
- 21 Pseudorandom constructions: Expanders and extractors
- 22 Proofs of PCP theorems and the Fourier transform technique
- 23 Why are circuit lower bounds so difficult?
- Appendix: Mathematical background
- Hints and selected exercises
- Main theorems and definitions
- Bibliography
- Index
- Complexity class index
22 - Proofs of PCP theorems and the Fourier transform technique
from PART THREE - ADVANCED TOPICS
Published online by Cambridge University Press: 05 June 2012
Summary
The improvements in the constants has many times been obtained by extracting some important property from a previous protocol, using that protocol as a black box and then adding some conceptually new construction. This is more or less what we do in the current paper. … The long code is universal in that it contains every other binary code as a sub-code. Thus it never hurts to have this code available, but it is still surprising that it is beneficial to have such a wasteful code.
–Johan Håstad, 1997

We saw in Chapter 11 that the PCP Theorem implies that computing approximate solutions to many optimization problems is NP-hard. This chapter gives a complete proof of the PCP Theorem. In Chapter 11 we also mentioned that the PCP Theorem does not suffice for proving several other similar results, for which we need stronger (or simply different) “PCP Theorems.” In this chapter we survey some such results and their proofs. The two main results are Raz's Parallel Repetition Theorem (see Section 22.3) and Håstad's Three-Bit PCP Theorem (Theorem 22.16). Raz's theorem leads to strong hardness results for the 2CSP problem over large alphabets. Håstad's theorem shows that certificates for NP languages can be probabilistically checked by examining only three bits in them.
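The "long code" Håstad's quote refers to can be made concrete with a small sketch. The standard definition encodes an index w ∈ {0, …, W−1} as the truth table of the "dictator" function f ↦ f(w), evaluated on all 2^W Boolean functions over W points, so the codeword has length 2^(2^W) when W itself is exponential in the input; here is a minimal illustration (the function name `long_code` is our own, not from the book):

```python
from itertools import product

def long_code(w, W):
    """Return the long-code encoding of index w in {0, ..., W-1}.

    Each Boolean function f: [W] -> {0,1} is represented as a tuple of
    W bits; the codeword records f(w) for every such f, in the order
    produced by itertools.product. The codeword has length 2**W, which
    is what makes the code so "wasteful" yet universal.
    """
    return tuple(f[w] for f in product((0, 1), repeat=W))

# Encoding index 1 out of W = 2 positions. The functions enumerate as
# (0,0), (0,1), (1,0), (1,1); reading coordinate 1 of each gives:
print(long_code(1, 2))  # -> (0, 1, 0, 1)
```

The universality claim in the quote follows because any binary code on W codewords can be read off as a subset of the 2^W coordinates: each coordinate of the long code is itself some Boolean function of the encoded index.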
- Type
- Chapter
- Information
- Computational Complexity: A Modern Approach, pp. 460–497
- Publisher: Cambridge University Press
- Print publication year: 2009