Book contents
- Frontmatter
- Contents
- List of contributors
- Preface
- 1 Introduction to compressed sensing
- 2 Second-generation sparse modeling: structured and collaborative signal analysis
- 3 Xampling: compressed sensing of analog signals
- 4 Sampling at the rate of innovation: theory and applications
- 5 Introduction to the non-asymptotic analysis of random matrices
- 6 Adaptive sensing for sparse recovery
- 7 Fundamental thresholds in compressed sensing: a high-dimensional geometry approach
- 8 Greedy algorithms for compressed sensing
- 9 Graphical models concepts in compressed sensing
- 10 Finding needles in compressed haystacks
- 11 Data separation by sparse representations
- 12 Face recognition by sparse representation
- Index
2 - Second-generation sparse modeling: structured and collaborative signal analysis
Published online by Cambridge University Press: 05 November 2012
Summary
In this chapter the authors go beyond traditional sparse modeling and address collaborative structured sparsity, which adds stability and prior information to the representation. In structured sparse modeling, instead of treating the dictionary atoms as singletons, the atoms are partitioned into groups, and only a few groups are selected at a time to encode the signal. A complementary way of adding structure, stability, and prior information to a model is via collaboration: multiple signals, which are known to follow the same model, are allowed to collaborate in the coding. The first framework studied connects sparse modeling with Gaussian Mixture Models and leads to state-of-the-art image restoration. The second framework derives a hierarchical structure on top of the collaboration and is well suited to source separation. Both models also enjoy important theoretical guarantees.
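The group-selection mechanism described above can be illustrated with the proximal operator of the group-lasso penalty, which shrinks or zeroes out entire groups of coefficients at once. The following is a minimal sketch of our own (function names and the toy data are illustrative, not taken from the chapter):

```python
import numpy as np

def group_soft_threshold(x, groups, lam):
    """Proximal operator of the group-lasso penalty.

    x      : 1-D coefficient vector
    groups : list of index arrays partitioning x into groups
    lam    : threshold (regularization strength)
    """
    out = np.zeros_like(x, dtype=float)
    for g in groups:
        norm = np.linalg.norm(x[g])
        if norm > lam:
            # Keep the group, shrunk toward zero by a factor (1 - lam/norm)
            out[g] = (1.0 - lam / norm) * x[g]
        # else: the entire group is set to zero (the group is "not selected")
    return out

coeffs = np.array([3.0, 4.0, 0.1, -0.1])
groups = [np.array([0, 1]), np.array([2, 3])]
# The first group survives (shrunk); the weak second group is zeroed whole.
print(group_soft_threshold(coeffs, groups, lam=1.0))
```

This captures the key difference from plain sparsity: selection happens at the level of groups rather than individual atoms.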
Introduction
In traditional sparse modeling, it is assumed that a signal can be accurately represented by a sparse linear combination of atoms from a (learned) dictionary. A large class of signals, including most natural images and sounds, is well described by this model, as demonstrated by numerous state-of-the-art results in various signal processing applications.
From a data modeling point of view, sparsity can be seen as a form of regularization, that is, as a device that restricts or controls the set of coefficient values the model may use to produce an estimate of the data.
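As a concrete instance of sparsity-as-regularization, the classical ℓ1-penalized sparse coding problem min_a ½‖y − Da‖² + λ‖a‖₁ can be solved with the iterative soft-thresholding algorithm (ISTA). The sketch below is our own illustration on a random dictionary, not code from the chapter:

```python
import numpy as np

def ista(D, y, lam=0.01, n_iter=2000):
    """Minimize 0.5*||y - D a||^2 + lam*||a||_1 by proximal gradient (ISTA)."""
    a = np.zeros(D.shape[1])
    step = 1.0 / np.linalg.norm(D, 2) ** 2      # 1 / Lipschitz constant of the gradient
    for _ in range(n_iter):
        grad = D.T @ (D @ a - y)                # gradient of the quadratic data term
        z = a - step * grad                     # gradient descent step
        a = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)  # soft-threshold
    return a

rng = np.random.default_rng(0)
D = rng.standard_normal((30, 60))
D /= np.linalg.norm(D, axis=0)                  # dictionary with unit-norm atoms
a_true = np.zeros(60)
a_true[[3, 17, 42]] = [1.5, -2.0, 1.0]          # a 3-sparse code
y = D @ a_true
a_hat = ista(D, y)                              # recovers a sparse approximation
```

The ℓ1 penalty forces most coefficients to exactly zero, so the estimate uses only a handful of dictionary atoms, which is precisely the restriction on allowed coefficient values that the regularization viewpoint describes.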
- Type: Chapter
- Information: Compressed Sensing: Theory and Applications, pp. 65–87
- Publisher: Cambridge University Press
- Print publication year: 2012