Compressed sensing is all about hitting sparse vectors with short, fat matrices. Specifically, we want to be able to efficiently and stably recover any sparse vector from a corresponding matrix-vector product. I recently wrote a paper that discusses how one might design matrices with this particular application in mind:
To reconstruct a sparse vector $x$ from relatively few measurements $y = \Phi x$, it has become popular to find an estimate of minimal 1-norm. This estimate happens to be the sparsest vector in the preimage of $y$ provided $\Phi$ satisfies the restricted isometry property (RIP). In words, an RIP matrix acts as a near-isometry on sufficiently sparse vectors, and since its introduction in 2004, this property has become an important subject of matrix design (see Terry Tao’s blog post on the problem).
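To make this concrete, here is a minimal sketch of the recovery procedure: basis pursuit (1-norm minimization subject to the measurement constraint), cast as a linear program via the standard split $x = u - v$ with $u, v \geq 0$. The dimensions, the random Gaussian measurement matrix, and the SciPy-based solver are illustrative choices, not anything prescribed by the paper.

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
n, m, k = 64, 32, 3          # ambient dimension, measurements, sparsity

# A random Gaussian matrix satisfies RIP with high probability
Phi = rng.standard_normal((m, n)) / np.sqrt(m)

# A k-sparse ground-truth vector
x = np.zeros(n)
support = rng.choice(n, size=k, replace=False)
x[support] = rng.standard_normal(k)

y = Phi @ x                  # the short, fat matrix-vector product

# Basis pursuit:  minimize ||x||_1  subject to  Phi x = y.
# Split x = u - v with u, v >= 0 so the objective sum(u + v) is linear.
c = np.ones(2 * n)
A_eq = np.hstack([Phi, -Phi])
res = linprog(c, A_eq=A_eq, b_eq=y, bounds=[(0, None)] * (2 * n))
x_hat = res.x[:n] - res.x[n:]

print("max recovery error:", np.max(np.abs(x_hat - x)))
```

With $m$ on the order of $k \log(n/k)$ Gaussian measurements, the 1-norm minimizer coincides with the true sparse vector with high probability, so the reported error should be near machine precision for this instance.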
To date, the best known RIP matrices are constructed using random processes, while deterministic constructions have found far less success. The paper surveys various methods for demonstrating RIP deterministically, and it makes some interesting connections with graph theory and number theory.