Abstract
Conventional wisdom and common practice in the acquisition and
reconstruction of images or signals from frequency data follow the
basic principle of Nyquist-density sampling theory. This principle
states that to reconstruct an image or signal, the number of Fourier
samples we need to acquire must match the desired resolution of the
image or signal, e.g., the number of pixels in the image.
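As a schematic illustration of this counting principle (the notation
here is ours, chosen for exposition rather than taken from the talk):
to reconstruct a signal $x \in \mathbb{C}^n$, the classical
prescription is to acquire its full discrete Fourier transform,
\[
y_k = \hat{x}(k) = \sum_{t=0}^{n-1} x_t \, e^{-2\pi i k t / n},
\qquad k = 0, \dots, n-1,
\]
so that the number of measurements $m$ equals the number of
unknowns $n$.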
This talk introduces a recently developed sampling theory which shows
that this conventional wisdom is inaccurate. We show that, perhaps
surprisingly, images or signals of scientific interest can be
recovered accurately, and sometimes even exactly, from a limited
number of nonadaptive random measurements. In effect, the talk
introduces a theory suggesting ``the possibility of compressed data
acquisition protocols which perform as if it were possible to
directly acquire just the important information about the image of
interest.'' In other words, by collecting a comparatively small number
of measurements rather than pixel values, one could in principle
reconstruct an image with essentially the same resolution as one
would obtain by measuring all the pixels, a phenomenon with
far-reaching implications.
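In schematic form (again with illustrative notation), the data consist
of $m$ nonadaptive random measurements
\[
y = \Phi x, \qquad \Phi \in \mathbb{C}^{m \times n}, \qquad m \ll n,
\]
where the rows of $\Phi$ are chosen at random, e.g., randomly selected
Fourier frequencies, and $x$ is assumed to be sparse or compressible
in some known basis.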
The reconstruction algorithms are concrete, stable (in the sense
that they degrade smoothly as the noise level increases), and
practical; in fact, they involve only solving convenient convex
optimization programs.
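A canonical example of such a convex program, stated here as an
illustrative sketch rather than as the specific formulation of the
talk, is $\ell_1$ minimization (basis pursuit): with $y$ and $\Phi$
as above, one solves
\[
\min_{\tilde{x} \in \mathbb{C}^n} \|\tilde{x}\|_1
\quad \text{subject to} \quad \Phi \tilde{x} = y,
\]
replacing the constraint by $\|\Phi \tilde{x} - y\|_2 \le \epsilon$
when the measurements are contaminated by noise of level $\epsilon$,
which is the mechanism behind the graceful degradation noted above.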