Ph.D. Student: Michaeli Tomer
Subject: Interplay between Sampling and Estimation: A Unified Hilbert Space Approach
Department: Department of Electrical Engineering
Supervisor: Professor Yonina Eldar
The area of signal processing is primarily concerned with extracting information from signals, which often represent analog physical quantities. A typical signal-processing system first converts the analog signal of interest into a digital representation and then analyzes the resulting data by digital means. A vast number of signal-processing algorithms rely on prior statistical knowledge about the signals encountered in the associated application. Thus, two theories that play significant roles in signal processing are sampling theory and Bayesian estimation theory. The former deals with sampling a continuous-time signal in a way that results in minimal loss of information. The latter is concerned with the estimation of random quantities from a set of statistically related measurements, and is at the heart of many algorithms for signal denoising and deconvolution, target tracking, blind source separation, and more.
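The classical result underlying sampling theory is Shannon's reconstruction formula: a signal bandlimited to |f| < 1/(2T) is perfectly recoverable from its uniform samples x(nT) by sinc interpolation. The following minimal sketch (not from the thesis; the test signal is chosen as a sum of shifted sincs so that the truncated series is exact) illustrates this:

```python
import numpy as np

def sinc_interp(samples, T, t):
    """Shannon reconstruction of x(t) from uniform samples x(nT)
    of a signal bandlimited to |f| < 1/(2T)."""
    n = np.arange(len(samples))
    # np.sinc is the normalized sinc sin(pi u)/(pi u)
    return np.sum(samples * np.sinc((t[:, None] - n * T) / T), axis=1)

T = 0.1  # sampling period -> band limit 5 Hz
# Test signal: sum of shifted sincs, bandlimited to 1/(2T) by construction;
# its samples are two Kronecker deltas, so 64 samples suffice exactly.
x = lambda t: np.sinc((t - 2.0) / T) + 0.5 * np.sinc((t - 4.2) / T)
samples = x(np.arange(64) * T)

t = np.linspace(0.0, 6.3, 101)
x_hat = sinc_interp(samples, T, t)
print(np.max(np.abs(x_hat - x(t))))  # reconstruction error near machine precision
```

For general bandlimited signals the infinite sinc series must be truncated, which introduces an error that decays slowly with the number of retained samples; much of modern sampling theory replaces the sinc kernel with more practical generators.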
Although sampling and estimation are often viewed as different disciplines, in this work we highlight many links and commonalities between them. Perhaps the most immediate connection lies at the intersection between the fields, namely the problem of sampling random signals. Our contributions in this area include the determination of the optimal sampling method for sensing a noisy stochastic process, and the design of efficient single- and multi-channel interpolation algorithms for random signals. We demonstrate our results in the context of sampling finite rate of innovation (FRI) signals, as well as in image interpolation and super-resolution reconstruction from image sequences.
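To make the intersection concrete, a standard building block for interpolating a random signal from noisy samples is the linear MMSE (Wiener) interpolator, which weights the observations according to the signal's autocorrelation. The sketch below is an illustrative toy example (not the thesis algorithm), assuming an AR(1)-type autocorrelation r[k] = rho**|k| and white measurement noise:

```python
import numpy as np

rho, sigma_n = 0.9, 0.1
obs_idx = np.array([0, 2, 4, 6, 8])  # positions of the (noisy) samples
tgt = 5                              # position to interpolate

r = lambda k: rho ** np.abs(k)       # assumed autocorrelation of the signal

# Normal equations of linear MMSE estimation: C_yy w = c_xy
C_yy = r(obs_idx[:, None] - obs_idx[None, :]) + sigma_n**2 * np.eye(len(obs_idx))
c_xy = r(tgt - obs_idx)
w = np.linalg.solve(C_yy, c_xy)      # Wiener interpolation weights

# Resulting minimum mean-square error (for unit signal variance r[0] = 1)
mmse = r(0) - c_xy @ w
print(w, mmse)
```

The weights concentrate on the samples nearest the target position, and the residual error is strictly between 0 and the prior variance, reflecting the information gained from the correlated observations.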
A more fundamental connection between sampling and estimation is based on abstract Hilbert-space formulations of both problems. Specifically, in modern sampling theory, signals are viewed as vectors in some Hilbert space H, and knowledge of their samples corresponds to knowledge of their projections onto certain subspaces of H. Similarly, the Hilbert-space viewpoint of Bayesian estimation treats random variables as vectors in an abstract Hilbert space, and minimum mean-square error (MMSE) estimation is associated with projection operations within this space. Relying on these formulations, we demonstrate how various estimation techniques can be used to solve sampling problems and, similarly, how modern sampling methods have applications in estimation theory. We specifically harness the well-established theory of prediction and causal estimation of random sequences to develop causal interpolation algorithms for deterministic signals. We also show that certain estimation scenarios with partial statistical knowledge can be viewed as generalized sampling problems. We demonstrate our methods in the context of enhancement of facial images, image zooming, and more.