Ph.D. Student: Tsachy Weissman
Subject: Universal Prediction in the Presence of Noise
Department: Department of Electrical Engineering
Supervisor: Professor Neri Merhav
The main concern of this work is the problem of universally sequentially predicting the next outcome of a binary sequence based on noisy observations of its past. As in the noise-free case, this problem can be naturally formalized and analyzed within two settings: a deterministic and a probabilistic one.
Starting with the deterministic setting, we formalize this problem as one of competing with an arbitrary set of prediction “experts” in a noisy environment. A general approach to the problem is developed and applied in the explicit analysis of a few concrete cases. We then revisit the problem of prediction of an individual sequence using finite memory in the noisy setting. The existence of twofold universal predictors, namely, predictors that are independent of the noise parameter, is established. This setting is also shown to give rise to a new complexity measure for individual sequences.
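To make the expert-competition setting concrete, the following is a minimal illustrative sketch (not the thesis's actual algorithm): an exponentially weighted mixture of a small, hypothetical set of experts predicts the next bit of a binary sequence, while both the experts and the weight updates see only noisy observations of the past (here, the clean bits passed through a binary symmetric channel). The noise level, learning rate, and expert set are all assumed for illustration.

```python
import math
import random

random.seed(0)

DELTA = 0.1  # BSC crossover probability (assumed noise model)
ETA = 0.5    # learning rate (hypothetical choice)

# Hypothetical expert set: each maps the noisy past to a prediction in [0, 1].
experts = [
    lambda past: 0.0,                       # always predict 0
    lambda past: 1.0,                       # always predict 1
    lambda past: past[-1] if past else 0.5  # follow the last noisy symbol
]

def predict_sequence(clean):
    """Return the average squared prediction loss on the clean sequence."""
    weights = [1.0] * len(experts)
    noisy_past = []
    total_loss = 0.0
    for x in clean:
        # Noisy observation of the current bit via a BSC with crossover DELTA.
        z = x ^ (random.random() < DELTA)
        preds = [e(noisy_past) for e in experts]
        p = sum(w * q for w, q in zip(weights, preds)) / sum(weights)
        total_loss += (p - x) ** 2  # loss measured against the clean bit
        # Weight update uses losses on the *noisy* symbol, since the
        # predictor never observes the clean sequence.
        weights = [w * math.exp(-ETA * (q - z) ** 2)
                   for w, q in zip(weights, preds)]
        noisy_past.append(z)
    return total_loss / len(clean)

clean = [1] * 80 + [0] * 20
avg = predict_sequence(clean)
print(avg)
```

The key point the sketch conveys is that the mixture tracks the best expert while all feedback is channel-corrupted; the thesis's analysis addresses how to do this with performance guarantees despite the mismatch between observed (noisy) and true losses.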
In the probabilistic setting we consider the case where the clean sequence of interest is generated by a stationary ergodic (but otherwise unknown) source. The existence of universal predictors for this setting is established. The fundamental limitations of prediction performance for this case are assessed through a direct and a converse result. The direct result can be considered a generalization of the celebrated Shannon-McMillan-Breiman theorem (the Asymptotic Equipartition Property).
In the final part of this work, we apply some of the above results, and the methodology developed in obtaining them, to resolve a few problems related to causal and delay-constrained source coding of clean, as well as noise-corrupted, individual sequences. It is shown that for any finite set of such coding schemes of a given rate, there exists a source code (adhering to the same structural and delay limitations) with the same rate whose distortion is, with high probability, almost as small as that of the best scheme in the set, uniformly over all individual sequences.