Technion - Israel Institute of Technology, Graduate School
Ph.D. Thesis
Ph.D. Student: Wasim Huleihel
Subject: On Relations between Information Theory and Statistical Mechanics
Department: Electrical Engineering
Supervisor: Professor Neri Merhav
Full Thesis Text: English Version


Abstract

The connections between information theory and statistical physics have been studied over the last few decades, both conceptually and technically. One notable conceptual aspect is the borrowing of statistical-mechanical insights by identifying parallelisms and drawing analogies between problems arising in certain information-theoretic settings and structures arising in statistical physics. On the technical side, once these parallelisms are identified, powerful mathematical tools and analysis techniques can be carried over from one field to the other. This research addresses several important information-theoretic questions that lie at the interface between the two disciplines.


We start by demonstrating how certain analysis tools that are customary in statistical physics prove useful in the analysis of optimum estimation performance and information measures. In the problems considered, the corresponding statistical-mechanical systems turn out to exhibit strong interactions that cause phase transitions, which in turn are reflected as irregularities and discontinuities in the behavior of the estimation performance and the information measures. Specifically, we derive the asymptotic minimum mean-square error (MMSE) for estimation of the transmitted data from noisy observations under a general model. This leads to several insightful information-theoretic conclusions regarding the ability to achieve reliable communication. Furthermore, we find the mismatched MSE, namely, the estimation error incurred under incorrect assumptions on the channel input statistics, the noise statistics, and other parameters of the underlying model.
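As a toy illustration of the matched-versus-mismatched distinction (not the general model analyzed in the thesis), consider the scalar Gaussian channel Y = X + N with X ~ N(0, var_x): the MMSE estimator is linear, and designing it under a wrong prior variance strictly increases the MSE. A minimal sketch, with all variances chosen purely for illustration:

```python
# Scalar Gaussian channel Y = X + N, X ~ N(0, var_x), N ~ N(0, var_n).
# The MMSE estimator is E[X|Y] = c*Y with c = var_x / (var_x + var_n),
# and the resulting MMSE is var_x * var_n / (var_x + var_n).

def mse_of_linear_estimator(c, var_x, var_n):
    """MSE of the estimator Xhat = c*Y under the TRUE variances:
    E[(c*Y - X)^2] = (c - 1)^2 * var_x + c^2 * var_n."""
    return (c - 1.0) ** 2 * var_x + c ** 2 * var_n

var_x, var_n = 1.0, 1.0

# Matched estimator: coefficient computed with the true prior variance.
c_matched = var_x / (var_x + var_n)
mmse = mse_of_linear_estimator(c_matched, var_x, var_n)        # = 0.5

# Mismatched estimator: designed assuming a wrong prior variance (4 here).
var_x_assumed = 4.0
c_mismatched = var_x_assumed / (var_x_assumed + var_n)
mis_mse = mse_of_linear_estimator(c_mismatched, var_x, var_n)  # = 0.68

print(mmse, mis_mse)
```

The gap between `mis_mse` and `mmse` is the price of the wrong prior; in the high-dimensional models treated in the thesis this gap can behave discontinuously across phase transitions.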


In the second part of this thesis, we consider a very popular topic in information theory and signal processing, called compressed sensing (CS). CS is a formalism that allows reconstruction of sparse signals from a reduced number of measurements. As in the previous problem, there is great interest in the behavior of the asymptotic MMSE under the compressed model. All previously reported derivations of the asymptotic MMSE were based on the well-known, but non-rigorous, replica method, and obtaining these results rigorously is considered a major challenge in the literature. Using methods rooted in statistical mechanics and advanced random matrix theory, we rigorously establish and generalize these results. Moreover, in contrast to previous works, in which only memoryless sparse signals were considered, we consider a more general model that allows a certain structured dependency among the various components of the sparse signal.
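To make the notion of MMSE under a sparse prior concrete, here is a small Monte Carlo sketch of the scalar building block behind memoryless CS models: a Bernoulli-Gaussian (spike-and-slab) signal observed in Gaussian noise, estimated by its exact posterior mean. All parameter values are illustrative, and this scalar toy does not capture the matrix measurement channel or the structured dependencies treated in the thesis.

```python
import math
import random

# Bernoulli-Gaussian signal: X = B * G with B ~ Bernoulli(p), G ~ N(0, var_g),
# observed as Y = X + N with N ~ N(0, var_n). Illustrative parameters:
p, var_g, var_n = 0.1, 4.0, 1.0

def gauss_pdf(y, var):
    return math.exp(-y * y / (2.0 * var)) / math.sqrt(2.0 * math.pi * var)

def posterior_mean(y):
    """E[X | Y=y]: posterior weight of the 'slab' times its Wiener estimate."""
    slab = p * gauss_pdf(y, var_g + var_n)    # X active: Y ~ N(0, var_g + var_n)
    spike = (1.0 - p) * gauss_pdf(y, var_n)   # X = 0:    Y ~ N(0, var_n)
    w = slab / (slab + spike)
    return w * (var_g / (var_g + var_n)) * y

# Monte Carlo estimate of the MSE of the posterior-mean estimator.
random.seed(0)
n = 20000
se = 0.0
for _ in range(n):
    x = random.gauss(0.0, math.sqrt(var_g)) if random.random() < p else 0.0
    y = x + random.gauss(0.0, math.sqrt(var_n))
    se += (posterior_mean(y) - x) ** 2
mse = se / n

prior_var = p * var_g  # MSE of the trivial estimate Xhat = 0
print(mse, prior_var)
```

The empirical MSE falls well below the prior variance `p * var_g`, the error of ignoring the observation altogether; the thesis characterizes the analogous asymptotic quantity for the full compressed model, where the replica method was previously the only route.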


In the last part of this work, we take an information-theoretic perspective on the same model and present an analytical expression for the mutual information, which plays a central role in a variety of communication and signal processing problems. Previously, this quantity was addressed only via bounds, simulations, and the (non-rigorous) replica method; our expression is derived using the techniques mentioned above. Using this expression, we study a variety of sparse linear communication models, including coding in different settings, and accounting also for multiple-access channels and wiretap problems. For these channels, we provide single-letter expressions and derive achievable rates, capturing the communication and processing features of these timely models.
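For orientation, the classical single-letter benchmark that such expressions generalize is the mutual information of a Gaussian-input additive Gaussian noise channel, I(X; Y) = (1/2) log2(1 + snr) bits per channel use. A minimal check of this textbook formula (the thesis's expressions for the compressed, sparse-input model are considerably more involved):

```python
import math

def gaussian_mi_bits(snr):
    """I(X;Y) in bits for Y = X + N with Gaussian X and N, at the given SNR."""
    return 0.5 * math.log2(1.0 + snr)

# At snr = 3 the channel carries exactly one bit per use: 0.5 * log2(4) = 1.
print(gaussian_mi_bits(3.0))  # → 1.0
```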