M.Sc Thesis

M.Sc Student: Ram Eshed
Subject: On Rényi Entropy Power Inequalities
Department: Department of Electrical and Computer Engineering
Supervisor: Prof. Igal Sason
Full Thesis Text: Full thesis text - English Version


One of the well-known inequalities in information theory is the entropy power inequality (EPI), which was introduced by Shannon in his 1948 landmark paper.
The EPI has been an instrumental tool in proving converse theorems
for the capacity regions of the Gaussian broadcast channel, the Gaussian wire-tap channel, the Gaussian interference channel, and the Gaussian multiple-input multiple-output (MIMO) broadcast
channel, as well as a converse theorem in multi-terminal lossy compression. Due to its importance, the EPI has been proved with information-theoretic tools in several insightful ways.
Further studies on this theme include EPIs for discrete random variables and related analogues, generalized EPIs, reverse EPIs, inequalities related to the EPI in terms of rearrangements, and refined versions of the EPI for specialized distributions.
The Rényi entropy and divergence were introduced by Rényi in the early 1960s as generalizations of the Shannon entropy and the Kullback-Leibler relative entropy, and they have a long track record of usefulness in information theory and its applications. It is therefore of interest to consider the generalization of the EPI to Rényi measures.
This Master's thesis focuses on Rényi entropy power inequalities (R-EPIs), which generalize the celebrated EPI of Shannon to the Rényi entropy power.
Consider the sum S_n = X_1 + … + X_n of n independent continuous random vectors taking values in R^d, and let α > 1 (α may be infinity).
A Rényi entropy power inequality provides a lower bound on the Rényi entropy power of S_n of an arbitrary order α that, up to a multiplicative constant (which may depend in general on n, α, and d), is equal to the sum of the Rényi entropy powers (of the same order α) of the n random vectors {X_k}, k = 1, …, n.
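For concreteness, the quantities involved can be written out with the standard definitions (these formulas are not spelled out in the abstract itself; the constant c below is a generic placeholder for the multiplicative constant mentioned above):

```latex
% Rényi entropy of order \alpha of a random vector X with density f on \mathbb{R}^d:
h_\alpha(X) = \frac{1}{1-\alpha} \, \log \int_{\mathbb{R}^d} f^\alpha(x) \, \mathrm{d}x,
% with h_1(X) = -\int_{\mathbb{R}^d} f(x) \log f(x) \, \mathrm{d}x obtained by
% continuity (the Shannon differential entropy).

% Rényi entropy power of order \alpha:
N_\alpha(X) = \exp\!\left( \tfrac{2}{d} \, h_\alpha(X) \right).

% A generic R-EPI of order \alpha then asserts, for some constant
% c = c(n, \alpha, d) \le 1,
N_\alpha\!\left( \sum_{k=1}^{n} X_k \right) \;\ge\; c \sum_{k=1}^{n} N_\alpha(X_k).
```

With c = 1 and α = 1 this reduces to the classical statement of Shannon's EPI.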
For α = 1, the R-EPIs derived in this thesis coincide with Shannon's entropy power inequality.
The first R-EPI is an improvement of a recent R-EPI by Bobkov and Chistyakov (IEEE Trans. on Information Theory, Feb. 2015), which relies on the sharpened Young's inequality. A further improvement of the R-EPI relies on convex optimization and on results on rank-one modifications of a real-valued diagonal matrix. The latter form of the R-EPI improves both the inequality by Bobkov and Chistyakov and a bound by Bercher and Vignat, and it forms the tightest R-EPI known to date. The improvements are exemplified in the context of data filtering, and are shown to be asymptotically tight as α tends to infinity for n = 2.