|Ph.D. Student||Shuval Boaz|
|Subject||Polar Codes: Lower Bounds and Performance Under Memory|
|Department||Department of Electrical Engineering||Supervisor||Professor Ido Tal|
|Full Thesis text|
Polar codes are a family of capacity-achieving codes with explicit and low-complexity construction, encoding, and decoding algorithms. Originally proposed for channel-coding over binary-input, memoryless, and symmetric channels, they have been expanded in many ways. This thesis expands polar coding theory in two important directions.
First, we develop lower bounds on the error probability of polar codes in their original setting: as channel codes for binary-input, memoryless, and symmetric channels. Performance analysis of polar codes rests on analyzing their decoder, the successive-cancellation decoder, which operates sequentially, decoding the codeword bit by bit. The codeword bits are correlated, yet previous performance analyses of polar codes ignore this correlation: the upper bound is based on the union bound, and the lower bound is based on the worst-performing bit. We propose an improved lower bound that considers the error probabilities of two bits simultaneously. Since these joint error probabilities are difficult to compute explicitly, we devise a method to lower-bound the error probabilities of bit pairs. Our method yields lower bounds that significantly improve upon previously known lower bounds.
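The bound structure described above can be sketched schematically. This is not the thesis's actual method; it is a minimal illustration, with made-up per-bit error probabilities `p[i]` and hypothetical upper bounds `joint[(i, j)]` on the probability that bits i and j both err, showing why accounting for a pair of bits tightens the worst-bit lower bound.

```python
def union_upper_bound(p):
    """Classic upper bound: block error probability <= sum of per-bit errors."""
    return min(1.0, sum(p))

def worst_bit_lower_bound(p):
    """Classic lower bound: block error probability >= worst per-bit error."""
    return max(p)

def pairwise_lower_bound(p, joint):
    """Schematic pairwise improvement via inclusion-exclusion:
    P(E_i or E_j) = p_i + p_j - P(E_i and E_j) >= p_i + p_j - q_ij,
    where q_ij is an upper bound on the joint error P(E_i and E_j).
    Any pair's error event is contained in the block-error event,
    so the maximum over all pairs is still a valid lower bound."""
    best = worst_bit_lower_bound(p)
    n = len(p)
    for i in range(n):
        for j in range(i + 1, n):
            # Trivial fallback bound if no better q_ij is available:
            q = joint.get((i, j), min(p[i], p[j]))
            best = max(best, p[i] + p[j] - q)
    return best

# Toy numbers, illustrative only -- not taken from the thesis.
p = [0.02, 0.05, 0.01]
joint = {(0, 1): 0.004, (0, 2): 0.001, (1, 2): 0.002}
lb = pairwise_lower_bound(p, joint)   # 0.066, above the worst-bit 0.05
ub = union_upper_bound(p)             # 0.08
```

With exact (or upper-bounded) joint probabilities of bit pairs, the pairwise bound can only match or exceed the worst-bit bound, which mirrors the improvement claimed above.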
Second, we remove the memoryless assumption, enabling the use of polar codes under memory. Memory is prevalent in many communication scenarios, in the channel, the source, or both, and effective coding techniques for such scenarios are desirable. In this thesis we show that polar coding can yield codes with vanishing error probability under memory, enabling their use in settings such as input-constrained channels, channels with intersymbol interference, and finite-state channels.
We first consider polar coding for a stationary process with memory whose distribution is known. For example, the process may consist of a channel and its input, both with memory. We show that if the process has an underlying hidden Markov structure that is regular (aperiodic and irreducible), then polar codes may achieve vanishing error probability at rates approaching the process information rate. We show this by utilizing the hidden state in the analysis of polar codes, and by using the total variation distance as an additional parameter characterizing a joint distribution. Next, we consider a channel-coding scenario where the encoder knows only that the channel belongs to some set, while the decoder knows the precise channel distribution. The input distribution is fixed and known to both. Memory may be present throughout. We present a polarization-based universal code for this scenario: its error probability vanishes over any channel in the set, and its rate approaches the infimal information rate over the set. We establish universality provided that each process in the set has memory in the form of an underlying hidden regular Markov chain, and satisfies a 'forgetfulness' property. Forgetfulness means that two hidden Markov states become approximately independent given a sufficiently long sequence of observations between them. Regularity of the Markov chain does not ensure forgetfulness, so we develop a sufficient condition for forgetfulness.
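The regularity condition above (irreducible and aperiodic) can be checked mechanically for a finite-state chain: a finite stochastic matrix P is regular exactly when some power P^k is entrywise positive, and by Wielandt's theorem k = (n-1)^2 + 1 suffices for an n-state chain. The following is a small self-contained sketch of that check; the matrices and function name are illustrative assumptions, not taken from the thesis.

```python
import numpy as np

def is_regular(P):
    """Return True iff the stochastic matrix P is regular,
    i.e., irreducible and aperiodic.  Equivalently, some power
    P^k is entrywise positive; Wielandt's bound k = (n-1)^2 + 1
    caps how many powers need to be examined."""
    P = np.asarray(P, dtype=float)
    n = P.shape[0]
    Q = np.eye(n)
    for _ in range((n - 1) ** 2 + 1):
        Q = Q @ P
        if np.all(Q > 0):
            return True
    return False

# A 2-state chain that always switches states is irreducible but
# periodic (period 2), hence not regular:
periodic = [[0.0, 1.0],
            [1.0, 0.0]]

# Adding a self-loop breaks the periodicity and makes it regular:
regular = [[0.1, 0.9],
           [1.0, 0.0]]
```

Note that, as the abstract states, regularity alone does not ensure the forgetfulness property; it only guarantees the existence of a unique stationary distribution that the chain converges to.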