M.Sc Student: Maor Gal
Subject: Information Theory and Privacy Related Questions in
Department: Department of Computer Science
Supervisor: Professor Eliyahu Ben Sasson
We study the relationship between the communication complexity of a task, i.e., the number of bits that the communicating parties must send in order to solve it, and its internal and external information complexities, which measure how much information the parties must reveal to each other, or to an external observer, about their inputs. We compare these measures to the tight relation between the communication cost of one-way message sending and the entropy function, and show in which cases the two settings are analogous and in which they are not.
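The tight relation referred to here is the classical source-coding fact that the expected one-way communication needed to send a sample from a known distribution is its Shannon entropy, up to one bit (achieved by an optimal prefix code such as Huffman's). The following sketch, which is our illustration rather than anything from the thesis itself, checks this numerically:

```python
import heapq
import math

def entropy(p):
    """Shannon entropy H(p) in bits of a probability vector p."""
    return -sum(q * math.log2(q) for q in p if q > 0)

def huffman_expected_length(p):
    """Expected codeword length of an optimal (Huffman) prefix code for p."""
    # Heap items: (probability, unique tiebreak id, accumulated expected cost).
    heap = [(q, i, 0.0) for i, q in enumerate(p)]
    heapq.heapify(heap)
    if len(heap) == 1:
        return 1.0  # even a single symbol needs one bit on the wire
    while len(heap) > 1:
        q1, _, c1 = heapq.heappop(heap)
        q2, i, c2 = heapq.heappop(heap)
        # Merging two subtrees deepens every leaf under them by one bit,
        # adding (q1 + q2) to the expected codeword length.
        heapq.heappush(heap, (q1 + q2, i, c1 + c2 + q1 + q2))
    return heap[0][2]

# A dyadic distribution, for which Huffman coding meets the entropy exactly:
p = [0.5, 0.25, 0.125, 0.125]
print(entropy(p), huffman_expected_length(p))  # both are 1.75 bits
```

For non-dyadic distributions the expected length exceeds the entropy, but by the source-coding theorem it always stays below H(p) + 1.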
We provide the first example of a computational task for which the internal and external information complexities are exponentially far apart, even though the output of the computation itself does not reveal much information. We discuss the characteristics of the communication model we define, in which the output must be broadcast to the public at the end of the computation.
We discuss the use of private randomness as a method of hiding information from the other parties while performing a computational task, and also provide a full proof of a long-known theorem stating that for every boolean function, the communication complexity is at least logarithmic in the input size when shared randomness is not allowed.
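The distinction between shared and private randomness is well illustrated by the Equality function: with a shared random string, n-bit Equality can be tested with O(1) bits of communication per repetition (each party sends the inner product of its input with the shared string), whereas this logarithmic lower bound applies once the coins are private. The following sketch of the standard public-coin protocol is our illustration, not a protocol taken from the thesis:

```python
import random

def inner_product_mod2(x, r):
    """Parity of the bitwise AND of two bit strings -- the one-bit hash each party sends."""
    return sum(a & b for a, b in zip(x, r)) % 2

def equality_public_coins(x, y, rounds=20, rng=random):
    """One-sided-error Equality test with a shared random string.

    In each round Alice sends the single bit <x, r> for a fresh shared
    random r, and Bob compares it with <y, r>. If x != y, a round detects
    the difference with probability 1/2, so after `rounds` repetitions the
    error probability is at most 2 ** -rounds.
    """
    n = len(x)
    for _ in range(rounds):
        r = [rng.randrange(2) for _ in range(n)]  # the shared random string
        if inner_product_mod2(x, r) != inner_product_mod2(y, r):
            return False  # inputs are definitely unequal
    return True  # equal, except with probability <= 2 ** -rounds

x = [1, 0, 1, 1, 0, 0, 1, 0]
print(equality_public_coins(x, x))        # always True
print(equality_public_coins(x, [0] * 8))  # False, except with prob. 2**-20
```

Note that the total communication here is independent of n; simulating such a protocol with private coins only costs an additive O(log n) by Newman's theorem, which is consistent with the logarithmic lower bound above being tight for Equality.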