Error-free source and channel coding with lists
This thesis comprises three independent parts.

In the first part, a source-coding problem is studied in which a task drawn at random from a finite set of tasks is to be described using a fixed number of bits. All tasks that share its description must be performed. Upper and lower bounds on the minimum ρ-th moment of the number of performed tasks are derived. Also considered is the case where a sequence of tasks is produced by a source and n tasks are jointly described using nR bits. If the rate R exceeds the Rényi entropy rate of the source of order 1/(1 + ρ) (provided it exists), then the ρ-th moment of the ratio of performed tasks to n can be driven to one as n tends to infinity; if R is smaller than the Rényi entropy rate, this moment tends to infinity. The results are generalized to account for the presence of side information. In this more general setting, the key quantity is a conditional version of the Rényi entropy that was introduced by Arimoto. For IID sources two additional extensions are solved, one of a rate-distortion flavor and one in which different tasks may have different nonnegative costs. Finally, a divergence that was identified by Sundaresan as a mismatch penalty in the Massey–Arikan guessing problem is shown to play a similar role here.

The second part of this thesis studies the listsize capacity of discrete memoryless channels with feedback. The listsize capacity is the supremum of achievable rates, where a rate is achievable if the ρ-th moment of the number of messages that could have produced the output of the channel approaches one as the blocklength tends to infinity. It is shown that for channels with feedback this rate is upper-bounded by the maximum of Gallager's E0 function divided by ρ, and that equality holds when the zero-error capacity of the channel is positive. The inequality is established by proving that feedback does not increase the cutoff rate.
Relationships to other notions of channel capacity are explored.

The third part of this thesis contains a proof of a conjecture by Ahlswede, Cai, and Zhang on the relationship between the zero-undetected-error capacity of almost noise-free channels and the Sperner capacity of directed graphs. The zero-undetected-error capacity is the supremum of achievable rates when the decoder must either produce the correct message or declare an erasure, where a rate is achievable if the erasure probability tends to zero as the blocklength tends to infinity. Sperner capacity is a generalization of Shannon's graph capacity to directed graphs. Ahlswede, Cai, and Zhang proved that, in the noise-free limit, the zero-undetected-error capacity is lower-bounded by the Sperner capacity of the channel graph, and they conjectured equality. Here an upper bound is derived that proves the conjecture.
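The two quantities at the heart of the first two parts can be made concrete numerically. The following Python sketch (not taken from the thesis; the task distribution and the binary symmetric channel below are purely illustrative) computes the Rényi entropy of order 1/(1 + ρ), which is the critical description rate in the tasks problem, and Gallager's E0 function, whose maximum over input distributions divided by ρ upper-bounds the listsize capacity with feedback:

```python
import numpy as np

def renyi_entropy(alpha, p):
    """Rényi entropy of order alpha (in bits) of the PMF p, for alpha != 1."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return np.log2((p ** alpha).sum()) / (1.0 - alpha)

def gallager_E0(rho, P, W):
    """Gallager's E0 function (in bits); P is the input PMF and
    W[x, y] = W(y|x) the transition matrix of a DMC."""
    inner = (P[:, None] * W ** (1.0 / (1.0 + rho))).sum(axis=0)
    return -np.log2((inner ** (1.0 + rho)).sum())

# Part 1: the critical rate is the Rényi entropy (rate) of order 1/(1 + rho).
rho = 1.0
p_task = np.array([0.5, 0.25, 0.25])  # illustrative task distribution
critical_rate = renyi_entropy(1.0 / (1.0 + rho), p_task)

# Part 2: the listsize capacity with feedback is at most max_P E0(rho, P) / rho.
# For rho = 1 and a binary symmetric channel the uniform input maximizes E0,
# giving the cutoff rate R0 = 1 - log2(1 + 2*sqrt(p(1 - p))).
p_cross = 0.1  # illustrative crossover probability
W = np.array([[1 - p_cross, p_cross], [p_cross, 1 - p_cross]])
R0 = gallager_E0(1.0, np.array([0.5, 0.5]), W)
```

For ρ = 1 the maximum of E0 over input distributions is the cutoff rate, which, by the result of the second part, feedback cannot increase.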