Analytic Algorithmics, Combinatorics, and Information Theory

Wojciech Szpankowski, Purdue University

Analytic information theory studies problems of information theory using the analytic techniques of computer science and combinatorics. Following Hadamard's and Knuth's precept, we tackle these problems with complex-analytic methods such as generating functions, the Mellin transform, Fourier series, the saddle point method, analytic poissonization and depoissonization, and singularity analysis.

This approach lies at the crossroads of computer science and information theory. In this talk we concentrate on one facet of information theory, namely source coding (better known as data compression), and in particular on the redundancy rate problem and the method of types. The redundancy rate problem for a class of sources asks how far the actual code length exceeds the optimal (ideal) code length. The method of types is a powerful technique in information theory, large deviations, and the analysis of algorithms: it reduces the calculation of the probability of rare events to a combinatorial analysis. Two sequences are of the same type if they have the same empirical distribution.
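As a small illustration (not part of the original abstract), the type of a sequence over a finite alphabet is determined by its symbol counts, so two sequences of the same length share a type exactly when their counts coincide. The following Python sketch, with hypothetical helper names of our own, makes this concrete:

    from collections import Counter

    def type_of(seq):
        # Empirical distribution (type): symbol -> relative frequency.
        counts = Counter(seq)
        n = len(seq)
        return {symbol: c / n for symbol, c in counts.items()}

    def same_type(x, y):
        # Same-length sequences have the same type iff their symbol counts agree.
        return len(x) == len(y) and Counter(x) == Counter(y)

    print(type_of("0101"))            # {'0': 0.5, '1': 0.5}
    print(same_type("0101", "0011"))  # True: both contain two 0's and two 1's
    print(same_type("0101", "0111"))  # False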

We shall argue that counting types can be accomplished efficiently by enumerating Eulerian paths (Markov types) or binary trees with a given path length (universal types). On the other hand, the analysis of the redundancy rate problem for memoryless and Markov sources leads us to tree generating functions (e.g., those arising in counting labeled rooted trees) that have been studied extensively in computer science.
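For instance, a first-order Markov type of a binary sequence can be identified with its matrix of transition counts, and for small n the number of types realizable by sequences of length n can be found by brute force. The sketch below is our own illustration (binary alphabet, no special treatment of the initial symbol) and only shows what is being counted; the talk obtains the asymptotics of such counts by enumerating Eulerian paths rather than sequences:

    from itertools import product

    def markov_type(seq):
        # First-order Markov type: the 2x2 matrix of transition counts k_ij.
        counts = {(a, b): 0 for a in "01" for b in "01"}
        for a, b in zip(seq, seq[1:]):
            counts[(a, b)] += 1
        return tuple(sorted(counts.items()))

    def count_markov_types(n):
        # Brute-force count of distinct transition-count matrices among
        # all binary sequences of length n.
        return len({markov_type("".join(bits)) for bits in product("01", repeat=n)})

    for n in range(2, 9):
        print(n, count_markov_types(n))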

 

