Distributed Functional Compression through Graph Coloring

Year
2007
Type(s)
Conference Paper
Author(s)
V. Doshi, D. Shah, M. Médard, S. Jaggi
Source
Data Compression Conference (DCC '07), Snowbird, UT, 2007, pp. 93-102
Url
http://ieeexplore.ieee.org/stamp/stamp.jsp?arnumber=4148748

We consider the distributed computation of a function of random sources with minimal communication. Specifically, given two discrete memoryless sources, X and Y, a receiver wishes to compute f(X, Y) based on (encoded) information sent from X and Y in a distributed manner. A special case, f(X, Y) = (X, Y), is the classical question of distributed source coding considered by Slepian and Wolf (1973). Orlitsky and Roche (2001) considered a somewhat restricted setup in which Y is available as side information at the receiver. They characterized the minimal rate at which X needs to transmit data to the receiver as the conditional graph entropy of the characteristic graph of X based on f. In our recent work (2006), we further established that this minimal rate can be achieved by means of graph coloring and distributed source coding (e.g. Slepian-Wolf coding). This characterization allows for the separation between “function coding” and “correlation coding.” In this paper, we consider a more general setup where X and Y are both encoded (separately). This is a significantly harder setup for which to give a single-letter characterization of the complete rate region. We find that, under a certain condition on the support set of X and Y (called the zigzag condition), it is possible to characterize the rate region based on graph colorings at X and Y separately. That is, any achievable pair of rates can be realized by first coloring graphs at X and Y separately (function coding) and then using Slepian-Wolf coding for these colors (correlation coding). We also obtain a single-letter characterization of the minimal joint rate. Finally, we provide simulation results based on graph coloring to demonstrate the rate gains on real sequences.
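For reference, the characteristic graph and conditional graph entropy mentioned in the abstract are the standard objects from Witsenhausen and from Orlitsky and Roche (2001); they are recalled here rather than quoted from the paper. The characteristic graph G of X with respect to f and Y has the support of X as its vertex set, with an edge {x1, x2} whenever some y satisfies p(x1, y) p(x2, y) > 0 and f(x1, y) != f(x2, y). The minimal side-information rate is then

    H_G(X \mid Y) = \min_{X \in W \in \Gamma(G),\; W - X - Y} I(W; X \mid Y),

where \Gamma(G) is the collection of independent sets of G and W - X - Y denotes a Markov chain.

The following is a minimal, illustrative sketch (not code from the paper) of the “function coding” step: building the characteristic graph of X and greedily coloring it. The function names, the toy function f(x, y) = (x + y) mod 2, and the uniform full-support distribution are assumptions for illustration only; the paper's achievability scheme actually colors high-probability subgraphs of large graph powers and then compresses the colors with a Slepian-Wolf code (the "correlation coding" step), which is not shown here.

    # Illustrative sketch: characteristic graph of X w.r.t. f and Y, plus a greedy coloring.
    from itertools import combinations

    def characteristic_graph(xs, ys, p, f):
        """Connect x1, x2 that must be distinguished: some y has
        p(x1, y) > 0, p(x2, y) > 0 and f(x1, y) != f(x2, y)."""
        edges = set()
        for x1, x2 in combinations(xs, 2):
            if any(p(x1, y) > 0 and p(x2, y) > 0 and f(x1, y) != f(x2, y) for y in ys):
                edges.add((x1, x2))
        return edges

    def greedy_coloring(xs, edges):
        """Assign each vertex the smallest color not used by an already-colored neighbor."""
        adj = {x: set() for x in xs}
        for x1, x2 in edges:
            adj[x1].add(x2)
            adj[x2].add(x1)
        color = {}
        for x in xs:
            taken = {color[n] for n in adj[x] if n in color}
            color[x] = next(c for c in range(len(xs)) if c not in taken)
        return color

    # Toy example (assumed for illustration): f(X, Y) = (X + Y) mod 2,
    # X and Y uniform on {0, 1, 2, 3} with full support.
    xs = ys = range(4)
    p = lambda x, y: 1 / 16
    f = lambda x, y: (x + y) % 2
    colors = greedy_coloring(xs, characteristic_graph(xs, ys, p, f))
    print(colors)  # values of x with the same parity share a color, so 1 bit suffices for X

In this toy case the encoder for X only needs to convey the color (the parity of X) rather than X itself, which is the rate saving that functional compression exploits.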