We consider the remote computation of a function of two sources, one of which is available as receiver side information. Specifically, given side information Y, we wish to compute f(X, Y) based on information transmitted by X over a noiseless channel. The goal is to characterize the minimal rate at which X must transmit information to enable the computation of the function f. Orlitsky and Roche (2001) established that this rate is given by the conditional graph entropy of the characteristic graph of the function. Their achievability scheme, however, does not separate “functional coding” from the well-understood distributed source coding problem. In this paper, we seek a separation between functional coding and coding for correlation. In other words, we want to preprocess X (with respect to f) and then transmit the preprocessed data using a standard Slepian-Wolf coding scheme at the Orlitsky-Roche rate. We establish that a minimum (conditional) entropy coloring of the product of characteristic graphs is asymptotically equal to the conditional graph entropy. This generalizes a result of Körner (1973), which shows that the minimum entropy coloring of product graphs is asymptotically equal to the graph entropy. Thus, graph coloring provides a modular technique for achieving coding gains when the receiver is interested in decoding only a function of the source.
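To make the coloring-based preprocessing concrete, the following is a minimal sketch (a toy example of our own choosing, not from the paper): it builds the characteristic graph of a function f(X, Y), properly colors it greedily, and shows that the receiver, knowing Y and the transmitted color alone, can recover f. The alphabets, joint support, and choice of f below are illustrative assumptions.

```python
from itertools import combinations

# Toy setting (illustrative assumption): X and Y take values in {0, 1, 2},
# every pair (x, y) is jointly possible, and the receiver wants f(x, y) = (x + y) % 2.
X = [0, 1, 2]
Y = [0, 1, 2]
support = {(x, y) for x in X for y in Y}  # jointly possible (x, y) pairs
f = lambda x, y: (x + y) % 2

# Characteristic graph G_X: connect x1 and x2 when some y makes them
# "confusable", i.e. both (x1, y) and (x2, y) are possible but f differs.
edges = {
    (x1, x2)
    for x1, x2 in combinations(X, 2)
    if any((x1, y) in support and (x2, y) in support and f(x1, y) != f(x2, y)
           for y in Y)
}

# Greedy proper coloring of G_X: same-colored vertices are never confusable,
# so the sender may transmit the color of x instead of x itself.
color = {}
for x in X:
    used = {color[u] for (u, v) in edges if v == x and u in color} | \
           {color[v] for (u, v) in edges if u == x and v in color}
    color[x] = min(c for c in range(len(X)) if c not in used)

# The receiver, knowing y and the received color, recovers f(x, y) unambiguously.
def decode(c, y):
    values = {f(x, y) for x in X if color[x] == c and (x, y) in support}
    assert len(values) == 1, "a proper coloring guarantees a unique value"
    return values.pop()

print(color)  # here the parity of x suffices: {0: 0, 1: 1, 2: 0}
```

In this toy case the three source symbols collapse to two colors, and the color sequence (here followed by Slepian-Wolf coding in the actual scheme) is what the sender would transmit.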