We consider the following rate distortion problem: given a source X and correlated side information Y available at the decoder, find the minimum encoding rate for X required to compute f(X,Y) at the decoder within distortion D. This is a generalization of the classical Wyner-Ziv setup and was resolved by Yamamoto (1982). However, his result involves an auxiliary random variable that lacks explicit meaning. To provide a more direct link between this variable and the function f, Orlitsky and Roche (2001) established the minimal rate required in the zero-distortion case as an extension of Körner’s graph entropy. Recently, we (with Jaggi) showed that the zero-distortion rate can be achieved by minimum-entropy graph coloring of an appropriate product graph. This leads to a modular architecture for functional source coding in which a preprocessing “functional coding” scheme operates on top of a classical Slepian-Wolf source coding scheme. In this paper, we give a characterization of Yamamoto’s rate distortion function in terms of a reconstruction function. This (non-single-letter) characterization extends both our previous results and those of Orlitsky and Roche. We obtain a modular scheme, operating in conjunction with a Slepian-Wolf code, for the functional rate distortion problem. Further, we give an achievable rate, with a single-letter characterization, that utilizes this scheme and naturally extends our previous results.
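For context, the zero-distortion benchmark referenced above can be sketched as follows; the notation is ours: $G_X$ denotes the characteristic graph of $X$ with respect to $f$ and $Y$, $\Gamma(G_X)$ the collection of its independent sets, and $G_X^n$ its $n$-fold OR product.

\begin{align*}
  R_f(0) \;&=\; H_{G_X}(X \mid Y)
          \;=\; \min_{\substack{W - X - Y \\ X \in W \in \Gamma(G_X)}} I(W; X \mid Y)
          && \text{(Orlitsky--Roche, zero distortion)} \\
  H_{G_X}(X \mid Y) \;&=\; \lim_{n \to \infty} \frac{1}{n} \, \min_{c} \, H\bigl(c(X^n) \mid Y^n\bigr)
          && \text{(minimum over valid colorings $c$ of $G_X^n$)}
\end{align*}

The second line is a restatement, in this notation, of the coloring result mentioned above: the colors $c(X^n)$ can then be compressed with a Slepian-Wolf code against the side information $Y^n$, which is what yields the modular, layered architecture.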