We study a network of n nodes communicating via noisy channels, where each node aims to compute a given function of the data distributed across the network. Using information-theoretic inequalities, we derive a lower bound on the amount of information that must be communicated between nodes for the mean square error of their estimates to converge to zero. From this bound we obtain a lower bound on the rate of the channel code required when the mean square error must converge to zero exponentially at a prescribed rate. We further show how the bound can be applied to different cut-sets of a communication network to obtain a lower bound on the computation time needed for the error in the nodes' estimates to enter a prescribed interval around zero. Finally, we present a scenario in which our bound on the computation time is tight.
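To give a feel for the kind of relationship the abstract describes, the sketch below uses the classical Gaussian rate-distortion function R(D) = (1/2) log2(sigma^2/D), a standard single-source result and not the paper's actual network bound: if the target mean square error D_t decays exponentially in time t, the minimum number of bits grows linearly in t, which is the flavor of the claimed link between exponential error decay and a minimum channel-code rate. The variance, decay rate, and time values are illustrative assumptions.

```python
import math


def gaussian_rate_distortion(var: float, mse: float) -> float:
    """Minimum bits per sample to describe a Gaussian source of variance
    `var` with mean square error at most `mse` (classical result):
    R(D) = 1/2 * log2(var / D) for D < var, and 0 otherwise."""
    if mse >= var:
        return 0.0  # the zero estimate already achieves this distortion
    return 0.5 * math.log2(var / mse)


# Illustrative (assumed) parameters: unit-variance source, exponential
# target MSE decay D_t = var * exp(-2 * beta * t).  Substituting into
# R(D) shows the required bits grow linearly in t at slope beta / ln 2.
var, beta = 1.0, 0.5
for t in (1, 2, 4):
    d_t = var * math.exp(-2 * beta * t)
    print(f"t={t}: target MSE={d_t:.4f}, "
          f"required bits={gaussian_rate_distortion(var, d_t):.4f}")
```

Under these assumptions the per-unit-time rate beta / ln 2 is constant, mirroring the abstract's statement that a fixed exponential convergence rate imposes a fixed lower bound on the channel-code rate.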
Lower bounds on information rates for distributed computation via noisy channels. https://www.researchgate.net/publication/289270524_Lower_bounds_on_information_rates_for_distributed_computation_via_noisy_channels