A network of nodes communicates via noisy channels. Each node holds a real-valued initial measurement or message, and each node aims to acquire an estimate of a given function of all the initial measurements in the network. We survey our recent results relating the limitations imposed by the communication constraints to the nodes’ performance in computing the desired function. In particular, we derive a lower bound on the computation time that any algorithm the nodes use to communicate and compute must satisfy in order to drive the mean square error of the nodes’ estimates within a given interval around zero. We apply the lower bound to a specific scenario in which it is asymptotically tight: nodes are required to learn a linear combination of the initial values in the network while communicating over erasure channels. Our results suggest that in this scenario the computation time scales reciprocally with the “conductance” of the network, a quantity that captures the information-flow bottleneck arising from the topology and the channel capacities.
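To make the notion of conductance concrete, the following is a minimal illustrative sketch, not the paper’s construction. It computes, by brute force over cuts, the standard graph-theoretic conductance of a small capacitated network: the minimum over node subsets S of the capacity leaving S divided by the smaller side’s volume. The exact normalization used in our results may differ; the function name `conductance` and the capacity dictionary are assumptions for illustration.

```python
from itertools import combinations

def conductance(n, capacity):
    """Brute-force conductance of a small n-node capacitated network.

    `capacity[(i, j)]` is the capacity of the directed link from i to j.
    Uses the standard definition: min over cuts S of
    (capacity crossing from S to its complement) / min(vol(S), vol(S^c)),
    where vol(S) is the total capacity of links leaving nodes in S.
    This normalization is illustrative and may differ from the paper's.
    """
    nodes = set(range(n))

    def vol(subset):
        return sum(c for (i, _), c in capacity.items() if i in subset)

    best = float("inf")
    # It suffices to enumerate subsets of size at most n // 2,
    # since min(vol(S), vol(S^c)) is symmetric in S and its complement.
    for size in range(1, n // 2 + 1):
        for combo in combinations(range(n), size):
            S = set(combo)
            cut = sum(c for (i, j), c in capacity.items()
                      if i in S and j not in S)
            denom = min(vol(S), vol(nodes - S))
            if denom > 0:
                best = min(best, cut / denom)
    return best

# Example: a 4-node ring with unit-capacity links in both directions.
caps = {}
for i in range(4):
    j = (i + 1) % 4
    caps[(i, j)] = caps[(j, i)] = 1.0

phi = conductance(4, caps)  # bottleneck cut splits the ring in half
```

On this ring, the worst cut separates two adjacent nodes from the other two, giving a small conductance; denser topologies with higher-capacity links yield larger conductance and, per the result above, shorter computation time.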