We consider the setup of nonparametric ‘blind regression’ for estimating the entries of a large m×n matrix from noisy observations of a small, random fraction of its entries. We assume that each row u∈[m] and each column i∈[n] of the matrix is associated with latent features x1(u) and x2(i), respectively, and that the (u,i)-th entry of the matrix, A(u,i), equals f(x1(u), x2(i)) for some latent function f. Our goal is to estimate the unobserved entries of the matrix as well as to “de-noise” the observed ones.
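To make the model concrete, here is a minimal synthetic instance in Python. The uniform latent features, the particular Lipschitz f, the noise level, and the sampling probability p are illustrative assumptions, not choices fixed by the model.

```python
# Minimal synthetic instance of the blind regression model (illustrative
# choices throughout: latent features, f, noise level, sampling probability).
import numpy as np

rng = np.random.default_rng(0)
m, n, p = 200, 300, 0.3            # matrix dimensions and observation probability

x1 = rng.uniform(0, 1, size=m)     # latent row features x1(u), compact domain [0, 1]
x2 = rng.uniform(0, 1, size=n)     # latent column features x2(i)

def f(a, b):
    """An arbitrary Lipschitz latent function on [0, 1]^2 (an assumption)."""
    return np.sin(2 * a) * np.cos(3 * b) + a * b

A = f(x1[:, None], x2[None, :])    # ground truth: A(u, i) = f(x1(u), x2(i))
mask = rng.random((m, n)) < p      # each entry observed independently w.p. p
Z = np.where(mask, A + 0.1 * rng.standard_normal((m, n)), np.nan)  # noisy, partial
```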
As the main result of this work, we introduce a neighbor-based estimation algorithm inspired by the classical Taylor series expansion. We establish its consistency when the underlying latent function f is Lipschitz, the latent features belong to a compact domain, and the fraction of observed entries in the matrix is at least max(m^{−1+δ}, n^{−1/2+δ}), for any δ>0. As an important byproduct, our analysis sheds light on the performance of the classical collaborative filtering (CF) algorithm for matrix completion, which is widely used in practice. Experiments with the MovieLens and Netflix datasets suggest that our algorithm provides a principled improvement over basic CF and is competitive with matrix factorization methods.
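The sketch below illustrates one neighbor-based estimate in this spirit, close to user-user collaborative filtering: rows whose overlapping observations differ from row u by a near-constant shift are treated as neighbors (the first-order Taylor intuition), and their shifted values at column i are averaged. The threshold beta, the minimum overlap, and the function name are hypothetical choices for illustration, not the paper’s exact estimator or tuned parameters.

```python
# Hedged sketch of a neighbor-based estimate of entry (u, i) from a partially
# observed matrix Z (NaN = unobserved). Parameters are illustrative only.
import numpy as np

def estimate_entry(Z, u, i, beta=0.5, min_overlap=2):
    m, _ = Z.shape
    preds = []
    for v in range(m):
        if v == u or np.isnan(Z[v, i]):
            continue
        overlap = ~np.isnan(Z[u]) & ~np.isnan(Z[v])  # columns observed in both rows
        overlap[i] = False                           # exclude the target column
        if overlap.sum() < min_overlap:
            continue
        diff = Z[u, overlap] - Z[v, overlap]         # row difference on shared columns
        if diff.var() <= beta:                       # near-constant shift => neighbor
            preds.append(Z[v, i] + diff.mean())      # transfer v's value, shifted
    return np.mean(preds) if preds else np.nan

```

With Z from the synthetic example above, estimate_entry(Z, u, i) can be called on any missing position (or an observed one, for de-noising); positions with no qualifying neighbor are left as NaN.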
Our algorithm has a natural extension to tensor completion. For a t-order balanced tensor with a total of N entries, we prove that our approach provides a consistent estimator when at least an N^{−⌊2t/3⌋/(2t)+δ} fraction of entries is observed, for any δ>0. When applied to the setting of image in-painting (a tensor of order 3), we find that our approach is competitive with state-of-the-art tensor completion algorithms across benchmark images.
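One way to reuse the matrix estimator for an order-3 tensor is to unfold the tensor into a matrix, fill in the missing entries, and fold the result back. The sketch below takes this route using the hypothetical estimate_entry above; which modes to merge in the unfolding is an illustrative choice, not a prescription from the analysis.

```python
# Hedged sketch of order-3 tensor completion via flattening: unfold the tensor
# into a matrix, fill missing entries with the matrix estimator above, refold.
# Reuses the illustrative estimate_entry sketch; mode grouping is an assumption.
import numpy as np

def complete_tensor3(T):
    d1, d2, d3 = T.shape
    Z = T.reshape(d1, d2 * d3)                   # merge modes 2 and 3 into columns
    out = Z.copy()
    for u, i in zip(*np.where(np.isnan(Z))):     # visit every missing entry
        out[u, i] = estimate_entry(Z, u, i)      # neighbor-based matrix estimate
    return out.reshape(d1, d2, d3)
```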