Inferring rankings using constrained sensing

Year
2011
Type(s)
Journal article
Author(s)
S. Jagabathula and D. Shah
Source
IEEE Transactions on Information Theory, Volume 57, No. 11, pp. 7288-7306, 2011
Url
http://ieeexplore.ieee.org/stamp/stamp.jsp?arnumber=6071756

We consider the problem of recovering a function over the space of permutations of n elements (the symmetric group) from given partial information; the partial information we consider is related to the group-theoretic Fourier transform of the function. This problem arises naturally in several settings, such as ranked elections, multi-object tracking, ranking systems, and recommendation systems. Inspired by the work of Donoho and Stark in the context of discrete-time functions, we focus on non-negative functions with sparse support (support size ≪ domain size). Our recovery method is based on finding the sparsest solution (through ℓ0 optimization) that is consistent with the available information. As the main result, we derive sufficient conditions under which a function can be recovered exactly from partial information through ℓ0 optimization. Under a natural random model for the generation of functions, we quantify these recoverability conditions by deriving bounds on the sparsity (support size) for which the function satisfies the sufficient conditions with high probability as n → ∞. Because ℓ0 optimization is computationally hard, the popular compressive sensing literature instead solves its convex relaxation, ℓ1 optimization, to find the sparsest solution. However, we show that ℓ1 optimization fails to recover a function (even one with constant sparsity) generated under the random model with high probability as n → ∞. To overcome this problem, we propose a novel iterative algorithm that recovers functions satisfying the sufficient conditions. Finally, using an information-theoretic framework, we study necessary conditions for exact recovery to be possible.
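To make the setup concrete, the Python sketch below is a hypothetical toy illustration, not the paper's algorithm: it assumes the partial information is the first-order marginal matrix M, where M[i, j] sums the function over all permutations placing element i in position j (one block of the group-theoretic Fourier transform), and it performs a brute-force ℓ0 search over supports of increasing size. The function names, the support bound max_support, and the least-squares consistency check are illustrative choices; the search is exponential in the support size and only feasible for very small n.

# Toy sketch of ell_0 recovery from first-order marginals (illustrative only).
from itertools import permutations, combinations
import numpy as np

def marginal_matrix(func, n):
    """M[i, j] = sum of func over permutations that place element i in position j."""
    M = np.zeros((n, n))
    for perm, weight in func.items():
        for pos, elem in enumerate(perm):
            M[elem, pos] += weight
    return M

def l0_recover(M, n, max_support):
    """Search supports of increasing size; return the sparsest non-negative
    function whose marginals match the observed matrix M."""
    perms = list(permutations(range(n)))
    # Column of A for a permutation = its vectorised marginal (a permutation matrix).
    A = np.stack([marginal_matrix({p: 1.0}, n).ravel() for p in perms], axis=1)
    b = M.ravel()
    for k in range(1, max_support + 1):
        for support in combinations(range(len(perms)), k):
            x, *_ = np.linalg.lstsq(A[:, support], b, rcond=None)
            if np.all(x > -1e-9) and np.allclose(A[:, support] @ x, b):
                return {perms[i]: float(w) for i, w in zip(support, x)}
    return None

# Example: a function supported on two rankings of n = 4 items.
n = 4
truth = {(0, 1, 2, 3): 2.0, (2, 0, 3, 1): 1.0}
M = marginal_matrix(truth, n)
print(l0_recover(M, n, max_support=3))

In this tiny example the two true rankings and their weights are the unique sparsest non-negative fit to the marginals; the paper's contribution is characterizing when such recovery is possible for general n and providing an algorithm that avoids this exhaustive search.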