We tackle linear inverse problems involving low-rank matrices by preserving their non-convex structure. To this end, we present and analyze Matrix ALPS, a new set of low-rank recovery algorithms within the class of hard thresholding methods. We describe how basic algorithmic “ingredients” can be combined into different configurations to achieve complexity vs. accuracy tradeoffs. Moreover, we propose acceleration schemes that use memory-based techniques and randomized, ε-approximate, low-rank projections to speed up convergence and decrease the computational cost of the recovery process. For all these cases, we present theoretical analysis that guarantees convergence under mild problem conditions.
The algorithm is presented in “Matrix Recipes for Hard Thresholding Methods”, Technical Report, by Anastasios Kyrillidis and Volkan Cevher.
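To illustrate the hard-thresholding template behind this family of methods, here is a minimal sketch of iterative hard thresholding for low-rank recovery: a gradient step on the data-fit term followed by projection onto the set of rank-r matrices via a truncated SVD. The function names, step size, and use of callables for the measurement operator are illustrative assumptions; this is not the Matrix ALPS implementation from the report.

```python
import numpy as np

def rank_r_project(X, r):
    # Hard thresholding onto rank-r matrices: keep the top-r singular
    # values/vectors, i.e. the best rank-r approximation of X.
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return (U[:, :r] * s[:r]) @ Vt[:r]

def iht_low_rank(A, At, y, shape, r, iters=100, step=1.0):
    # Generic iterative hard thresholding for y = A(X0) + noise:
    # take a gradient step on ||y - A(X)||^2, then project back to rank <= r.
    # A is the linear measurement operator, At its adjoint (both callables).
    X = np.zeros(shape)
    for _ in range(iters):
        X = rank_r_project(X - step * At(A(X) - y), r)
    return X
```

With the identity operator (full observations), the first projection already returns the best rank-r approximation of the data; the full algorithms in the report add adaptive step sizes, memory, and approximate projections on top of this skeleton.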
Moreover, we propose Matrix ALPS for recovering a sparse plus low-rank decomposition of a matrix given its corrupted and incomplete linear measurements. Our approach is a first-order projected gradient method over non-convex sets, and it exploits a well-known memory-based acceleration technique. We theoretically characterize the convergence properties of Matrix ALPS using the stable embedding properties of the linear measurement operator. We then numerically illustrate that our algorithm outperforms existing convex and non-convex state-of-the-art algorithms in computational efficiency without sacrificing stability.
The algorithm is presented in “Matrix ALPS: Accelerated Low Rank and Sparse Matrix Reconstruction”, Technical Report, by Anastasios Kyrillidis and Volkan Cevher.
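The projected-gradient idea over the two non-convex sets can be sketched as follows for the fully observed case: alternate hard-thresholding projections onto rank-r matrices and k-sparse matrices, with a simple momentum term standing in for the memory-based acceleration. All names, the step size, and the momentum coefficient are illustrative assumptions, not the paper's Matrix ALPS algorithm.

```python
import numpy as np

def keep_top_k(X, k):
    # Hard thresholding onto k-sparse matrices: keep the k largest-magnitude
    # entries of X and zero out the rest.
    out = np.zeros_like(X)
    idx = np.unravel_index(np.argsort(np.abs(X), axis=None)[-k:], X.shape)
    out[idx] = X[idx]
    return out

def keep_rank_r(X, r):
    # Hard thresholding onto rank-r matrices via a truncated SVD.
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return (U[:, :r] * s[:r]) @ Vt[:r]

def sparse_plus_low_rank(Y, r, k, iters=100, step=0.5, beta=0.3):
    # Recover Y ~ L + S with rank(L) <= r and S k-sparse, via projected
    # gradient steps on ||Y - L - S||^2; beta adds a momentum ("memory") term.
    L = np.zeros_like(Y)
    S = np.zeros_like(Y)
    L_prev = L
    for _ in range(iters):
        Q = L + beta * (L - L_prev)              # momentum extrapolation
        L_prev = L
        L = keep_rank_r(Q + step * (Y - Q - S), r)
        S = keep_top_k(Y - L, k)
    return L, S
```

The iterates stay in the non-convex constraint sets by construction: L always has rank at most r and S has at most k nonzeros.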