Systems | Information | Learning | Optimization
 

Gradient Descent For Matrix Completion

Video: https://vimeo.com/201151747

From recommender systems to healthcare analytics, low-rank recovery from partial observations is prevalent in modern data analysis. There has been significant progress over the last decade in providing rigorous guarantees for low-rank recovery problems based on convex relaxation techniques. However, the computational complexity of these algorithms renders them impractical for large-scale applications. Recent advances in nonconvex optimization explain the surprising effectiveness of simple first-order algorithms for many low-rank matrix recovery problems, especially for positive semidefinite matrices. The common theme of these algorithms is to work directly with the low-rank factors of the semidefinite variable. In this talk, I will discuss how similar ideas can be applied to rectangular matrix completion. We provide a rigorous convergence guarantee showing that such simple algorithms are effective and can overcome the scalability limits faced by the popular convex relaxation approach.
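To make the factored approach concrete, below is a minimal Python sketch of gradient descent on a rectangular matrix-completion objective, where the unknown matrix is parameterized as U V^T and the squared error is measured only on observed entries. The function name, spectral initialization, and step-size choice are illustrative assumptions for this sketch, not the speaker's exact algorithm or guarantees.

```python
import numpy as np

def factored_gd_matrix_completion(M_obs, mask, rank, iters=500, step=None):
    """Gradient descent on f(U, V) = 0.5 * || mask * (U V^T - M_obs) ||_F^2.

    M_obs : observed entries (zeros where unobserved), shape (m, n)
    mask  : 0/1 or boolean array marking observed entries, shape (m, n)
    """
    p = mask.mean()  # estimated observation probability
    # Spectral initialization: top-`rank` SVD of the rescaled observed matrix.
    U0, s, Vt0 = np.linalg.svd(M_obs / max(p, 1e-12), full_matrices=False)
    U = U0[:, :rank] * np.sqrt(s[:rank])
    V = Vt0[:rank, :].T * np.sqrt(s[:rank])
    if step is None:
        step = 0.25 / s[0]  # conservative step tied to the leading singular value
    for _ in range(iters):
        R = mask * (U @ V.T - M_obs)          # residual on observed entries only
        U, V = U - step * (R @ V), V - step * (R.T @ U)  # simultaneous updates
    return U, V

# Tiny usage example: synthetic rank-2 matrix with roughly half the entries observed.
if __name__ == "__main__":
    rng = np.random.default_rng(1)
    m, n, r = 60, 50, 2
    M = rng.standard_normal((m, r)) @ rng.standard_normal((r, n))
    mask = rng.random((m, n)) < 0.5
    U, V = factored_gd_matrix_completion(mask * M, mask, rank=r, iters=2000)
    print("relative recovery error:", np.linalg.norm(U @ V.T - M) / np.linalg.norm(M))
```

Spectral initialization followed by constant-step gradient descent on the factors is the common recipe in this line of work; the sketch omits the regularization and sampling-rate conditions that a formal convergence guarantee would require.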

January 25 @ 12:30 pm (1h)

Discovery Building, Orchard View Room

Qinqing Zheng