We describe how Lasso-based linear regression may be corrected to accommodate systematic corruptions, and discuss the challenges due to non-convexity that may arise. Nonetheless, we provide theoretical results guaranteeing that all local optima of the corrected Lasso estimator are close to the global optimum, and describe how composite gradient descent may be used to obtain a near-global optimum with a linear convergence rate. Moving beyond real-valued data, we describe how similar ideas may be used to overcome systematic additive noise in the context of compressed sensing MRI. Finally, we discuss how our results on corrected linear regression lead to new procedures for structural estimation in undirected graphical models with corrupted observations, in both the Gaussian and discrete cases.
This is joint work with Martin Wainwright.
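To make the corrected-Lasso idea concrete, the following is a minimal sketch (not the speakers' implementation) for the additive-noise setting: when covariates are observed with known noise covariance, the usual Gram matrix is replaced by an unbiased but possibly non-convex surrogate, and the resulting objective is minimized by composite gradient descent with soft-thresholding. The problem sizes, noise level `sigma_w`, penalty `lam`, and step size `eta` are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, k = 200, 50, 5           # samples, dimension, sparsity (illustrative)
beta_star = np.zeros(p)
beta_star[:k] = 1.0
X = rng.standard_normal((n, p))
y = X @ beta_star + 0.1 * rng.standard_normal(n)

sigma_w = 0.2                  # known covariate-noise std (assumed for this sketch)
Z = X + sigma_w * rng.standard_normal((n, p))   # observed corrupted covariates

# Corrected surrogates for X^T X / n and X^T y / n; subtracting the noise
# covariance makes Gamma unbiased but can leave it non-positive-semidefinite.
Gamma = Z.T @ Z / n - sigma_w**2 * np.eye(p)
gamma = Z.T @ y / n

def soft_threshold(v, t):
    """Proximal operator of the l1 penalty."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

lam, eta = 0.1, 0.1            # penalty and step size (heuristic choices)
beta = np.zeros(p)
for _ in range(500):
    # Composite gradient step on (1/2) b^T Gamma b - gamma^T b + lam * ||b||_1.
    # (The theory additionally imposes a side constraint ||b||_1 <= R,
    # omitted here for simplicity.)
    grad = Gamma @ beta - gamma
    beta = soft_threshold(beta - eta * grad, eta * lam)
```

Despite the objective being non-convex, the iterates settle near the true support, consistent with the guarantee that all local optima lie close together.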
Discovery Building, Orchard View Room
Po-Ling Loh