Systems | Information | Learning | Optimization

Recovery of Simultaneously Structured Models with Limited Information

Finding models with a low-dimensional structure, given a number of linear observations much smaller than the ambient dimension, has been well-studied in recent years. Examples of such models are sparse vectors, low-rank matrices, and the sum of sparse and low-rank matrices, among others.

In many signal processing and machine learning applications, the desired model has \emph{multiple} structures simultaneously. Examples include recovering a sparse signal from phaseless measurements (sparse phase retrieval), and learning models with several structural priors in machine learning tasks.

Often, convex penalties that promote the individual structures are known and allow recovery from a minimal number of generic measurements (e.g., the $\ell_1$ norm for sparsity, the nuclear norm for low matrix rank), so it is reasonable to minimize a combination of such norms to recover a simultaneously structured model. We show that, surprisingly, if we use multiobjective optimization with the individual norms, we can do no better (order-wise) in terms of the number of required measurements than an algorithm that exploits only one of the structures. This result holds in a general setting and suggests that fully exploiting the multiple structures requires an entirely new convex relaxation, not one that is a function of the relaxations used for each individual structure. We also consider denoising for simultaneously structured signals and provide bounds on the minimax denoising risk.
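To make the combined-relaxation approach concrete, here is a minimal sketch (not code from the talk) of the kind of program the abstract refers to: recovering a matrix that is simultaneously sparse and low rank from generic linear measurements by minimizing a weighted sum of the entrywise $\ell_1$ norm and the nuclear norm. The problem sizes, the trade-off weight `lam`, and the use of CVXPY are illustrative assumptions; the talk's point is that, order-wise, such combined relaxations cannot require fewer measurements than exploiting just one of the structures.

```python
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(0)
n, r, k, m = 20, 2, 5, 120   # matrix size, rank, support size, measurement count (assumed)

# Ground truth: a rank-r matrix supported on a k x k block,
# so it is simultaneously sparse and low rank.
X0 = np.zeros((n, n))
X0[:k, :k] = rng.standard_normal((k, r)) @ rng.standard_normal((r, k))

# m generic linear measurements b_i = <A_i, X0>.
A = rng.standard_normal((m, n, n))
b = np.array([np.sum(A[i] * X0) for i in range(m)])

# Combined convex relaxation: the entrywise l1 norm promotes sparsity,
# the nuclear norm promotes low rank; `lam` is an assumed trade-off weight.
lam = 0.5
X = cp.Variable((n, n))
objective = cp.Minimize(lam * cp.sum(cp.abs(X)) + (1 - lam) * cp.normNuc(X))
constraints = [cp.sum(cp.multiply(A[i], X)) == b[i] for i in range(m)]
cp.Problem(objective, constraints).solve()

print("relative recovery error:", np.linalg.norm(X.value - X0) / np.linalg.norm(X0))
```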

April 23 @ 12:30 pm (1h)

Discovery Building, Orchard View Room

Maryam Fazel