In many signal processing and machine learning applications, the desired model has \emph{multiple} structures simultaneously. Examples include recovering a sparse signal from phaseless measurements (sparse phase retrieval) and learning models that must satisfy several structural priors at once.
Convex penalties that promote the individual structures are often known and recover the model from a near-minimal number of generic measurements (e.g., the $\ell_1$ norm for sparsity, the nuclear norm for low matrix rank), so it is natural to minimize a combination of such norms to recover a simultaneously structured model. We show that, surprisingly, multi-objective optimization with the individual norms can do no better (order-wise) in terms of the required number of measurements than an algorithm that exploits only one of the structures. This result holds in a general setting and suggests that fully exploiting the multiple structures requires an entirely new convex relaxation, not one that is a function of the relaxations used for each structure. We also consider denoising for simultaneously structured signals and provide bounds on the minimax denoising risk.
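As a concrete instance of the combined-norm approach discussed above (the notation here is illustrative and not taken from the talk): for a model $X_0$ that is, say, simultaneously sparse and low rank, observed through generic linear measurements $y = \mathcal{A}(X_0)$, one would solve
\[
\min_{X} \;\; \lambda_1 \|X\|_1 + \lambda_2 \|X\|_* \quad \text{subject to} \quad \mathcal{A}(X) = y,
\]
where $\|\cdot\|_1$ is the entrywise $\ell_1$ norm, $\|\cdot\|_*$ is the nuclear norm, and $\lambda_1, \lambda_2 \ge 0$ are weights. The result stated above says that no such weighted combination improves, order-wise, on the number of measurements needed by an algorithm that exploits only one of the two structures.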
Discovery Building, Orchard View Room
Maryam Fazel