Systems | Information | Learning | Optimization
 

Data-Driven Discovery and Control of Complex Systems: Uncovering Interpretable and Generalizable Nonlinear Models

Accurate and efficient reduced-order models are essential to understand, predict, estimate, and control complex, multiscale, and nonlinear dynamical systems. These models should ideally be generalizable, interpretable, and based on limited training data. This work develops a general framework to discover the governing equations underlying a dynamical system simply from data …
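The truncated abstract does not name the method, but a standard framework for discovering governing equations from data is sparse regression over a library of candidate terms (SINDy-style sequentially thresholded least squares). The following is a minimal sketch of that idea only; the synthetic system dx/dt = -2x, the candidate library [1, x, x^2], and the threshold 0.1 are illustrative assumptions, not details from the talk.

```python
import numpy as np

# Hypothetical example: recover dx/dt = -2x from noisy derivative
# samples by sparse regression over a small candidate library.
rng = np.random.default_rng(0)
x = np.linspace(0.1, 2.0, 200)
dxdt = -2.0 * x + 0.001 * rng.standard_normal(x.size)  # "measured" derivatives

# Candidate library Theta(x): each column is a candidate term.
Theta = np.column_stack([np.ones_like(x), x, x ** 2])

# Sequentially thresholded least squares: coefficients below the
# threshold are zeroed and the regression is re-solved on the rest.
xi = np.linalg.lstsq(Theta, dxdt, rcond=None)[0]
for _ in range(10):
    small = np.abs(xi) < 0.1
    xi[small] = 0.0
    active = ~small
    xi[active] = np.linalg.lstsq(Theta[:, active], dxdt, rcond=None)[0]

print(np.round(xi, 3))  # expect roughly [0, -2, 0]
```

The thresholding step is what makes the recovered model interpretable: only a few library terms survive, so the output is a readable equation rather than a black-box fit.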

Multistage Distributionally Robust Optimization with Total Variation Distance: Modeling and Effective Scenarios

Traditional multistage stochastic optimization assumes the underlying probability distribution is known. However, in practice, the probability distribution is often not known or cannot be accurately approximated. One way to address such distributional ambiguity is to use distributionally robust optimization (DRO), which minimizes the worst-case expected cost with respect to a …
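On a finite scenario set, the inner worst-case expectation over a total-variation ball around a nominal distribution is a small linear program. A minimal single-stage sketch, assuming SciPy's `linprog`; the three-scenario costs and nominal probabilities are hypothetical, and the talk's multistage structure is not modeled here.

```python
import numpy as np
from scipy.optimize import linprog

def worst_case_expectation(costs, nominal, radius):
    """Maximize sum_i p_i * c_i over distributions p with total
    variation distance (1/2) * sum_i |p_i - q_i| <= radius from q."""
    n = len(costs)
    # Variables: [p_1..p_n, t_1..t_n], where t_i >= |p_i - q_i|.
    c = np.concatenate([-np.asarray(costs, float), np.zeros(n)])  # maximize via min of -c^T p
    # t_i >= p_i - q_i   and   t_i >= q_i - p_i
    A_ub = np.block([[np.eye(n), -np.eye(n)],
                     [-np.eye(n), -np.eye(n)]])
    b_ub = np.concatenate([np.asarray(nominal, float), -np.asarray(nominal, float)])
    # Total variation budget: (1/2) sum_i t_i <= radius
    A_ub = np.vstack([A_ub, np.concatenate([np.zeros(n), 0.5 * np.ones(n)])])
    b_ub = np.append(b_ub, radius)
    A_eq = np.concatenate([np.ones(n), np.zeros(n)])[None, :]   # probabilities sum to 1
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                  bounds=[(0, None)] * (2 * n))
    return -res.fun

# Hypothetical example: the adversary moves mass from the cheapest
# scenario to the costliest one, within the TV budget (here = 2.5).
print(worst_case_expectation([1.0, 2.0, 5.0], [0.5, 0.3, 0.2], radius=0.1))
```

The optimal worst-case distribution concentrates the allowed mass shift on the extreme scenarios, which is one way "effective scenarios" (those that actually bind the worst case) can be identified.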

Information-theoretic Privacy: Leakage measures, robust privacy guarantees, and generative adversarial mechanism design

Privacy is the problem of ensuring limited leakage of information about sensitive features while sharing useful information (utility) about non-private features with legitimate data users. Even as differential privacy has emerged as a strong desideratum for privacy, there is an equally strong need for context-aware utility-guaranteeing approaches in many data …
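The abstract mentions leakage measures without specifying one; mutual information between the sensitive feature and the released data is one classical information-theoretic leakage measure. A minimal sketch under that assumption, with hypothetical binary joint distributions:

```python
import numpy as np

def mutual_information_bits(p_xy):
    """I(X;Y) in bits for a joint pmf given as a 2-D array."""
    p_xy = np.asarray(p_xy, float)
    px = p_xy.sum(axis=1, keepdims=True)   # marginal of X (sensitive)
    py = p_xy.sum(axis=0, keepdims=True)   # marginal of Y (released)
    mask = p_xy > 0                        # skip zero-probability cells
    return float(np.sum(p_xy[mask] * np.log2(p_xy[mask] / (px @ py)[mask])))

# A release independent of the sensitive bit leaks zero bits;
# a deterministic copy of it leaks one full bit.
independent = np.array([[0.25, 0.25], [0.25, 0.25]])
copy = np.array([[0.5, 0.0], [0.0, 0.5]])
print(mutual_information_bits(independent))  # 0.0
print(mutual_information_bits(copy))         # 1.0
```

A privacy mechanism in this framework trades off such a leakage measure on sensitive features against a utility measure on non-private ones.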

Deep Learning for Electronic Structure Computations: A Tale of Symmetries, Locality, and Physics

The recent surge of interest in deep learning has driven dramatic improvements in image and signal processing, fueling breakthroughs in many domains such as drug discovery, genomics, and automatic translation. These advances have been further applied to scientific computing and, in particular, to electronic structure computations. In this case, …

Learning with Dependent Data

Several important families of computational and statistical results in machine learning and randomized algorithms rely on statistical independence of data. The scope of such results includes the Johnson-Lindenstrauss Lemma (JLL), the Restricted Isometry Property (RIP), regression models, and stochastic optimization. In this talk, we will discuss a new result on …
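As a concrete instance of a result that relies on independence, the JLL states that a random projection with independent Gaussian entries preserves all pairwise distances up to small multiplicative distortion. A minimal numerical sketch; the dimensions and seed are arbitrary choices, not from the talk.

```python
import numpy as np

# Project n points from d dimensions down to k << d with an i.i.d.
# Gaussian map scaled so that E||Px||^2 = ||x||^2, then measure the
# worst relative distortion of pairwise squared distances.
rng = np.random.default_rng(0)
n, d, k = 50, 1000, 400
X = rng.standard_normal((n, d))
P = rng.standard_normal((d, k)) / np.sqrt(k)
Y = X @ P

orig_sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
proj_sq = ((Y[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
iu = np.triu_indices(n, 1)                       # distinct pairs only
distortion = np.max(np.abs(proj_sq[iu] / orig_sq[iu] - 1.0))
print(distortion)  # small for these sizes, as JLL predicts
```

The concentration argument behind this guarantee uses the independence of the entries of `P`; results for dependent data, as in this talk, must replace that step.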

Spectral relaxations and branching strategies for global optimization of mixed-integer quadratic programs

We consider the global optimization of nonconvex quadratic programs and mixed-integer quadratic programs. We present a family of convex quadratic relaxations which are derived by convexifying nonconvex quadratic functions through perturbations of the quadratic matrix. We investigate the theoretical properties of these quadratic relaxations and show that they are equivalent …
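The specific family of perturbations studied in the talk is not given here, but one classical perturbation of the quadratic matrix is a uniform diagonal shift by its minimum eigenvalue: the shifted matrix is positive semidefinite, and on binary variables (where x_i^2 = x_i) the correction term is linear, so the convexified function agrees with the original at every binary point. A sketch of that standard construction, offered as an assumption rather than the talk's method:

```python
import numpy as np

# Convexify x^T Q x by a diagonal shift: Q - lam_min * I is PSD, and
# for binary x the correction lam_min * sum x_i^2 = lam_min * sum x_i
# is linear, so the relaxation is exact at binary points.
rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5))
Q = (A + A.T) / 2                          # symmetric, generally indefinite
lam_min = np.linalg.eigvalsh(Q).min()
Q_shift = Q - lam_min * np.eye(5)          # PSD perturbation of Q

x = rng.integers(0, 2, size=5).astype(float)   # an arbitrary binary point
original = x @ Q @ x
convexified = x @ Q_shift @ x + lam_min * x.sum()
print(abs(original - convexified))  # ~0: the two agree on binaries
```

Stronger relaxations in this spirit vary the diagonal (or low-rank) perturbation rather than using a single scalar shift, which is where spectral information enters the branching and bounding scheme.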