Systems | Information | Learning | Optimization

Statistical Filtering for Optimization over Expectation Operators

The problem of optimizing objective functions that involve expectation or integral operators arises in many fields. It is commonly addressed using one of three frameworks, or hybrids thereof: Sample Average Approximation/Monte Carlo (SAA/MC), Bayesian Optimization (BO), and Stochastic Approximation (SA). While the methods in these frameworks have well-controlled theoretical properties and some desirable computational properties, they often face crippling practical challenges. For example, in the SAA/MC and BO paradigms, we can generate increasingly accurate estimates of the objective function and its derivatives, which is valuable for evaluating iterates and hyperparameters, yet this comes at an increasing per-iteration cost. In the SA paradigm, by contrast, we can handle increasing amounts of data at a fixed per-iteration cost, yet we lose the ability to evaluate iterates and hyperparameters, which can result in painfully slow convergence or exponential divergence. In this talk, we will introduce a new paradigm that achieves the best of both worlds: we will generate increasingly accurate estimates of the objective and its derivatives at a fixed per-iteration cost. We will demonstrate its potential on three problems drawn from statistics, machine learning, and stochastic optimal control.
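To make the cost/accuracy trade-off concrete, here is a minimal Python sketch on an assumed toy objective f(x) = E[(x - xi)^2]/2 with xi ~ N(0, 1), so the true gradient is f'(x) = x. The SAA/MC-style estimate sharpens as the sample size grows but costs more per evaluation; the SA-style step is O(1) per iteration but stays noisy. The final "filtering" snippet is only an illustrative stand-in (a running mean of single-sample gradients), not the algorithm from the talk, which is not specified in this abstract.

import numpy as np

rng = np.random.default_rng(0)

def avg_grad(x, n):
    """Average of n single-sample gradients (x - xi); cost grows with n."""
    xi = rng.standard_normal(n)
    return float(np.mean(x - xi))

x = 5.0  # true gradient at this point is f'(5.0) = 5.0

# SAA/MC flavor: the estimate sharpens as n grows, but each call costs O(n).
for n in (10, 1_000, 100_000):
    print(f"SAA n={n:>6}: grad ~ {avg_grad(x, n):.4f}")

# SA flavor (plain SGD): O(1) cost per step, but each gradient estimate is a
# single noisy sample, so assessing iterates and step sizes is hard.
x_sa = 5.0
for k in range(1, 101):
    x_sa -= avg_grad(x_sa, 1) / k   # one sample per step, diminishing step size
print(f"SA iterate after 100 steps: {x_sa:.4f}  (minimizer is 0)")

# Fixed-cost "filtering" flavor (illustrative stand-in, NOT the talk's method):
# at a fixed point x, a running mean of single-sample gradients sharpens with k
# while each update remains O(1).
g_bar = 0.0
for k in range(1, 10_001):
    g_bar += (avg_grad(x, 1) - g_bar) / k   # O(1) recursive mean update
print(f"Running-mean gradient estimate at x=5: {g_bar:.4f}")

Running the sketch shows the SAA estimate approaching 5.0 as n grows, the SA iterate drifting noisily toward the minimizer, and the recursive mean converging to 5.0 at constant per-step cost, which is the flavor of guarantee the abstract's new paradigm targets.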
November 21 @ 12:30 pm (1h)

Discovery Building, Orchard View Room

Vivak Patel