Systems | Information | Learning | Optimization
 

Large sample asymptotics of spectra of Laplacians and semilinear elliptic PDEs on random geometric graphs.

Given a data set $\mathcal{X}=\{x_1, \dots, x_n\}$ and a weighted graph structure $\Gamma= (\mathcal{X},W)$ on $\mathcal{X}$, graph-based learning methods use analytical notions like graph Laplacians, graph cuts, and Sobolev semi-norms to formulate optimization problems whose solutions serve as sensible approaches to machine learning tasks. When the data set …
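For concreteness, here is a minimal sketch (not from the talk) of the construction the abstract alludes to: given data points $x_1, \dots, x_n$, build a weighted graph with Gaussian edge weights and form its unnormalized graph Laplacian $L = D - W$, whose spectrum is the object studied in the large-sample limit. The bandwidth eps and the uniform sample below are illustrative assumptions, not choices from the talk.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 200, 2
X = rng.uniform(size=(n, d))   # data set X = {x_1, ..., x_n} (assumed sample)
eps = 0.1                      # kernel bandwidth (assumed parameter)

# Pairwise squared distances and Gaussian edge weights W_ij
sq_dists = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
W = np.exp(-sq_dists / eps**2)
np.fill_diagonal(W, 0.0)       # no self-loops

# Unnormalized graph Laplacian L = D - W
L = np.diag(W.sum(axis=1)) - W

# Smallest eigenvalues of L; suitably rescaled in n and eps, these are the
# quantities whose large-sample behavior results of this kind describe.
eigvals = np.linalg.eigvalsh(L)
print(eigvals[:5])
```

For a connected graph the smallest eigenvalue is (numerically) zero, and the low end of the spectrum is what spectral clustering and related graph-based methods actually use.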

Safety and Robustness Guarantees with Learning in the Loop

In this talk, we present recent progress towards developing learning-based control strategies for the design of safe and robust autonomous systems. Our approach is to recognize that machine learning algorithms produce inherently uncertain estimates or predictions, and that this uncertainty must be explicitly quantified (e.g., using non-asymptotic guarantees of contemporary …

Hardware Accelerators for Deep Learning: A Proving Ground for Specialized Computing

The computing industry has a power problem: the days of ideal transistor scaling are over, and chips now have more devices than can be fully powered simultaneously, limiting performance. New architecture-level solutions are needed to continue scaling performance, and specialized hardware accelerators are one such solution. While accelerators promise to …

Learning From Sub-Optimal Data

Learning algorithms typically assume their input data is good-natured. If one takes this data and trains an agent on it, then the agent should, given enough time and compute, eventually learn to solve the intended task. But this is not always a realistic expectation. Sometimes, the data …