Systems | Information | Learning | Optimization
 

Enabling Fast and Robust Federated Learning

In many large-scale machine learning applications, data is acquired and processed at the edge nodes of the network, such as mobile devices, users' devices, and IoT sensors. Federated learning is a recent distributed learning paradigm in which a model is trained over a set of such edge devices. While federated learning can enable a variety of new applications, it faces major bottlenecks that severely limit its reliability and scalability, including the communication bottleneck and the systems heterogeneity bottleneck. In this talk, we first focus on communication-efficient federated learning and present FedPAQ, a novel communication-efficient and scalable Federated learning method with Periodic Averaging and Quantization. FedPAQ is provably near-optimal in the following sense: under the problem setup of expected risk minimization with independent and identically distributed data points, when the loss function is strongly convex the proposed method converges to the optimal solution at a near-optimal rate, and when the loss function is non-convex it finds a first-order stationary point at a near-optimal rate. In the second part of the talk, we develop a robust federated learning algorithm that achieves satisfactory performance against distribution shifts in users' samples. Throughout, we present several numerical results that empirically support our theoretical findings.
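To make the "periodic averaging and quantization" idea concrete, the sketch below shows one possible communication round in this style: a subset of devices each run several local SGD steps, quantize the difference between their local model and the server model, and the server averages the quantized updates. The toy least-squares objective, the simple unbiased stochastic uniform quantizer, and all function names and parameter values are illustrative assumptions for exposition, not the implementation presented in the talk.

```python
import numpy as np

# Illustrative sketch of a FedPAQ-style round (assumed toy setup, not the
# authors' code): periodic local updates, partial device participation,
# and quantized messages from devices to the server.

rng = np.random.default_rng(0)

def local_grad(w, X, y):
    """Stochastic gradient of a least-squares loss on a random minibatch."""
    idx = rng.choice(len(y), size=16, replace=False)
    Xb, yb = X[idx], y[idx]
    return Xb.T @ (Xb @ w - yb) / len(yb)

def quantize(v, levels=8):
    """Unbiased stochastic uniform quantizer applied coordinate-wise."""
    norm = np.linalg.norm(v)
    if norm == 0:
        return v
    scaled = np.abs(v) / norm * levels        # values in [0, levels]
    lower = np.floor(scaled)
    prob = scaled - lower                     # randomized rounding keeps E[q] = v
    rounded = lower + (rng.random(v.shape) < prob)
    return np.sign(v) * rounded * norm / levels

def fedpaq_round(w_server, devices, tau=10, lr=0.05, r=5):
    """One communication round: sample r devices, run tau local SGD steps on
    each, quantize the model differences, and average them at the server."""
    sampled = rng.choice(len(devices), size=r, replace=False)
    updates = []
    for k in sampled:
        X, y = devices[k]
        w = w_server.copy()
        for _ in range(tau):                  # periodic averaging: tau local steps
            w -= lr * local_grad(w, X, y)
        updates.append(quantize(w - w_server))  # quantized message passing
    return w_server + np.mean(updates, axis=0)

# Synthetic i.i.d. federated data: 20 devices, 200 samples each, dimension 10.
d, w_true = 10, np.random.default_rng(1).standard_normal(10)
devices = []
for _ in range(20):
    X = rng.standard_normal((200, d))
    devices.append((X, X @ w_true + 0.1 * rng.standard_normal(200)))

w = np.zeros(d)
for _ in range(50):
    w = fedpaq_round(w, devices)
print("distance to optimum:", np.linalg.norm(w - w_true))
```

In this sketch, communication savings come from two knobs: fewer communication rounds (larger tau) and fewer bits per message (smaller levels), traded off against the accuracy of each round's averaged update.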
December 9 @ 12:30 pm (1h)

Remote

Ramtin Pedarsani

Video