Flexible, Adaptive, and Efficient Algorithms for Decentralized Optimization
Abstract: Decentralized optimization, in which multiple agents collaborate to solve an optimization problem over a network without relying on a central coordinator, has gained significant attention over the past few decades. This paradigm is particularly relevant in federated learning, where data is distributed across clients and privacy concerns, communication costs, and system heterogeneity pose unique challenges. More broadly, such approaches are essential for large-scale distributed systems with applications in machine learning, robotics, sensor networks, and smart grids, where scalability, robustness, and privacy are critical. Gradient tracking algorithms are a widely used class of methods for decentralized optimization, where each iteration consists of two key steps: a computation step, in which agents calculate local gradients, and a communication step, in which they exchange iterates and gradient estimates with neighboring agents. The relative cost of these two steps varies significantly across applications. In this talk, we introduce novel algorithmic frameworks that decouple these two steps, providing greater flexibility to adapt to various application demands. These flexible frameworks recover classical algorithms as special cases while offering improved adaptability. We establish optimal theoretical convergence and complexity guarantees for these frameworks and demonstrate the benefits of flexibility through numerical results. Additionally, we discuss adaptive network pruning strategies that significantly reduce communication overhead without compromising convergence properties.
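To make the two steps concrete, the following is a minimal sketch of one classical gradient tracking iteration (the kind of baseline the abstract says the new frameworks recover as a special case), not the speaker's proposed method. The quadratic local objectives, ring network, mixing matrix, and step size below are illustrative assumptions chosen only to produce a runnable example.

```python
import numpy as np

np.random.seed(0)
n_agents, dim = 4, 3

# Illustrative local objectives f_i(x) = 0.5*||A_i x - b_i||^2, with gradient A_i^T(A_i x - b_i).
A = [np.random.randn(5, dim) for _ in range(n_agents)]
b = [np.random.randn(5) for _ in range(n_agents)]

def grad(i, x):
    return A[i].T @ (A[i] @ x - b[i])

# Doubly stochastic mixing matrix for an assumed ring topology:
# each agent averages with its two neighbors.
W = np.zeros((n_agents, n_agents))
for i in range(n_agents):
    W[i, i] = 0.5
    W[i, (i - 1) % n_agents] = 0.25
    W[i, (i + 1) % n_agents] = 0.25

alpha = 0.01                                             # illustrative step size
x = np.zeros((n_agents, dim))                            # local iterates
y = np.array([grad(i, x[i]) for i in range(n_agents)])   # local gradient trackers

for k in range(500):
    # Communication step: exchange iterates and gradient estimates with neighbors.
    x_mix, y_mix = W @ x, W @ y
    # Computation step: descend along the tracked gradient, then update the tracker
    # with the change in the local gradient.
    x_new = x_mix - alpha * y
    y = y_mix + np.array([grad(i, x_new[i]) - grad(i, x[i]) for i in range(n_agents)])
    x = x_new

print("consensus error:", np.linalg.norm(x - x.mean(axis=0)))
```

In this classical scheme the communication and computation steps are locked together, one of each per iteration; the frameworks discussed in the talk decouple them so their frequencies can be tuned to the application's relative communication and computation costs.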
Biography: Raghu Bollapragada is an assistant professor in the Operations Research and Industrial Engineering graduate program at the University of Texas at Austin (UT). Before joining UT, he was a postdoctoral researcher in the Mathematics and Computer Science Division at Argonne National Laboratory. He received both his Ph.D. and M.S. degrees in Industrial Engineering and Management Sciences from Northwestern University. During his graduate studies, he was a visiting researcher at INRIA, Paris. His research broadly focuses on designing, developing, and analyzing algorithms for solving large-scale nonlinear optimization problems. He currently serves as the vice chair of the Nonlinear Optimization cluster for the INFORMS Optimization Society. He has received the IEMS Nemhauser Dissertation Award for the best dissertation, the IEMS Arthur P. Hurter Award for outstanding academic excellence, the McCormick Terminal Year Fellowship for an outstanding terminal-year Ph.D. candidate at Northwestern University, and the ME Walker Award at UT.