Industrial & Enterprise Systems Engineering Calendar

Dissertation Defense for Siqi Zhang

Event Type: Seminar/Symposium
Sponsor: ISE Graduate Programs
Location: Virtual
Date: Jun 13, 2022, 10:00 am - 12:00 pm
Speaker: Siqi Zhang
Contact: Lauren Redman
E-Mail: lredman@illinois.edu
Phone: 217-333-2252
Originating Calendar: ISE Academic Programs

Title: Structures and Complexities of Modern Nonconvex Optimization: Fundamental Limits and Efficient Algorithms

Abstract: Many modern machine learning applications are formulated as nonconvex optimization problems. The study of nonconvex optimization comes with two perspectives: finding the fundamental computational limits and designing efficient algorithms, which correspond to lower and upper complexity bounds in optimization theory, respectively. Rapidly developing machine learning technology brings new computational challenges, since modern models have intrinsically more complicated structure, which urges the development of new theory and efficient algorithms. This thesis studies several nonconvex optimization problems with special structures that have broad applications in machine learning; the discussion covers both the upper and lower complexity bound perspectives.

In the first part, we study the convergence of Stochastic Mirror Descent (SMD) for nonsmooth nonconvex minimization in non-Euclidean settings. We focus on stochastic optimization problems consisting of a relatively weakly convex function and a simple convex regularizer. We prove that, without the use of mini-batches, SMD is guaranteed to converge to a stationary point with a non-asymptotic sample complexity guarantee. The efficiency estimate matches existing results for the stochastic subgradient method while being evaluated under a stronger stationarity measure.
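For intuition, here is a minimal sketch of SMD in a non-Euclidean setting: the negative-entropy mirror map on the probability simplex (the exponentiated-gradient update), a single stochastic subgradient per step, and the convex regularizer omitted for brevity. The oracle name and the toy objective below are illustrative assumptions, not components specified in the thesis.

```python
import numpy as np

def smd_entropic(grad_oracle, x0, step, iters, rng):
    """Stochastic mirror descent on the probability simplex with the
    negative-entropy mirror map, i.e. the exponentiated-gradient update.

    grad_oracle(x, rng) should return one stochastic (sub)gradient of the
    objective at x; no mini-batching is used.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        g = grad_oracle(x, rng)      # single-sample stochastic subgradient
        x = x * np.exp(-step * g)    # mirror step in the dual space
        x /= x.sum()                 # map back onto the simplex
    return x

# Toy sanity check: a noisy quadratic pulled toward a point on the simplex.
rng = np.random.default_rng(0)
a = np.array([0.7, 0.2, 0.1])
oracle = lambda x, rng: (x - a) + 0.1 * rng.standard_normal(x.shape)
print(smd_entropic(oracle, np.ones(3) / 3, step=0.1, iters=2000, rng=rng))
```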

In the second part, we consider Conditional Stochastic Optimization (CSO) problems, which cover a variety of applications such as invariant learning, causal inference, and meta-learning. Constructing unbiased gradient estimators for such problems is challenging due to their nested composition structure. We propose several algorithms and establish their sample complexities for general nonconvex problems, which we later show match the corresponding lower complexity bounds. We further conduct numerical experiments on several tasks to illustrate the performance of the proposed algorithms.
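As a rough illustration of the composition difficulty, the sketch below builds one biased gradient estimate for an objective of the form F(x) = E_xi[ f( E_{eta|xi}[ g(x, eta, xi) ], xi ) ] by plugging a sample average of inner evaluations into the chain rule; the bias shrinks as the inner sample size m grows. All function and sampler names here are hypothetical placeholders, and the thesis may propose different estimators.

```python
import numpy as np

def biased_cso_gradient(x, sample_xi, sample_eta, g, jac_g, grad_f, m, rng):
    """One biased gradient estimate for F(x) = E_xi[ f(E_{eta|xi}[g(x, eta, xi)], xi) ].

    sample_xi / sample_eta draw the outer and conditional inner samples;
    g, jac_g, grad_f evaluate the inner map, its Jacobian in x, and the
    gradient of the outer function. Averaging m inner samples before
    applying grad_f makes the estimator biased, with bias decreasing in m.
    """
    xi = sample_xi(rng)
    etas = [sample_eta(xi, rng) for _ in range(m)]
    inner_mean = np.mean([g(x, eta, xi) for eta in etas], axis=0)
    jac_mean = np.mean([jac_g(x, eta, xi) for eta in etas], axis=0)
    return jac_mean.T @ grad_f(inner_mean, xi)   # plug-in chain rule

def sgd_with_biased_gradients(x0, grad_estimate, step, iters, rng):
    """Plain SGD loop driven by the biased estimates above."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        x = x - step * grad_estimate(x, rng)
    return x
```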

In the third part, we study the upper and lower complexity bounds of nonconvex minimax optimization problems. We establish nontrivial lower complexity bounds for Nonconvex-Strongly-Concave (NC-SC) minimax problems in the deterministic, finite-sum, and purely stochastic settings, respectively. Our results reveal substantial gaps between these limits and the best-known upper bounds in the literature, which motivates us to design improved algorithms to close these gaps.
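For context on the NC-SC setting, below is a generic two-timescale stochastic gradient descent-ascent baseline for min_x max_y f(x, y) with f strongly concave in y; it is one of the standard upper-bound algorithms in this literature, not the improved methods developed in the thesis. The gradient-oracle names and step-size conventions are illustrative assumptions.

```python
import numpy as np

def stochastic_gda(grad_x, grad_y, x0, y0, step_x, step_y, iters, rng):
    """Two-timescale stochastic gradient descent-ascent for min_x max_y f(x, y).

    grad_x(x, y, rng) and grad_y(x, y, rng) return stochastic gradients of f;
    in the NC-SC regime one typically takes step_x much smaller than step_y.
    """
    x, y = np.asarray(x0, dtype=float), np.asarray(y0, dtype=float)
    for _ in range(iters):
        gx = grad_x(x, y, rng)
        gy = grad_y(x, y, rng)
        x = x - step_x * gx   # descent on the minimization variable
        y = y + step_y * gy   # ascent on the maximization variable
    return x, y
```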
