ZJU-CSE Summer School 2021

Introduction to the summer school

The summer school covers several recent advances in distributed control, optimization, and learning. The main objective of the course is to present to graduate students, in an accessible way (lectures and tutorials/seminars), advanced control and optimization methods for the large-scale systems that arise in modern control engineering and data science. The course covers the basics of convex optimization; first-order optimization methods (e.g., gradient methods, stochastic methods, accelerated gradient methods, and primal-dual methods); decomposition and splitting methods (e.g., dual decomposition, monotone operators and operator splitting, ADMM); and recently developed parallel and distributed algorithms. In addition, several tutorials/seminars by leading research scholars will introduce frontier topics in distributed control, optimization, and learning. To enhance learning outcomes, new online learning models such as SPOC will likely be employed to promote self-learning and diverse learning processes, along with a range of concrete examples, including smart grids, sensor networks, and machine learning, to enrich the content of the course. After completing this course, students are expected to be able to apply the control and optimization techniques they have learned to large-scale cyber-physical systems as well as other related research areas.

Invited Speakers

Course Content

Lectures

  • Day 1. Introduction to the course

    • Lecture I: Background • Convex Set • (Non)-Convex Function • Smoothness • Strong Convexity • Strong Duality • Slater Condition • KKT (Video Link (BiliBili), Slides)

  • Day 2. Convex Optimization

  • Day 3. Distributed Convex Optimization

    • Lecture IV (Part I): Finite-Sum Problem • Distributed Optimization • Distributed Primal Algorithms: DGD, Gradient Tracking, Push-Pull/SONATA (Video Link (BiliBili), Slides)

  • Day 5. Advanced Topics in (Distributed) Optimization

    • Lecture VIII: Composite Optimization • Proximal/Projected Gradient Methods • Proximal Point Method • Monotone Operators • ADMM • Primal-Dual Formulation • Distributed Primal-Dual Methods (Video Link (BiliBili), Slides)

    • Lecture IX: Accelerated Coordinate Descent Methods • Accelerated Stochastic Primal-Dual Methods • Accelerated Gradient Tracking • Acceleration for Escaping Saddle Points (Video Link (BiliBili), Slides)
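As a small taste of the distributed primal algorithms listed above (Lecture IV), here is a minimal sketch of decentralized gradient descent (DGD). The quadratic local objectives, the 4-agent ring network, and the step size are illustrative assumptions for this sketch, not course material:

```python
import numpy as np

def dgd(local_grads, W, x0, step=0.05, iters=500):
    """Decentralized gradient descent: at each iteration, every agent
    averages with its neighbors via the mixing matrix W, then takes a
    step along its own local gradient. x holds one entry per agent."""
    x = np.array(x0, dtype=float)
    for _ in range(iters):
        mixed = W @ x  # consensus (neighbor-averaging) step
        grads = np.array([g(xi) for g, xi in zip(local_grads, x)])
        x = mixed - step * grads  # local gradient step
    return x

# Toy problem: agent i holds f_i(x) = 0.5 * (x - b_i)^2, so the
# minimizer of sum_i f_i is the average of the b_i.
b = np.array([1.0, 2.0, 3.0, 6.0])
local_grads = [lambda xi, bi=bi: xi - bi for bi in b]

# Doubly stochastic mixing matrix for a 4-agent ring.
W = np.array([[0.50, 0.25, 0.00, 0.25],
              [0.25, 0.50, 0.25, 0.00],
              [0.00, 0.25, 0.50, 0.25],
              [0.25, 0.00, 0.25, 0.50]])

x = dgd(local_grads, W, x0=np.zeros(4))
print(x)  # each agent lands near 3.0, the average of b
```

With a constant step size, DGD converges only to a neighborhood of the optimum (each agent's iterate carries a small bias); diminishing step sizes, or the gradient-tracking corrections covered in the same lecture, remove this residual error.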

Tutorial/Seminars (tentative)

  • Day 1. Distributed Non-Convex Optimization

Title (T/S II): Decentralized learning in the nonconvex world: Recent results
Speaker: Prof. Hoi To Wai, Chinese University of Hong Kong
Date/Time: August 9, 2021, 10:00am-12:00pm, China Time (Check your local time here)
Abstract: Decentralized learning has become a critical enabler of the massively connected world that many people envision. In this talk, we discuss four key elements of scalable decentralized optimization and control: optimization problems, data, communication, and computation. We describe how these elements should work together in an effective and coherent manner. We review recent techniques developed for optimizing nonconvex models (i.e., problem classes) that process batch/streaming data (data types) across networks in a decentralized manner (communication and computation paradigm). We describe the intuitions and connections behind a core set of popular decentralized algorithms, emphasizing how to balance computation and communication costs. Practical issues and future research directions will also be discussed.
Speaker Homepage: https://www1.se.cuhk.edu.hk/~htwai/
Video Link (BiliBili), Slides


Title (T/S I): Distributed stochastic non-convex optimization: Optimal regimes and tradeoffs
Speaker: Prof. Usman Khan, Tufts University
Date/Time: August 9, 2021, 8:30pm-10:00pm, China Time (Check your local time here)
Abstract: In many emerging applications, it is of paramount interest to learn hidden parameters from data. For example, self-driving cars may use onboard cameras to identify pedestrians, highway lanes, or traffic signs in various light and weather conditions. Problems such as these can be framed as classification, regression, or risk minimization in general, at the heart of which lies stochastic optimization and machine learning. In many practical scenarios, distributed and decentralized learning methods are preferable, as they benefit from a divide-and-conquer approach towards data at the expense of local (short-range) communication. In this talk, I will present our recent work that develops a novel algorithmic framework to address various aspects of decentralized stochastic first-order optimization methods for non-convex problems. A major focus will be to characterize regimes where decentralized solutions outperform their centralized counterparts and lead to optimal convergence guarantees. Moreover, I will characterize certain desirable attributes of decentralized methods in the context of linear speedup and network-independent convergence rates. Throughout the talk, I will demonstrate such key aspects of the proposed methods with the help of provable theoretical results and numerical experiments on real data.
Speaker Homepage: https://www.eecs.tufts.edu/~khan/
Video Link (BiliBili), Slides

  • Day 2. Distributed Convex Optimization

Title (T/S III): Towards Scalable Algorithms for Distributed Optimization and Learning
Speaker: Prof. Cesar Uribe, Rice University
Date/Time: August 10, 2021, 8:30am-10:00am, China Time (Check your local time here)
Abstract: Increasing amounts of data generated by modern complex systems such as the energy grid, social media platforms, sensor networks, and cloud-based services call for attention to distributed data processing, in particular, for the design of scalable algorithms that take into account storage and communication constraints and help to make coordinated decisions. In this talk, we present recently proposed distributed algorithms with optimal convergence rates for optimization problems over networks, where data is stored distributedly. We focus on scalable algorithms and show they can achieve the same rates as their centralized counterparts, with an additional cost related to the structure of the network. We provide application examples to distributed inference and learning, and computational optimal transport.
Speaker Homepage: https://cauribe.rice.edu/
Video Link (BiliBili), Slides


Title (T/S IV): Distributed Optimization over Networks
Speaker: Prof. Angelia Nedich, Arizona State University
Date/Time: August 10, 2021, 10:30am-12:00pm, China Time (Check your local time here)
Abstract: TBD
Speaker Homepage: https://angelia.engineering.asu.edu/

  • Day 3. Distributed Algorithms for High-dimensional Learning

Title (T/S V): Bringing Statistical Thinking in Distributed Optimization: Vignettes from Statistical Inference over Networks (Part I, Part II)
Speaker: Prof. Gesualdo Scutari, Purdue University
Date/Time: August 11, 2021, 8:30am-12:00pm, China Time (Check your local time here)
Abstract: There is growing interest in solving large-scale statistical machine learning problems over decentralized networks, where data are distributed across the nodes of the network and no centralized coordination is present (we term these systems "meshed" networks). Modern massive datasets create a fundamental problem at the intersection of the computational and statistical sciences: how to provide guarantees on the quality of statistical inference given bounds on computational resources, such as time and communication efforts. While statistical-computation tradeoffs have been largely explored in the centralized setting, our understanding over meshed networks is limited: (i) distributed schemes designed for, and performing well in, the classical low-dimensional regime can break down in the high-dimensional case; and (ii) existing convergence studies may fail to predict algorithmic behaviors, and some are in fact refuted by experiments. This is mainly because the majority of distributed algorithms over meshed networks have been designed and studied only from the optimization perspective, lacking the statistical dimension. Through some vignettes from low- and high-dimensional statistical inference, this talk goes over designs and new analyses aiming at bringing statistical thinking into distributed optimization.
Speaker Homepage: https://engineering.purdue.edu/~gscutari/
Video Link (BiliBili)


  • Day 4. Distributed Control and Intelligent Autonomous Systems

Title (T/S VI): Distributed Load Frequency Control in Smart Grids
Speaker: Prof. Shichao Liu, Carleton University
Date/Time: August 12, 2021, 8:30am-10:00am, China Time (Check your local time here)
Abstract: In a multi-area smart grid, load frequency control (LFC) is used to sustain the frequency of each control area and the tie-line power at scheduled values by modifying power set points when disturbances occur. For cost-effective operation of the LFC, many challenges must be addressed, such as controllers with limited energy resources and the tremendous amount of information exchanged among a large number of areas in the smart grid. In this talk, we first introduce smart grid control and communication structures. Then, the basics of load frequency control are covered briefly. Finally, we present distributed control and event-triggering schedule co-design schemes to tackle these challenges. These distributed control and scheduling coordination approaches can reduce the amount of information transmission needed without sacrificing system dynamic performance.
Speaker Homepage: https://doe.carleton.ca/shichao-liu
Video Link (BiliBili), Slides


Title (T/S VII): Smart Sensing and Localization
Speaker: Prof. Lihua Xie, Nanyang Technological University
Date/Time: August 12, 2021, 10:30am-12:00pm, China Time (Check your local time here)
Abstract: Sensing and localization are essential for IoT and intelligent unmanned systems. GPS has been widely used for positioning and navigation. However, in indoor environments and in many outdoor environments such as urban canyons, forests, and tunnels, GPS may be unavailable or unreliable. Hence, there has been a lot of interest in developing technologies and algorithms for localization in such environments. In this talk, we shall discuss several sensing and localization systems and algorithms we have developed over the past few years, including WiFi-based indoor positioning and human activity recognition, UWB-based localization, and vision-inertial-range sensor fusion for localization and mapping. Their applications in smart buildings/homes, elderly care, UAV-based structure inspection, and AGVs for logistics will be presented, and challenges and future research directions will be highlighted.
Speaker Homepage: https://www3.ntu.edu.sg/home/elhxie/index.html

  • Day 5. Group Sharing and Discussion

Way of Teaching

  • Language: Bilingual (English + Chinese)

  • Course Form: Lectures (Basics) + Tutorials/Seminars (Advanced Topics)

  • Manner: Online (key concepts; sketches of proofs) + Offline (mathematical derivations)

Time Schedule

Week One (Aug 02 - Aug 06)

All times are GMT+8. Lunch Break runs from 12.00 pm to 2.30 pm every day.

  • Monday (Aug 02)
    • 8.30 am - 12.00 pm: Lecture I: Introduction to the course (Chinese). Speaker: Jinming Xu, ZJU (Yuquan Campus). Tencent Meeting ID: 752 593 984
    • 2.30 pm - 5.30 pm: Lab Tour. Shining Gao/Anjun Chen (Yuquan Campus)

  • Tuesday (Aug 03)
    • 8.30 am - 12.00 pm: Lecture II: Convex Optimization (English). Speaker: Ying Sun, PSU (online). Tencent Meeting ID: 755 843 514
    • 2.30 pm - 5.30 pm: Lecture III: Graph Basics and Consensus (Chinese). Speaker: Prof. Chengcheng Zhao, ZJU (Yuquan Campus). Tencent Meeting ID: 624 997 500

  • Wednesday (Aug 04)
    • 8.30 am - 12.00 pm: Lecture IV: Distributed Convex Optimization (English). Speaker: Ying Sun, PSU (online). Tencent Meeting ID: 642 952 965
    • 2.30 pm - 5.30 pm: Research & Discussion

  • Thursday (Aug 05)
    • 8.30 am - 12.00 pm: Lecture V: Stochastic Optimization (English). Speaker: Ying Sun, PSU (online). Tencent Meeting ID: 708 668 998
    • 2.30 pm - 5.30 pm: Lecture VI: Distributed Stochastic Optimization. Speaker: Kun Yuan, DAMO Academy (Yuquan Campus). Tencent Meeting ID: 202 219 103

  • Friday (Aug 06)
    • 8.30 am - 12.00 pm: Lecture VIII: Advanced Topics (Operator Splitting, ADMM) (Chinese). Speaker: Jinming Xu, ZJU (Yuquan Campus). Tencent Meeting ID: 479 145 622
    • 2.30 pm - 5.30 pm: Lecture IX: Advanced Topics (Acceleration) (Chinese). Speaker: Huan Li, NKU (Yuquan Campus). Tencent Meeting ID: 962 195 511


Week Two (Aug 09 - Aug 13)

All times are GMT+8. Lunch Break runs from 12.00 pm to 2.30 pm every day. All tutorial/seminar (T/S) sessions use Zoom Meeting ID 499 292 8524.

  • Monday (Aug 09)
    • 10.00 am - 12.00 pm: T/S II: Prof. Hoi To Wai, CUHK
    • 2.30 pm - 5.30 pm: Lecture VII: Distributed Control. Speaker: Prof. Wenchao Meng, ZJU (Yuquan Campus). Tencent Meeting ID: 751 993 484
    • 8.30 pm - 10.00 pm: T/S I: Prof. Usman Khan, Tufts University

  • Tuesday (Aug 10)
    • 8.30 am - 10.00 am: T/S III: Prof. Cesar Uribe, Rice University
    • 10.30 am - 12.00 pm: T/S IV: Prof. Angelia Nedich, Arizona State University
    • 2.30 pm - 5.30 pm: Research & Discussion

  • Wednesday (Aug 11)
    • 8.30 am - 12.00 pm: T/S V (Part I/Part II): Prof. Gesualdo Scutari, Purdue University
    • 2.30 pm - 5.30 pm: Research & Discussion

  • Thursday (Aug 12)
    • 8.30 am - 10.00 am: T/S VI: Prof. Shichao Liu, Carleton University
    • 10.30 am - 12.00 pm: T/S VII: Prof. Lihua Xie, NTU
    • 2.30 pm - 5.30 pm: Prof. Lihua Xie, NTU (Zoom Meeting ID: 499 292 8524)

  • Friday (Aug 13)
    • 8.30 am - 12.00 pm: Group Sharing & Discussion

Registration and Deadlines

Registration:
Registration is free. Please register for this event via the following link:
https://jinshuju.net/f/KEOdJK

Deadlines:
Pre-registration: 28 July 2021;
Notification: 30 July 2021.

Organizers and Contact

This event is organized by Jinming Xu (ZJU), Wenchao Meng (ZJU), Chengcheng Zhao (ZJU), and Ying Sun (PSU), with advisory board members Jiming Chen (ZJU), Peng Cheng (ZJU), and Gesualdo Scutari (Purdue). For inquiries, please write to jimmyxu AT zju.edu.cn.