Control Flow Operators in PyTorch
Yidi Wu, Thomas Bohnstingl, et al.
ICML 2025
In this talk, we introduce an asynchronous decentralized accelerated stochastic gradient descent algorithm for decentralized stochastic optimization. Since communication and synchronization costs are the major bottlenecks, we attempt to reduce these costs via randomization techniques. Our major contribution is to develop a class of accelerated randomized decentralized algorithms for solving general convex composite problems. We establish O(1/ε) (resp., O(1/√ε)) communication complexity and O(1/ε²) (resp., O(1/ε)) sampling complexity for solving general convex (resp., strongly convex) problems. It is worth mentioning that our proposed algorithm depends only sublinearly on the Lipschitz constant when a smooth component is present in the objective function.
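To make the setting concrete, the sketch below shows a plain decentralized SGD loop with gossip averaging on a toy least-squares problem. It is an illustration only: the ring topology, mixing matrix, data, and step size are assumptions made here, and the accelerated, randomized variant and complexity guarantees described in the abstract are not reproduced.

import torch

torch.manual_seed(0)

# Toy decentralized least-squares problem on 4 nodes arranged in a ring
# (the topology and problem are assumptions for illustration).
n_nodes, dim = 4, 5
A = [torch.randn(20, dim) for _ in range(n_nodes)]   # local data per node
b = [torch.randn(20) for _ in range(n_nodes)]        # local targets per node
x = [torch.zeros(dim) for _ in range(n_nodes)]       # local iterates

# Doubly stochastic mixing matrix for the ring topology.
W = torch.zeros(n_nodes, n_nodes)
for i in range(n_nodes):
    W[i, i] = 0.5
    W[i, (i - 1) % n_nodes] = 0.25
    W[i, (i + 1) % n_nodes] = 0.25

lr = 0.05
for step in range(200):
    # Gossip step: each node averages its iterate with its neighbors.
    mixed = [sum(W[i, j] * x[j] for j in range(n_nodes)) for i in range(n_nodes)]
    # Local stochastic gradient step on a sampled minibatch of the local objective.
    new_x = []
    for i in range(n_nodes):
        idx = torch.randint(0, 20, (5,))
        Ai, bi = A[i][idx], b[i][idx]
        grad = Ai.t() @ (Ai @ mixed[i] - bi) / len(idx)
        new_x.append(mixed[i] - lr * grad)
    x = new_x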
Gosia Lazuka, Andreea Simona Anghel, et al.
SC 2024
Natalia Martinez Gil, Dhaval Patel, et al.
UAI 2024
Shubhi Asthana, Pawan Chowdhary, et al.
KDD 2021