Faster rates for compressed federated learning with client-variance reduction
Our paper “Faster rates for compressed federated learning with client-variance reduction” is now out. Links:
I was glad to work with my peers Haoyu Zhao from Princeton University, Zhize Li, and Prof. Peter Richtárik from King Abdullah University of Science and Technology.
We provide rigorous theory and a rich set of practical experiments to highlight the benefits of our methods. On the practical side, we compare several state-of-the-art Federated Learning (FL) optimization algorithms, including the tunable parameters that control the behavior of each optimizer. The experiments cover binary classification with both convex and nonconvex logistic regression, as well as image classification with ResNet-18 on CIFAR-10.
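To make the experimental setting concrete, here is a minimal sketch of a logistic regression objective in both a convex and a nonconvex variant. The nonconvex regularizer `w_i^2 / (1 + w_i^2)` is a common choice in FL benchmarks; it is an assumption here, not necessarily the exact formulation used in the paper.

```python
import numpy as np

def logistic_loss(w, X, y, lam=0.1, nonconvex=True):
    """Binary logistic regression loss; labels y in {-1, +1}.

    With nonconvex=True, adds lam * sum_i w_i^2 / (1 + w_i^2),
    a nonconvex regularizer often used in FL experiments
    (illustrative assumption, not taken from the paper).
    With nonconvex=False, uses the standard convex L2 penalty.
    """
    margins = y * (X @ w)
    # Average logistic loss over the dataset.
    loss = np.mean(np.log1p(np.exp(-margins)))
    if nonconvex:
        loss += lam * np.sum(w**2 / (1.0 + w**2))
    else:
        loss += lam * 0.5 * np.sum(w**2)
    return loss

rng = np.random.default_rng(0)
X = rng.standard_normal((100, 5))
y = np.sign(rng.standard_normal(100))
w = rng.standard_normal(5)
print(logistic_loss(w, X, y))
```

At `w = 0` both variants reduce to `log(2)`, which is a quick sanity check when wiring such an objective into an optimizer.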
In this fair comparison, our COFIG algorithm demonstrated excellent results.
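Methods in this family combine client-variance reduction with communication compression. As a flavor of the latter, here is a minimal sketch of an unbiased rand-k sparsifier, the kind of compression operator such algorithms typically analyze; the details are illustrative and not the paper's exact construction.

```python
import numpy as np

def rand_k(x, k, rng):
    """Unbiased rand-k sparsifier: keep k coordinates chosen
    uniformly at random and rescale by d/k, so that
    E[rand_k(x)] = x while only k values are communicated."""
    d = x.size
    idx = rng.choice(d, size=k, replace=False)
    out = np.zeros_like(x)
    out[idx] = x[idx] * (d / k)
    return out

# Empirical unbiasedness check: averaging many compressed
# copies of x should recover x up to sampling noise.
rng = np.random.default_rng(1)
x = rng.standard_normal(10)
est = np.mean([rand_k(x, 3, rng) for _ in range(20000)], axis=0)
print(np.max(np.abs(est - x)))  # small
```

The d/k rescaling is what makes the operator unbiased; the price is extra variance, which is exactly what variance-reduction techniques are designed to counteract.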
The experiments for the paper were carried out in FL_PyTorch, an advanced research simulator for Federated Learning.