Sharper Rates and Flexible Framework for Nonconvex SGD with Client and Data Sampling




The paper “Sharper Rates and Flexible Framework for Nonconvex SGD with Client and Data Sampling” was accepted to Transactions on Machine Learning Research (TMLR). TMLR is a venue for the dissemination of machine learning research that emphasizes technical correctness over subjective significance.

Our paper improves PAGE, an optimal stochastic first-order method for non-convex optimization, in three ways.
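For readers unfamiliar with PAGE (Li et al., ICML 2021), here is a minimal sketch of its probabilistic gradient estimator: with some probability the method recomputes a (large-batch or full) gradient, and otherwise it reuses the previous estimator corrected by a cheap minibatch gradient difference. This is an illustration only; the helper names (`grad_full`, `grad_batch`) and hyperparameters are placeholders, not the paper's notation, and the paper's framework further generalizes how the minibatches are sampled (e.g., over clients and data points).

```python
import numpy as np

def page(grad_full, grad_batch, x0, lr=0.1, p=0.5, n_steps=100, rng=None):
    """Simplified PAGE loop, for illustration only.

    grad_full(x)     -> full (or large-batch) gradient at x
    grad_batch(x, y) -> minibatch estimate of grad f(x) - grad f(y),
                        computed on the same sampled minibatch
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float)
    g = grad_full(x)                 # initialize the gradient estimator
    for _ in range(n_steps):
        x_new = x - lr * g           # plain gradient step using the estimator g
        if rng.random() < p:         # with probability p: recompute the gradient
            g = grad_full(x_new)
        else:                        # otherwise: cheap variance-reduced correction
            g = g + grad_batch(x_new, x)
        x = x_new
    return x
```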

I was glad to work with my colleagues from King Abdullah University of Science and Technology (KAUST) and hope to continue collaborating with them:

Links to the paper:

Presentation of the original PAGE paper (oral at ICML 2021):


Papers accepted to TMLR can be browsed directly on OpenReview.



Written on October 12, 2023