MARINA: Faster Non-Convex Distributed Learning with Compression at ACM PODC 2022


The paper MARINA: Faster Non-Convex Distributed Learning with Compression will be presented at the Principles of Distributed Learning (PODL) workshop of ACM PODC 2022.


The ACM Symposium on Principles of Distributed Computing is an international forum on the theory, design, analysis, implementation, and application of distributed systems and networks. The symposium aims to improve understanding of the principles underlying distributed computing. The scope of the conference is quite broad and is described on its homepage: https://www.podc.org/.

The symposium will be held July 25-29, 2022, in Salerno, Italy. The schedule includes keynote speakers, sessions, workshops, and tutorials.

It is a great pleasure for me to have the opportunity to present our work MARINA: Faster Non-Convex Distributed Learning with Compression at the Principles of Distributed Learning (PODL) workshop of the symposium. MARINA employs a novel communication compression strategy based on the compression of gradient differences, which is reminiscent of, but different from, the technique used in the DIANA method of Mishchenko et al. (2019). Our methods are superior to previous state-of-the-art methods in terms of oracle/communication complexity.
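To make the gradient-difference idea concrete, here is a minimal single-process NumPy sketch of a MARINA-style loop. The quadratic local losses, the Rand-K compressor, and the particular values of the step size `gamma` and the synchronization probability `p` are illustrative assumptions for this sketch, not the exact setup or parameters from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def rand_k(v, k):
    """Unbiased Rand-K compressor: keep k random coordinates, rescale by d/k."""
    d = v.size
    idx = rng.choice(d, size=k, replace=False)
    out = np.zeros_like(v)
    out[idx] = v[idx] * (d / k)
    return out

# Toy setup: n workers, each with a quadratic local loss f_i(x) = 0.5 * ||A_i x - b_i||^2.
n, d, k_coords = 10, 50, 5
A = [rng.standard_normal((20, d)) for _ in range(n)]
b = [rng.standard_normal(20) for _ in range(n)]

def grad_i(i, x):
    """Local gradient of worker i at point x."""
    return A[i].T @ (A[i] @ x - b[i])

# MARINA-style loop (single-process simulation of the distributed protocol).
x = np.zeros(d)
gamma, p = 1e-3, k_coords / d  # step size and full-sync probability (illustrative)
g = np.mean([grad_i(i, x) for i in range(n)], axis=0)  # start from exact gradients

for t in range(500):
    x_new = x - gamma * g
    if rng.random() < p:
        # Rare full synchronization: every worker sends its exact gradient.
        g = np.mean([grad_i(i, x_new) for i in range(n)], axis=0)
    else:
        # Usual step: each worker sends only a compressed gradient *difference*.
        g = g + np.mean(
            [rand_k(grad_i(i, x_new) - grad_i(i, x), k_coords) for i in range(n)],
            axis=0,
        )
    x = x_new

full_grad = np.mean([grad_i(i, x) for i in range(n)], axis=0)
print("final full-gradient norm:", np.linalg.norm(full_grad))
```

The point of the sketch is the branch inside the loop: with a small probability every worker transmits its exact gradient, while in the common case it transmits only a compressed gradient difference, which keeps the average per-iteration communication cost low.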

Unfortunately, my visit is in question due to considerable delays in obtaining a visa to Italy, caused by a series of national holidays in Saudi Arabia and the long official processing time for visas through the dedicated agencies (3 working weeks after application, excluding national holidays).



Written on July 21, 2022