Additional Insights on Federated Learning is Better with Non-Homomorphic Encryption


Published on 05 December 2023


Homomorphic Encryption (HE) has been a highly discussed topic in recent research, particularly for its potential use in privacy-preserving Federated Learning (FL). Together with my peers Abdulmajeed Alrowith and Fahad Ali Albalawi from the Saudi Data and AI Authority, and Prof. Peter Richtárik from King Abdullah University of Science and Technology (ranked #1 in the Arab World in 2023 and 2024), we have discovered an intriguing way to connect discrete cryptographic primitives with continuous optimization primitives.

The result of our discovery is the paper titled “Federated Learning is Better with Non-Homomorphic Encryption,” which was accepted to the technical program of the 4th International Workshop on Distributed Machine Learning (DistributedML 2023), co-located with ACM CoNEXT on 8 December 2023.


Comment 1: Significant Memory Savings in Secure Communication

At the same security level, our work outperforms the Cheon-Kim-Kim-Song (CKKS) scheme in the amount of information communicated from client to master and from master to client by a factor of roughly 500×.
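To see where savings of this magnitude can come from, here is a back-of-envelope size comparison. A classical symmetric cipher keeps ciphertext length equal to plaintext length, while a CKKS ciphertext consists of two large polynomials. The CKKS parameters below (ring dimension, modulus bits) are illustrative assumptions for this sketch, not the configuration used in the paper; the actual expansion factor depends heavily on parameter and packing choices.

```python
def plaintext_bytes(d, bytes_per_coord=4):
    """Size of the raw (or stream-cipher-masked) float32 update:
    a classical stream cipher adds no ciphertext expansion."""
    return d * bytes_per_coord

def ckks_bytes(d, ring_dim=2**15, coeff_modulus_bits=480):
    """Rough CKKS ciphertext size: each ciphertext holds two polynomials
    of ring_dim coefficients (coeff_modulus_bits bits each) and packs
    at most ring_dim // 2 plaintext slots."""
    slots = ring_dim // 2
    n_ciphertexts = -(-d // slots)  # ceiling division
    per_ciphertext = 2 * ring_dim * coeff_modulus_bits // 8
    return n_ciphertexts * per_ciphertext

d = 10_000_000  # e.g., a 10M-parameter model update
expansion = ckks_bytes(d) / plaintext_bytes(d)
print(f"CKKS expansion factor under these toy parameters: {expansion:.0f}x")
```

Even this crude estimate shows a multiplicative blow-up per round in each direction; less favorable packing or larger moduli push the factor further toward the gap measured in the paper.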


Comment 2: Reevaluating the Unfeasibility of Classical Ciphers in FL

Our work opens up new possibilities for applying classical cryptography to Federated Learning (FL) and challenges existing claims, made by various researchers, about its limitations in this setting.
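The core appeal of a classical (non-homomorphic) stream cipher in this setting is that XOR-masking a serialized model update leaves its length unchanged. The sketch below illustrates the idea with a SHA-256-in-counter-mode keystream; this keystream, the key, and the nonce are illustrative stand-ins (a deployment would use a vetted cipher such as AES-CTR), and the snippet is not the paper's protocol, just the length-preservation property it relies on.

```python
import hashlib
import struct

def keystream(key: bytes, nonce: bytes, nbytes: int) -> bytes:
    """Illustrative PRF-based keystream: hash(key || nonce || counter)."""
    out = bytearray()
    counter = 0
    while len(out) < nbytes:
        out += hashlib.sha256(key + nonce + struct.pack(">Q", counter)).digest()
        counter += 1
    return bytes(out[:nbytes])

def xor_mask(data: bytes, key: bytes, nonce: bytes) -> bytes:
    """XOR data with the keystream; applying it twice recovers the data."""
    ks = keystream(key, nonce, len(data))
    return bytes(a ^ b for a, b in zip(data, ks))

# A client serializes a small float32 update, masks it, and would send it.
update = struct.pack("<4f", 0.1, -0.2, 0.3, -0.4)
key, nonce = b"shared-secret-key", b"round-42"  # hypothetical values
ciphertext = xor_mask(update, key, nonce)

assert len(ciphertext) == len(update)              # no ciphertext expansion
assert xor_mask(ciphertext, key, nonce) == update  # masking is its own inverse
```

Because the ciphertext is byte-for-byte the same size as the plaintext, securing the client-to-master channel this way costs essentially nothing in bandwidth.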


Comment 3: About Appendix of the Paper

Our paper includes an Appendix providing additional details, including:

  • The effect of the optimization problem’s dimensionality
  • Techniques for overlapping communication and computation during model training
  • Deployment strategies for our framework across various network topologies
  • The framework’s connections to and influence on other aspects of end-to-end deep learning training
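One of the appendix topics, overlapping communication and computation, can be illustrated with a toy example: while the previous round's masked update is transmitted in a background thread, the client already performs its next local step. The function names and timings below are purely illustrative, not the paper's implementation.

```python
import threading
import time

def send_update(update, log):
    time.sleep(0.05)   # simulate network transfer of the masked update
    log.append("sent")

def local_compute(log):
    time.sleep(0.05)   # simulate a local gradient step
    log.append("computed")

log = []
start = time.perf_counter()
sender = threading.Thread(target=send_update, args=("update_0", log))
sender.start()          # communication proceeds in the background...
local_compute(log)      # ...while the next local step runs concurrently
sender.join()
elapsed = time.perf_counter() - start

# Two 50 ms tasks overlap, so the round takes well under their 100 ms sum.
print(f"elapsed: {elapsed * 1000:.0f} ms")
```

The same pattern applies with real transports (sockets, gRPC streams) in place of the sleeps: as long as serialization and encryption of one chunk can run while another chunk is in flight, the network time is largely hidden behind computation.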


Written on December 20, 2024