This paper was accepted for presentation at the International Workshop on Federated Foundation Models (FL@FM-NeurIPS’24), held in conjunction with NeurIPS 2024.
Asynchronous protocols have been shown to improve the scalability of federated learning (FL) with a massive number of clients. Meanwhile, momentum-based methods can achieve the best model quality in synchronous FL. However, naively applying momentum in asynchronous FL algorithms leads to slower convergence and degraded model performance. It remains unclear how to effectively combine these two techniques to achieve a win-win…
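To make the tension concrete, the following is a minimal sketch, not the paper's algorithm, of what "naively applying momentum" in asynchronous FL can look like: the server keeps one momentum buffer and folds each client delta into it as it arrives, whereas synchronous FL first averages a round's deltas into a single well-aligned direction. All names here (server_momentum_step, beta, lr) and the toy data are illustrative assumptions.

```python
import numpy as np

def server_momentum_step(weights, momentum, delta, lr=1.0, beta=0.9):
    """One server-side momentum update using a client model delta."""
    momentum = beta * momentum + delta      # accumulate the update direction
    weights = weights - lr * momentum       # move the global model
    return weights, momentum

# Toy deltas from 8 clients in one "round".
round_deltas = [np.array([0.1, 0.0, 0.0, 0.0]) for _ in range(8)]

# Synchronous FL: deltas are averaged first, so the momentum buffer sees
# one aggregated direction per round.
sync_w, sync_m = np.zeros(4), np.zeros(4)
sync_w, sync_m = server_momentum_step(sync_w, sync_m, np.mean(round_deltas, axis=0))

# Naive asynchronous FL: each (possibly stale) delta triggers its own
# momentum step, so stale or conflicting directions compound in the buffer,
# which is the behavior the abstract attributes to slower convergence.
async_w, async_m = np.zeros(4), np.zeros(4)
for stale_delta in round_deltas:            # deltas arrive one at a time
    async_w, async_m = server_momentum_step(async_w, async_m, stale_delta)
```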