Gaoyuan Zhang, Songtao Lu, et al.
UAI 2022
Federated learning opens a number of research opportunities due to its high communication efficiency in distributed training problems within a star network. In this paper, we focus on improving the communication efficiency of fully decentralized federated learning (DFL) over a graph, where the algorithm performs local updates for several iterations and then communicates among the nodes. In this way, the rounds of communication needed to exchange the parameters of common interest can be reduced significantly without loss of optimality of the solutions. Multiple numerical simulations based on large, real-world electronic health record databases showcase the superiority of decentralized federated learning compared with classic methods.
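The abstract describes a local-update-then-communicate pattern: each node takes several gradient steps on its own loss, then exchanges parameters with its graph neighbors. The sketch below illustrates that pattern in the simplest possible setting; the quadratic local losses, ring topology, step sizes, and round counts are illustrative assumptions, not the paper's algorithm or experimental setup.

```python
# Minimal sketch of decentralized training with periodic gossip averaging.
# Assumptions (not from the paper): quadratic local losses, a ring graph,
# and a uniform doubly stochastic mixing matrix.
import numpy as np

rng = np.random.default_rng(0)
n_nodes, dim = 5, 3
local_steps, rounds, lr = 10, 50, 0.05

# Node i holds a private loss f_i(x) = 0.5 * ||x - b_i||^2, so the
# global optimum of the average loss is the mean of the b_i.
b = rng.normal(size=(n_nodes, dim))
x = np.zeros((n_nodes, dim))  # one parameter vector per node

# Doubly stochastic mixing matrix for a ring: each node averages its own
# parameters with those of its two neighbors in a communication round.
W = np.zeros((n_nodes, n_nodes))
for i in range(n_nodes):
    W[i, i] = 1 / 3
    W[i, (i - 1) % n_nodes] = 1 / 3
    W[i, (i + 1) % n_nodes] = 1 / 3

for _ in range(rounds):
    # Local phase: several gradient steps with no communication.
    for _ in range(local_steps):
        x -= lr * (x - b)  # row i is the gradient of node i's local loss
    # Communication phase: one gossip-averaging step over the graph.
    x = W @ x

print("consensus estimate:", x.mean(axis=0))
print("true optimum:     ", b.mean(axis=0))
```

Because W is doubly stochastic, gossip preserves the network-wide average, so the mean of the node parameters converges to the global optimum even though each node only talks to its neighbors once every local_steps gradient steps; raising local_steps trades communication rounds against residual disagreement between nodes.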
Minghong Fang, Zifan Zhang, et al.
CCS 2024
Heshan Fernando, Lisha Chen, et al.
ICASSP 2024
Shuai Ma, Fan Zhang, et al.
IEEE TWC