
Decentralized Federated Multi-Task Learning and System Design
Federated Learning enables collaborative model training over a large number of distributed edge devices without aggregating their local data, while Federated Multi-Task Learning can further learn a personalized model for each device. However, both pose particular statistical and systems challenges. To address these two challenges simultaneously, with a focus on collaboratively training deep neural network models, we propose a decentralized approach with a multi-task framework and a new optimization algorithm called Decentralized Periodic Averaging SGD (DPA-SGD). We also developed a real-world decentralized federated learning system on a large-scale cluster (1024 CPU workers) to validate our multi-task learning framework and the DPA-SGD algorithm. We open-source our system to promote further research on distributed learning, and federated learning in particular.
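
As a rough illustration of the communication pattern the abstract describes, the sketch below simulates DPA-SGD in a single process: each worker runs a few local SGD steps on its own (non-IID) data and then periodically averages its parameters with its neighbors on a decentralized topology. The toy regression task, the ring mixing matrix, and all hyperparameters are illustrative assumptions inferred from the algorithm's name, not the authors' implementation or the 1024-worker system.

# Minimal single-process simulation of Decentralized Periodic Averaging SGD
# (DPA-SGD) on a toy linear-regression task. Sketch only: worker data, the
# ring mixing matrix, and all hyperparameters are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
n_workers, dim, tau, lr, rounds = 8, 10, 5, 0.05, 50

# Each worker holds its own local dataset y = X @ w_true + noise, with a
# per-worker shift to mimic the statistical heterogeneity of edge devices.
w_true = rng.normal(size=dim)
local_data = []
for k in range(n_workers):
    X = rng.normal(loc=0.3 * k, size=(100, dim))
    y = X @ w_true + 0.1 * rng.normal(size=100)
    local_data.append((X, y))

# Ring topology: each worker mixes with itself and its two neighbors.
W = np.zeros((n_workers, n_workers))
for k in range(n_workers):
    W[k, k] = W[k, (k - 1) % n_workers] = W[k, (k + 1) % n_workers] = 1 / 3

params = np.zeros((n_workers, dim))          # one model copy per worker
for r in range(rounds):
    for k in range(n_workers):               # local phase: tau SGD steps
        X, y = local_data[k]
        for _ in range(tau):
            idx = rng.integers(0, len(y), size=10)   # mini-batch indices
            grad = 2 * X[idx].T @ (X[idx] @ params[k] - y[idx]) / len(idx)
            params[k] -= lr * grad
    params = W @ params                      # periodic decentralized averaging

print("mean parameter error:", np.linalg.norm(params.mean(axis=0) - w_true))

Increasing the averaging period tau reduces communication at the cost of more divergence between workers between averaging rounds; setting tau = 1 and W to the uniform matrix recovers fully synchronous parallel SGD.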