A consensus-based decentralized training algorithm for deep neural networks with communication compression

Research output: Contribution to journal › Article › peer-review


Abstract

To address the challenge of processing large-scale data with distributed computing, this paper proposes a consensus-based decentralized training method with communication compression. First, the training method is designed over a decentralized communication topology, which reduces the communication burden on the busiest agent and avoids any agent revealing its locally stored data. The convergence of the decentralized training algorithm is then analyzed, showing that the decentrally trained model reaches the minimal empirical risk on the whole dataset without any sharing of data samples. Furthermore, model compression combined with error compensation is introduced to reduce the communication cost of the decentralized training process. Finally, a simulation study shows that the proposed decentralized training with error-compensated communication compression is applicable to both IID and non-IID datasets and performs substantially better than local training. Moreover, with an appropriate compression rate, the proposed algorithm achieves performance comparable to uncompressed decentralized training and centralized training while substantially reducing communication costs.
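
The abstract gives no pseudocode, so the following is a minimal, hypothetical sketch of the general idea it describes: agents on a decentralized topology compress the model they communicate, carry the compression residual forward as error compensation, and combine a consensus step over neighbours' compressed messages with a local gradient step on private data. The ring mixing matrix `W`, the `top_k` compressor, the quadratic per-agent losses, and the step sizes are illustrative assumptions, not the paper's actual design.

```python
# Illustrative sketch only: consensus-based decentralized training with
# error-compensated top-k communication compression. All modelling choices
# below (ring topology, quadratic losses, step sizes) are assumptions.
import numpy as np

rng = np.random.default_rng(0)

def top_k(vec, k):
    """Keep the k largest-magnitude entries of vec and zero out the rest."""
    out = np.zeros_like(vec)
    keep = np.argsort(np.abs(vec))[-k:]
    out[keep] = vec[keep]
    return out

# Ring topology over 4 agents: each agent mixes with its two neighbours.
n_agents, dim, k = 4, 20, 10
W = np.zeros((n_agents, n_agents))
for i in range(n_agents):
    W[i, i] = 0.5
    W[i, (i - 1) % n_agents] = 0.25
    W[i, (i + 1) % n_agents] = 0.25

# Each agent holds a private quadratic loss f_i(x) = 0.5 * ||x - c_i||^2,
# standing in for the empirical risk on its locally stored (non-shared) data.
centers = rng.normal(size=(n_agents, dim))
x = rng.normal(size=(n_agents, dim))      # local model copies
err = np.zeros_like(x)                    # error-compensation buffers
lr, gamma = 0.1, 0.5                      # gradient and consensus step sizes

for step in range(500):
    grads = x - centers                   # local gradients (no data shared)
    # Each agent compresses its model plus the residual left over from the
    # previous round (error compensation) and broadcasts only that message.
    msg = np.stack([top_k(x[i] + err[i], k) for i in range(n_agents)])
    err = (x + err) - msg                 # residual carried to the next round
    # Consensus correction built from the compressed messages, followed by
    # a local gradient step on each agent's private loss.
    x = x + gamma * (W @ msg - msg) - lr * grads

# Report how far the agents are from mutual agreement and from the minimizer
# of the average loss (the mean of the private centers).
print("max disagreement :", np.max(np.abs(x - x.mean(axis=0))))
print("dist. to optimum :", np.linalg.norm(x.mean(axis=0) - centers.mean(axis=0)))
```

The consensus correction uses only compressed messages, so per-round traffic scales with `k` rather than the model dimension, while the error buffer ensures the discarded coordinates are eventually transmitted rather than lost.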
Original language: English
Journal: Neurocomputing
DOIs
Publication status: Published - 13 Jan 2021

Keywords

  • Distributed training
  • Consensus
  • Model compression
  • Neural network
  • Decentralized communication topology
  • Convergence
