Journal of Shanghai University of Electric Power, 2022, 38(2): 183-188, 198
A Survey on Data Transmission Compression Mechanism of Distributed Deep Learning
(School of Computer Science and Technology, Shanghai University of Electric Power)
Received: 2020-03-18
Abstract: Distributed deep learning has become one of the most active research areas in deep learning, as ever-larger datasets, models, and algorithms rely on the divide-and-conquer principle for processing. However, distributed deep learning systems face many new challenges during their rapid development, the most important of which is how to transmit large volumes of data effectively over the limited network bandwidth of a distributed environment, so that computing resources can be fully utilized to train high-accuracy deep neural networks efficiently. To address this problem, this article discusses data transmission compression techniques from three perspectives: parameter synchronization, model aggregation, and gradient compression, which are used respectively to interleave data transmission with computation, reduce the frequency of data exchange, and reduce the amount of data transmitted per exchange. Finally, future trends in data transmission compression mechanisms for distributed deep learning are discussed.
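Of the three perspectives named in the abstract, gradient compression is the most concrete: each worker transmits a reduced representation of its gradient instead of the full dense tensor. The sketch below illustrates top-k sparsification, one widely studied gradient compression technique; it is a minimal NumPy illustration of the general idea, not a method described in this survey, and the function names are our own.

```python
import numpy as np

def topk_sparsify(grad, ratio=0.01):
    """Keep only the largest-magnitude `ratio` fraction of gradient
    entries; transmit (indices, values) instead of the dense tensor."""
    flat = grad.ravel()
    k = max(1, int(flat.size * ratio))
    # Indices of the k entries with the largest absolute value
    # (argpartition avoids a full sort).
    idx = np.argpartition(np.abs(flat), -k)[-k:]
    return idx, flat[idx], grad.shape

def desparsify(idx, values, shape):
    """Reconstruct a dense gradient from the sparse message on the receiver."""
    flat = np.zeros(int(np.prod(shape)))
    flat[idx] = values
    return flat.reshape(shape)

# A 1%-density message carries roughly 100x fewer values than the
# dense gradient, at the cost of discarding small-magnitude entries.
rng = np.random.default_rng(0)
g = rng.standard_normal((256, 256))
idx, vals, shape = topk_sparsify(g, ratio=0.01)
g_hat = desparsify(idx, vals, shape)
```

In practice such schemes are usually paired with error feedback (accumulating the discarded residual locally for the next iteration) to preserve convergence, a refinement omitted here for brevity.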
Article ID: 202202013     CLC number: TP183    Document code:
Funding:
Citation:
DU Haizhou,FENG Xiaojie.A Survey on Data Transmission Compression Mechanism of Distributed Deep Learning[J].Journal of Shanghai University of Electric Power,2022,38(2):183-188,198.