Elastic Synchronization for Efficient and Effective Distributed Deep Learning

dc.contributor.advisor: An, Aijun
dc.contributor.author: Zhao, Xing
dc.date.accessioned: 2020-11-13T13:54:09Z
dc.date.available: 2020-11-13T13:54:09Z
dc.date.copyright: 2020-07
dc.date.issued: 2020-11-13
dc.date.updated: 2020-11-13T13:54:09Z
dc.degree.discipline: Computer Science
dc.degree.level: Master's
dc.degree.name: MSc - Master of Science
dc.description.abstract: Training deep neural networks (DNNs) on a large-scale cluster with an efficient distributed paradigm significantly reduces training time. However, a distributed paradigm developed solely from a systems-engineering perspective is likely to hinder the model from learning, owing to the intrinsic optimization properties of machine learning. In this thesis, we present two efficient and effective models in the parameter server setting, motivated by the limitations of state-of-the-art distributed models such as stale synchronous parallel (SSP) and bulk synchronous parallel (BSP). We introduce the DynamicSSP model, which adds smart dynamic communication to SSP, improves its communication efficiency, and replaces its fixed staleness threshold with a dynamic threshold. DynamicSSP converges faster, and to a higher accuracy, than SSP in heterogeneous environments. Having recognized the importance of bulk synchronization in training, we propose the ElasticBSP model, which shares the properties of bulk synchronization and elastic synchronization. We develop fast online optimization algorithms with look-ahead mechanisms to materialise ElasticBSP. Empirically, ElasticBSP achieves a convergence speed 1.77 times faster and an overall accuracy 12.6% higher than BSP.
dc.identifier.uri: http://hdl.handle.net/10315/37937
dc.language: en
dc.rights: Author owns copyright, except where explicitly noted. Please contact the author directly with licensing requests.
dc.subject: Computer science
dc.subject.keywords: Distributed Deep Learning
dc.subject.keywords: BSP
dc.subject.keywords: ASP
dc.subject.keywords: SSP
dc.subject.keywords: SGD
dc.subject.keywords: Optimization
dc.title: Elastic Synchronization for Efficient and Effective Distributed Deep Learning
dc.type: Electronic Thesis or Dissertation
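The abstract contrasts BSP's strict barrier with SSP's bounded-staleness rule. As a minimal illustrative sketch (not the thesis's actual code, and the function name is hypothetical), the core staleness check in an SSP-style parameter server can be written as a single comparison, with BSP as the special case where the threshold is zero:

```python
def may_proceed(worker_clock: int, slowest_clock: int, threshold: int) -> bool:
    """Return True if a worker at iteration `worker_clock` may continue.

    Under SSP-style synchronization, a worker may run ahead of the
    slowest worker by at most `threshold` iterations; BSP corresponds
    to threshold == 0, where every worker waits at the barrier.
    """
    return worker_clock - slowest_clock <= threshold

# BSP (threshold 0): every worker must wait for the slowest one.
assert may_proceed(5, 5, 0)
assert not may_proceed(6, 5, 0)

# SSP with threshold 3: fast workers run up to 3 iterations ahead.
assert may_proceed(8, 5, 3)
assert not may_proceed(9, 5, 3)
```

A dynamic variant such as the thesis's DynamicSSP would adjust `threshold` at runtime rather than fixing it in advance.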

Files

Original bundle
Name: Zhao_Xing_2020_Masters.pdf
Size: 6.34 MB
Format: Adobe Portable Document Format