
Research Publication Details

Title: Distributed Stochastic Gradient Descent Using LDGM Codes
Authors: 堀井 俊佑、吉田 隆弘、小林 学、松嶋 敏泰
Fiscal year: 2019
Type: International conference
Field: Other
Journal: Proceedings of 2019 IEEE International Symposium on Information Theory (ISIT2019)
Issue/Pages: pp. 1417-1421
Year of publication: 2019
Month of publication: July
Abstract (Japanese)
2019 IEEE International Symposium on Information Theory (ISIT2019)
July 7-12, 2019 (presented on July 10)
Paris, France
Peer-reviewed
DOI: 10.1109/ISIT.2019.8849580
Abstract (English)
We consider a distributed learning problem in which the computation is carried out on a system consisting of a master node and multiple worker nodes. In such systems, the existence of slow-running machines called stragglers causes a significant decrease in performance. Recently, a coding-theoretic framework for mitigating stragglers in distributed learning, named Gradient Coding (GC), was established by Tandon et al. Most studies on GC aim at recovering the gradient information completely, assuming that the Gradient Descent (GD) algorithm is used as the learning algorithm. On the other hand, if the Stochastic Gradient Descent (SGD) algorithm is used, it is not necessary to completely recover the gradient information; an unbiased estimator of the gradient is sufficient for learning. In this paper, we propose a distributed SGD scheme using Low Density Generator Matrix (LDGM) codes. In the proposed scheme, it may take longer than existing GC methods to recover the gradient information completely; however, it enables the master node to obtain a high-quality unbiased estimator of the gradient at low computational cost, which leads to an overall performance improvement.
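The key observation in the abstract, that SGD only needs an unbiased estimator of the gradient rather than the exact full gradient, can be illustrated with a small simulation. The sketch below is not the paper's LDGM-coded construction: it simply treats each worker as an independent straggler and rescales the partial gradients that arrive in time so that their sum is an unbiased estimate of the full-data gradient. The toy least-squares problem, the worker-response probability p, and all other constants are assumptions made only for illustration.

```python
import numpy as np

# Minimal sketch (NOT the paper's LDGM scheme): an unbiased gradient
# estimate built from non-straggling workers is enough to drive SGD.

rng = np.random.default_rng(0)

# Toy least-squares problem: minimize (1/2n) * ||X w - y||^2
n, d, k = 1000, 10, 20                        # samples, features, workers
X = rng.normal(size=(n, d))
w_true = rng.normal(size=d)
y = X @ w_true + 0.1 * rng.normal(size=n)

partitions = np.array_split(np.arange(n), k)  # one data partition per worker


def partial_gradient(w, idx):
    """Gradient contribution of one worker's partition (already scaled by 1/n)."""
    Xi, yi = X[idx], y[idx]
    return Xi.T @ (Xi @ w - yi) / n


def straggler_tolerant_gradient(w, p=0.8):
    """Master-side estimate using only the workers that responded in time.

    Assuming each worker independently responds with probability p (an
    illustrative assumption), rescaling the sum of received partial
    gradients by 1/p makes the estimate unbiased: its expectation equals
    the full-data gradient.
    """
    responded = rng.random(k) < p             # False entries are stragglers
    g = np.zeros(d)
    for j in np.flatnonzero(responded):
        g += partial_gradient(w, partitions[j])
    return g / p


# Plain SGD driven by the noisy but unbiased gradient estimate
w = np.zeros(d)
for t in range(500):
    lr = 0.5 / (1.0 + 0.01 * t)               # decaying step size
    w -= lr * straggler_tolerant_gradient(w)

print("parameter error:", np.linalg.norm(w - w_true))
```

In this sketch the straggler mitigation comes purely from rescaling by 1/p; the paper's contribution, by contrast, is to use LDGM codes so that the master can form a high-quality unbiased estimate at low decoding cost.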
Remarks (Japanese): 1
Remarks (English): 1
Paper manuscript
Presentation materials