
Research Achievement Details

Title: Sparse Bayesian Hierarchical Mixture of Experts and Variational Inference
Authors: 飯窪祐二、堀井俊佑、松嶋敏泰
Fiscal year: 2018
Type: International conference
Field: Knowledge information processing
Published in: Proceedings of the 2018 International Symposium on Information Theory and its Applications (ISITA2018)
Issue/Pages: pp. 60--64, Singapore
Year of publication: 2018
Month of publication: 10
Abstract (Japanese)
Proceedings of the 2018 International Symposium on Information Theory and its Applications (ISITA2018)
October 28-31, 2018
Singapore
Peer-reviewed
DOI: none
Abstract (English)
The hierarchical mixture of experts (HME) is a tree-structured probabilistic model for regression and classification. The HME has considerable expressive capability; however, parameter estimation tends to overfit because of the model's complexity. To avoid this problem, regularization techniques are widely used. In particular, it is known that a sparse solution can be obtained by L1 regularization. From a Bayesian point of view, regularization techniques are equivalent to assuming that the parameters follow prior distributions and finding the maximum a posteriori (MAP) estimator. It is known that L1 regularization is equivalent to assuming Laplace distributions as prior distributions. However, it is difficult to compute the posterior distribution if Laplace distributions are assumed. In this paper, we assume that the parameters of the HME follow hierarchical prior distributions that are equivalent to Laplace distributions in order to promote sparse solutions. We propose a Bayesian estimation algorithm based on the variational method. Finally, the proposed algorithm is evaluated by computer simulations.
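As background for the abstract above, the following is a minimal sketch (not drawn from the paper itself) of the two standard identities it invokes: MAP estimation under independent Laplace priors coincides with L1-regularized estimation, and the Laplace density admits a Gaussian scale-mixture representation, which is the kind of hierarchical prior that makes variational Bayesian inference tractable. The symbols w, lambda, tau, and D are generic placeholders rather than the paper's notation, and Exp(. | rate) denotes an exponential distribution with the given rate parameter.

\[
\hat{w}_{\mathrm{MAP}}
= \arg\max_{w}\Bigl[\log p(\mathcal{D}\mid w) + \sum_{i}\log\tfrac{\lambda}{2}e^{-\lambda|w_i|}\Bigr]
= \arg\min_{w}\Bigl[-\log p(\mathcal{D}\mid w) + \lambda\lVert w\rVert_{1}\Bigr]
\]

\[
\frac{\lambda}{2}\,e^{-\lambda|w_i|}
= \int_{0}^{\infty}\mathcal{N}\bigl(w_i \mid 0,\,\tau_i\bigr)\,
\mathrm{Exp}\Bigl(\tau_i \Bigm|\ \tfrac{\lambda^{2}}{2}\Bigr)\,d\tau_i
\]

The second identity is why replacing the Laplace prior with its conditionally Gaussian hierarchical form is convenient: given the latent scales, the prior over the parameters is Gaussian, so variational updates remain in familiar conjugate-style families.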
Remarks (Japanese)
1
Remarks (English)
1
Paper manuscript
Presentation materials