JSAI Type-2 SIG Technical Reports (Japanese Society for Artificial Intelligence)
Online ISSN : 2436-5556
Variational Learning of Translation Invariance Using Noisy-OR and Noisy-AND Gates
Takashi Sano, Yuuji Ichisugi
Research/Technical Report (free access)

2020, Volume 2020, Issue AGI-016, p. 05-

Abstract

From the viewpoint of the Bayesian brain hypothesis, a Bayesian network model of the cerebral cortex is promising not only as a computational model of the brain, but also as a basis for efficient brain-like artificial intelligence. A notorious drawback of Bayesian networks, however, is that the number of parameters grows exponentially with the number of parent variables of a random variable. Restricting the model is one way to address this problem. Motivated by biological plausibility, we previously proposed using a combination of noisy-OR and noisy-AND gates, whose numbers of parameters grow only linearly with the number of parent random variables. Although we showed that this model can acquire translation invariance in a small-scale setting, it was difficult to enlarge the scale because of the hidden variables. In this study, we extend the previous attempt by employing a variational learning method to overcome the intractability of estimating the massive number of hidden variables. This allows us to scale the model up to learn hand-written digit data.
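The abstract's central point is that a noisy-OR or noisy-AND gate needs only one parameter per parent variable, instead of a full conditional probability table whose size is exponential in the number of parents. The sketch below illustrates one standard parameterization of these gates; it is purely illustrative, the `weights` and `leak` names are assumptions, and it does not reproduce the authors' model or their variational learning procedure.

```python
import numpy as np

def noisy_or(parents, weights, leak=0.0):
    """P(child = 1 | parents) under a standard noisy-OR gate.

    parents : binary vector u_1..u_n (0/1)
    weights : per-parent activation probabilities w_i in [0, 1]
    leak    : probability the child turns on with no active parent

    Only n (+1 leak) parameters are needed: linear in the number of
    parents, instead of the 2**n entries of a full CPT.
    """
    p_off = (1.0 - leak) * np.prod((1.0 - weights) ** parents)
    return 1.0 - p_off

def noisy_and(parents, weights, leak=0.0):
    """P(child = 1 | parents) under one common noisy-AND parameterization
    (the De Morgan dual of noisy-OR): each *inactive* parent independently
    vetoes the child with probability w_i."""
    return (1.0 - leak) * np.prod((1.0 - weights) ** (1 - parents))

# Example: 3 parents, only the first two active.
u = np.array([1, 1, 0])
w = np.array([0.8, 0.5, 0.3])
print(noisy_or(u, w))   # 1 - (1-0.8)*(1-0.5) = 0.9
print(noisy_and(u, w))  # (1-0.3) = 0.7
```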

© 2020 The Authors