2020, Vol. 2020, No. AGI-016, p. 05-
From the viewpoint of the Bayesian brain hypothesis, a Bayesian network model of the cerebral cortex is promising not only for computational modeling of the brain, but also as a basis for efficient, brain-like artificial intelligence. A notorious drawback of Bayesian networks, however, is that the number of parameters grows exponentially with the number of parent variables of a random variable. Restricting the model is one way to address this problem. Motivated by biological plausibility, we previously proposed a combination of noisy-OR and noisy-AND gates, whose parameter counts grow only linearly with the number of parent random variables. Although we showed that this model can exhibit translation invariance in a small-scale setting, scaling it up was difficult because of the hidden variables. In this study, we extend the previous attempt by employing a variational learning method to overcome the intractable estimation of the massive number of hidden variables. The resulting model scales up to learning hand-written digit data.
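To illustrate the linear parameter growth mentioned above, the following is a minimal sketch of the standard noisy-OR and noisy-AND gate formulations. This is an assumption on our part: the paper's exact gate definitions (e.g. leak terms or inhibition semantics) may differ, and the function names here are illustrative only.

```python
def noisy_or(parents, weights, leak=0.0):
    """P(child = 1 | parents) under a standard noisy-OR gate.

    parents: iterable of binary parent states (0/1)
    weights: per-parent activation probabilities w_i
    leak:    probability the child fires with no active parent
    Only one weight per parent, so parameters grow linearly.
    """
    fail = 1.0 - leak
    for u, w in zip(parents, weights):
        if u:
            # The child stays off only if every active parent "fails".
            fail *= (1.0 - w)
    return 1.0 - fail


def noisy_and(parents, weights):
    """P(child = 1 | parents) under a standard noisy-AND gate:
    each inactive parent inhibits the child with probability w_i."""
    p = 1.0
    for u, w in zip(parents, weights):
        if not u:
            p *= (1.0 - w)
    return p


# Parameter count: n weights (plus a leak) for a gate with n parents,
# versus 2**n entries for an unrestricted conditional probability table.
n = 20
print("noisy-OR parameters:", n + 1)   # linear in n
print("full CPT parameters:", 2 ** n)  # exponential in n
```

The contrast printed at the end is the motivation stated in the abstract: a full conditional probability table over `n` binary parents needs `2**n` entries, while the gated model needs only one weight per parent.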