Local-to-global Bayesian network structure learning
Tian Gao, Kshitij Fadnis, et al.
ICML 2017
Hidden units can play essential roles in modeling time series with long-term dependencies or nonlinearity, but they make the associated parameters difficult to learn. Here we propose a way to learn such a time-series model by training a backward model for the time-reversed series, where the backward model shares a common set of parameters with the original (forward) model. Our key observation is that only a subset of the parameters is hard to learn, and that subset is complementary between the forward model and the backward model. By training both models, we can effectively learn the values of the parameters that are hard to learn when only one of the two models is trained. We apply bidirectional learning to a dynamic Boltzmann machine extended with hidden units. Numerical experiments with synthetic and real datasets clearly demonstrate the advantages of bidirectional learning.
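As a rough illustration of the bidirectional idea described in the abstract (not the paper's dynamic Boltzmann machine or its structured parameter sharing), the sketch below trains a single one-step-ahead predictor on both the original series and the time-reversed series, so that one shared parameter set receives gradients from both directions. The SharedRNN class, the sine-wave data, and all hyperparameters are illustrative assumptions.

```python
# Minimal sketch of bidirectional learning with a shared parameter set.
# This is an assumption-laden simplification, not the authors' implementation.
import torch
import torch.nn as nn

class SharedRNN(nn.Module):
    """One-step-ahead predictor whose parameters are reused for the
    forward pass (original series) and the backward pass (reversed series)."""
    def __init__(self, hidden_size=16):
        super().__init__()
        self.rnn = nn.RNN(input_size=1, hidden_size=hidden_size, batch_first=True)
        self.readout = nn.Linear(hidden_size, 1)

    def forward(self, x):               # x: (batch, time, 1)
        h, _ = self.rnn(x)
        return self.readout(h)          # prediction of the next value at each step

def one_step_loss(model, x):
    """Mean squared error of predicting x[t+1] from x[..t]."""
    pred = model(x[:, :-1, :])
    return ((pred - x[:, 1:, :]) ** 2).mean()

# Synthetic series with long-range structure (illustrative only).
t = torch.linspace(0, 20, 200)
x = torch.sin(t).reshape(1, -1, 1) + 0.05 * torch.randn(1, 200, 1)

model = SharedRNN()
opt = torch.optim.Adam(model.parameters(), lr=1e-2)

for step in range(500):
    opt.zero_grad()
    loss_fwd = one_step_loss(model, x)                    # forward model on the original series
    loss_bwd = one_step_loss(model, torch.flip(x, [1]))   # backward model on the reversed series
    (loss_fwd + loss_bwd).backward()                      # both losses update the shared parameters
    opt.step()
```

In this simplified setting the two directions share all parameters literally; in the paper, the forward and backward models share a common parameter set in a structured way so that the parts that are hard to learn in one direction are learnable in the other.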
Takashi Imamichi, Takayuki Osogami, et al.
IJCAI 2016
Shohei Ohsawa, Kei Akuzawa, et al.
ICLR 2018
Taro Sekiyama, Atsushi Igarashi
POPL 2017