Jihun Yun, Peng Zheng, et al.
ICML 2019
Bayesian nonparametric models provide a principled way to automatically adapt the complexity of a model to the amount of data available, but computation in such models is difficult. Amortized variational approximations are appealing because of their computational efficiency, but current methods rely on a fixed, finite truncation of the infinite model. This truncation level can be difficult to set, and it also interacts poorly with amortized methods due to the over-pruning problem. Instead, we propose a new variational approximation, based on a method from statistical physics called Russian roulette sampling. This allows the variational distribution to adapt its complexity during inference, without relying on a fixed truncation level, while still obtaining an unbiased estimate of the gradient of the original variational objective. We demonstrate this method on infinite-sized variational auto-encoders using a Beta-Bernoulli (Indian buffet process) prior.
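The Russian roulette idea referenced in the abstract can be sketched with a short, self-contained example: to estimate an infinite sum unbiasedly, stop at a random depth and re-weight each term by the inverse probability of having reached it. The function names and the geometric toy sum below are illustrative assumptions, not code from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def russian_roulette_estimate(term, survive_prob=0.7):
    """Unbiased estimate of the infinite sum sum_{k>=0} term(k).

    After each term we continue with probability `survive_prob`, and the
    k-th term is re-weighted by 1 / P(reaching index k), so the estimator's
    expectation equals the full infinite sum.
    """
    total, k, reach_prob = 0.0, 0, 1.0
    while True:
        total += term(k) / reach_prob        # importance-weight the current term
        k += 1
        if rng.random() > survive_prob:      # stop with probability 1 - survive_prob
            return total
        reach_prob *= survive_prob           # P(K >= k) for the next term

# Toy check (hypothetical example): sum_{k>=0} 0.5^(k+1) = 1.
geometric_term = lambda k: 0.5 ** (k + 1)
estimates = [russian_roulette_estimate(geometric_term) for _ in range(20000)]
print(np.mean(estimates))  # averages to approximately 1.0
```

In the paper's setting the terms would be contributions to the variational objective (or its gradient) from successively deeper truncations of the infinite model, so the random stopping depth lets the approximation grow or shrink during inference while keeping the gradient estimate unbiased.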
Kenneth L. Clarkson, Ruosong Wang, et al.
ICML 2019
Marten van Dijk, Lam Nguyen, et al.
ICML 2019
Kubilay Atasu, Thomas Mittelholzer
ICML 2019