DualTKB: A Dual Learning Bridge between Text and Knowledge Base
Abstract
In this work, we present a dual learning approach for unsupervised text-to-path and path-to-text transfers in Commonsense Knowledge Bases (KBs). We investigate the impact of weak supervision by creating a weakly supervised dataset and show that even a slight amount of supervision can significantly improve model performance and enable better-quality transfers. We examine different model architectures and evaluation metrics, proposing a novel Commonsense KB completion metric tailored for generative models. Extensive experimental results show that the proposed method compares very favorably to the existing baselines. This approach is a viable step towards a more advanced system for automatic KB construction/expansion and the reverse operation of converting a KB into coherent textual descriptions.