Relation-aware Graph Attention Model with Adaptive Self-adversarial Training
Abstract
This paper describes an end-to-end solution for the relationship prediction task in heterogeneous, multi-relational graphs, addressing in particular two building blocks of the pipeline: representation learning and negative sampling. Existing message passing-based graph neural networks tend to ignore the semantics of edges; that is, edge information is used only for graph traversal or for selecting encoding functions. Ignoring edge semantics can severely degrade embedding quality, especially for node pairs connected by multiple relations. Moreover, the expressivity of the learned representation also depends on the quality of the negative samples used during training. Although existing hard negative sampling techniques can identify challenging negative relationships for optimization, new techniques are required to control false negatives, which can corrupt the learning process. To address these issues, we first propose RelGNN, a message passing-based heterogeneous graph attention model. In particular, RelGNN generates states for the different relations and leverages them, along with the node states, to weigh the messages. RelGNN also adopts a self-attention mechanism to balance the importance of attribute features and topological features when generating the final entity embeddings. Second, we introduce a parameter-free negative sampling technique: adaptive self-adversarial (ASA) negative sampling. ASA reduces the false negative rate by leveraging positive relationships to guide the identification of true negative samples. Our experimental evaluation demonstrates that combining RelGNN and ASA for relationship prediction improves state-of-the-art performance on established benchmarks as well as on a real industrial dataset.
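The core intuition behind ASA stated above — using the score of the corresponding positive relationship to avoid selecting false negatives — can be illustrated with a minimal sketch. The function name, the strict-inequality filtering rule, and the random fallback below are illustrative assumptions, not the paper's exact formulation:

```python
import numpy as np

def asa_select_negative(pos_score, neg_scores, rng=None):
    """Illustrative ASA-style selection: pick the hardest negative
    candidate whose score stays below the positive's score.

    Candidates scoring at or above the positive are treated as likely
    false negatives and excluded; among the rest, the highest-scoring
    (hardest) candidate is returned.
    """
    rng = rng or np.random.default_rng(0)
    neg_scores = np.asarray(neg_scores, dtype=float)
    mask = neg_scores < pos_score            # drop likely false negatives
    if not mask.any():                       # every candidate looks positive:
        return int(rng.integers(len(neg_scores)))  # fall back to random choice
    idx = np.flatnonzero(mask)
    return int(idx[np.argmax(neg_scores[mask])])   # hardest "safe" negative

# Candidate at index 1 scores above the positive (0.95 > 0.9), so it is
# skipped as a probable false negative; index 2 (0.7) is chosen instead.
chosen = asa_select_negative(0.9, [0.2, 0.95, 0.7, 0.5])
```

In contrast, plain self-adversarial sampling would favor the 0.95-scoring candidate precisely because it is hardest, even though such a candidate is the most likely to be an unobserved true relationship.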