Knowledge Hypergraph Link Prediction Model Based on Tensor Decomposition
Abstract
Knowledge hypergraphs provide a structured representation of real-world facts, but they cannot capture all facts and are therefore highly incomplete. Link prediction approaches aim to infer missing links from existing links between entities, and are thus widely used for knowledge base completion. Most existing research focuses on completing binary relational knowledge graphs. However, real-world relations often go beyond pairwise associations; a single relation may involve more than two entities. Compared with knowledge graphs, knowledge hypergraphs can represent such complex n-ary relations in a flexible and natural way. We therefore propose Typer, a knowledge hypergraph link prediction model based on tensor decomposition. Typer explicitly models the roles that entities play in different relations and positions, and decomposes relations to improve performance. Moreover, since promoting information flow between entities and relations helps in learning their embeddings, we introduce the concept of windows to increase entity-relation interaction. In addition, we prove that Typer is fully expressive and derive a bound on the embedding dimensionality required for full expressivity. We conduct extensive experiments on multiple public real-world knowledge hypergraph datasets, which show that Typer is effective for link prediction in knowledge hypergraphs and outperforms existing approaches on all benchmark datasets.
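To illustrate the tensor-decomposition view of n-ary link prediction in general terms (this is a generic sketch, not the exact Typer scoring function, whose details are given in the model section), a fact $r(e_1, \dots, e_n)$ can be scored by combining a relation embedding with position-dependent transformations of the entity embeddings:

$$
\phi\bigl(r, e_1, \dots, e_n\bigr) \;=\; \sum_{k=1}^{d} \mathbf{w}_r[k] \,\prod_{i=1}^{n} \bigl(\mathbf{P}_i \mathbf{e}_i\bigr)[k],
$$

where $\mathbf{w}_r \in \mathbb{R}^{d}$ is the embedding of relation $r$, $\mathbf{e}_i \in \mathbb{R}^{d}$ is the embedding of the entity in position $i$, and $\mathbf{P}_i$ is an illustrative position-specific transformation capturing the role an entity plays at that position. A higher score $\phi$ indicates that the candidate fact is more likely to hold.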