TY - GEN
T1 - Regularizing Knowledge Graph Embeddings via Equivalence and Inversion Axioms
AU - Minervini, Pasquale
AU - Costabello, Luca
AU - Muñoz, Emir
AU - Nováček, Vít
AU - Vandenbussche, Pierre-Yves
N1 - Publisher Copyright:
© 2017, Springer International Publishing AG.
PY - 2017
Y1 - 2017
N2 - Learning embeddings of entities and relations using neural architectures is an effective method of performing statistical learning on large-scale relational data, such as knowledge graphs. In this paper, we consider the problem of regularizing the training of neural knowledge graph embeddings by leveraging external background knowledge. We propose a principled and scalable method for leveraging equivalence and inversion axioms during the learning process, by imposing a set of model-dependent soft constraints on the predicate embeddings. The method has several advantages: (i) the number of introduced constraints does not depend on the number of entities in the knowledge base; (ii) regularities in the embedding space effectively reflect available background knowledge; (iii) it yields more accurate results in link prediction tasks over non-regularized methods; and (iv) it can be adapted to a variety of models, without affecting their scalability properties. We demonstrate the effectiveness of the proposed method on several large knowledge graphs. Our evaluation shows that it consistently improves the predictive accuracy of several neural knowledge graph embedding models (for instance, the MRR of TransE on WordNet increases by 11%) without compromising their scalability properties.
UR - http://www.scopus.com/inward/record.url?scp=85040231166&partnerID=8YFLogxK
U2 - 10.1007/978-3-319-71249-9_40
DO - 10.1007/978-3-319-71249-9_40
M3 - Conference contribution
AN - SCOPUS:85040231166
SN - 9783319712482
T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
SP - 668
EP - 683
BT - Machine Learning and Knowledge Discovery in Databases - European Conference, ECML PKDD 2017, Proceedings
A2 - Ceci, Michelangelo
A2 - Dzeroski, Saso
A2 - Vens, Celine
A2 - Todorovski, Ljupco
A2 - Hollmen, Jaakko
PB - Springer Verlag
T2 - European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases, ECML PKDD 2017
Y2 - 18 September 2017 through 22 September 2017
ER -