Translating on Pairwise Entity Space for Knowledge Graph Embedding

Yu Wu, Tingting Mu, John Y. Goulermas

Research output: Contribution to journal › Article › peer-review



In addition to feature-based representations that characterize objects with feature vectors, relation-based representations constitute another type of data representation strategy. They typically store patterns as a knowledge graph (KG), consisting of nodes (objects) and edges (relationships between objects). Given that most KGs are noisy and far from complete, KG analysis and completion are required to establish the likely truth of new facts and correct unlikely ones based on the existing data within the KG. An effective way of tackling this is through translation techniques, which encode entities and links as hidden representations in embedding spaces. In this paper, we aim at improving the state-of-the-art translation techniques by taking into account the multiple facets of the different patterns and behaviors of each relation type. To the best of our knowledge, this is the first latent representation model which considers relational representations to be dependent on the entities they relate. The multi-modality of the relation type over different entities is effectively formulated as a projection matrix over the space spanned by the entity vectors. We develop an economical computation of the projection matrix by directly providing an analytic formulation, rather than relying on a more time-consuming iterative optimization procedure. Two large benchmark knowledge bases are used to evaluate the performance with respect to the link prediction task. A new test data partition scheme is proposed to offer better understanding of the behavior of a link prediction model. Experimental results show that the performance of the proposed algorithm is consistently among the top under different evaluation schemes.
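To illustrate the core idea of a projection onto the space spanned by an entity pair, here is a minimal NumPy sketch. This is an illustrative reading of the abstract, not the paper's exact formulation: the function names and the TransE-style distance score `||P h + P r - P t||` are assumptions, and the projection matrix is computed analytically via the pseudo-inverse, `P = M (Mᵀ M)⁺ Mᵀ` with `M = [h | t]`, which is one standard closed-form construction consistent with the "analytic formulation" mentioned above.

```python
import numpy as np

def pairwise_projection(h, t):
    """Orthogonal projection matrix onto span{h, t}.

    Computed in closed form (no iterative optimization):
    P = M (M^T M)^+ M^T, where M stacks h and t as columns.
    """
    M = np.stack([h, t], axis=1)            # shape (d, 2)
    P = M @ np.linalg.pinv(M.T @ M) @ M.T   # shape (d, d)
    return P

def score(h, r, t):
    """Hypothetical translation score in the pairwise entity space.

    Lower distance = more plausible triple (TransE-style convention).
    The relation vector r is projected onto span{h, t}, making the
    effective relation representation depend on the entities it relates.
    """
    P = pairwise_projection(h, t)
    return np.linalg.norm(P @ h + P @ r - P @ t)
```

Because `P` is an orthogonal projection, it is idempotent (`P @ P == P`) and leaves `h` and `t` unchanged, so only the component of `r` lying in the entity pair's subspace influences the score; this is one way the same relation type can behave differently for different entity pairs.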
Original language: English
Early online date: 10 May 2017
Publication status: Published - 2017


  • Statistical relational learning
  • Link prediction
  • Knowledge graphs
  • Hidden representation
  • Embedding space


