
I am interested in embedding a Knowledge Graph in a vector space jointly with logical rules.

The best work I have found on this (via Google) is Tim Rocktäschel's paper Injecting Logical Background Knowledge into Embeddings for Relation Extraction.

The problem is that there is no thorough explanation of the matrix factorization in the paper, in Tim Rocktäschel's thesis, or in his lectures.

My question is: what is the meaning of the rows of the $P \times k$ matrix and the columns of the $k \times R$ matrix?


1 Answer


I would encourage you to start with this survey paper by Nickel et al.: https://arxiv.org/abs/1503.00759

Recent notable models:

  • ComplEx-N3: Lacroix et al. Canonical Tensor Decomposition for Knowledge Base Completion. 2018.
  • HypER: Balazevic et al. Hypernetwork Knowledge Graph Embeddings. 2018.
  • TuckER: Balazevic et al. TuckER: Tensor Factorization for Knowledge Graph Completion. 2019.
  • M3GM: Pinter and Eisenstein. Predicting Semantic Relations using Global Graph Properties. 2018.

Regarding your particular question: the $P \times k$ matrix contains $k$-dimensional row vectors, one for every entity pair in the knowledge base ($P$ of them), and the $k \times R$ matrix contains $k$-dimensional column vectors, one for every binary relation ($R$ of them). The score of a fact is then obtained from the dot product between the corresponding row and column vectors, as sketched below. Note, though, that the paper you are referring to is already an extension of traditional knowledge graph embedding methods, in that it assumes additional first-order logic background knowledge.
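To make that concrete, here is a minimal NumPy sketch of the matrix-factorization scoring used in this family of models (universal-schema style factorization over entity pairs and relations). The sizes, variable names, and random initialization are illustrative assumptions, not the authors' actual code or hyperparameters.

```python
import numpy as np

# Illustrative sizes (assumptions, not from the paper):
num_pairs = 1000   # P: number of entity pairs in the knowledge base
num_rels = 50      # R: number of binary relations
k = 100            # embedding dimension

rng = np.random.default_rng(0)
# P x k matrix: one k-dimensional row vector per entity pair
pair_emb = rng.normal(scale=0.1, size=(num_pairs, k))
# k x R matrix: one k-dimensional column vector per relation
rel_emb = rng.normal(scale=0.1, size=(k, num_rels))

def score(pair_idx: int, rel_idx: int) -> float:
    """Score the fact 'relation rel_idx holds for entity pair pair_idx'
    as the sigmoid of the dot product between the pair's row vector
    and the relation's column vector."""
    logit = pair_emb[pair_idx] @ rel_emb[:, rel_idx]
    return 1.0 / (1.0 + np.exp(-logit))

# Example: probability-like score of the first pair under the first relation.
print(score(0, 0))
```

In training, these two matrices are learned so that observed (pair, relation) cells get high scores; the cited paper additionally pushes the embeddings to satisfy first-order logic rules.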

  • I have more interesting questions for you: will you apply the ideas in your above-cited paper to answer RDF (description logic) queries? Can a description logic query be put in matrix form? – Commented Sep 13, 2019 at 17:44
