
GQE

2022.7.5 Embedding Logical Queries on Knowledge Graphs


Abstract

• The main challenge in this area:

go beyond simple edge prediction and handle more complex logical queries, which might involve multiple unobserved edges, entities, and variables. (Question: what does "variable" refer to here?)

• The problem this paper addresses:

efficiently make predictions about conjunctive logical queries on incomplete knowledge graphs.

• Approach:

nodes $\xrightarrow{\text{embedding}}$ points in a low-dimensional space
logical operators $\xrightarrow{\text{embedding}}$ geometric operations in this embedding space (e.g., translations, rotations, and similar transformations)

• results:

time complexity → linear in the number of query variables
an enumeration-based approach (enumeration here meaning searching over candidate subgraphs that could satisfy the query) has exponential time complexity

1. Introduction

• fundamental task:

discover or predict unobserved edges using this graph-structured data

link prediction
recommender systems
knowledge base completion
(goal: predict unobserved edges between nodes in a graph using an observed set of training edges)

challenge: make predictions about more complex graph queries that involve multiple unobserved edges, nodes, and even variables—rather than just single edges

• naive approach and its problem

The query-answering problem can be reduced to finding subgraphs (subgraph matching).

approach:
1- predict edges: run an edge prediction model on all possible pairs of nodes,
2- score all subgraphs that might satisfy the query: enumerate and score every candidate subgraph.

problem:
the complexity is exponential (in the number of query variables); see the sketch below
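To make the blow-up concrete, here is a minimal Python sketch (my own illustration, not the paper's code; `naive_query_score` and `edge_score` are hypothetical names) of the enumerate-and-score baseline: with $m$ existential variables there are $|\mathcal{V}|^m$ candidate assignments to score.

```python
from itertools import product

def naive_query_score(nodes, variables, query_edges, edge_score):
    """nodes: iterable of node ids; variables: variable names (e.g. "?V1");
    query_edges: (head, relation, tail) triples where head/tail are node ids
    or variable names; edge_score(u, rel, v) -> float is any learned edge predictor."""
    best = float("-inf")
    # Exponential blow-up: len(nodes) ** len(variables) candidate assignments.
    for values in product(list(nodes), repeat=len(variables)):
        binding = dict(zip(variables, values))
        resolve = lambda x: binding.get(x, x)
        # Score a candidate subgraph by its weakest predicted edge (one possible choice).
        score = min(edge_score(resolve(h), r, resolve(t)) for h, r, t in query_edges)
        best = max(best, score)
    return best
```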

• graph query embeddings (GQEs)

1- embed graph nodes in a low-dimensional space

2- represent logical operators as learned geometric operations

2. Related Work

• Logical reasoning and knowledge graphs

• Probabilistic databases

• Neural theorem proving

3. Background and Preliminaries

$\mathcal{G}=(\mathcal{V},\mathcal{E})$, with nodes $v\in\mathcal{V}$ and edges $e\in\mathcal{E}$

$e=\tau(u,v)$, where $u,v\in\mathcal{V}$ are nodes with types $\gamma_1, \gamma_2\in\Gamma$ and $\tau : \gamma_1\times\gamma_2\rightarrow\{\text{true},\text{false}\}$ is a binary relation
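As a concrete illustration of this setup in code, a typed multi-relational graph can be represented as below; the class names, node types, relations, and entities are my own assumptions, not taken from the paper.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Node:
    id: str
    type: str      # node type gamma in Gamma, e.g. "drug", "protein", "disease"

@dataclass(frozen=True)
class Edge:
    relation: str  # relation tau, e.g. "targets", "associated_with"
    head: Node
    tail: Node

# A tiny example graph G = (V, E) with typed nodes and typed edges.
aspirin = Node("aspirin", "drug")
ptgs1 = Node("PTGS1", "protein")
V = [aspirin, ptgs1]
E = [Edge("targets", aspirin, ptgs1)]
```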

3.1 Conjunctive graph queries

Form of a query:
$q=V_?\,.\,\exists V_1,\ldots,V_m: e_1\land e_2\land\ldots\land e_n$

For example (see the figures and the illustrative query below):

(figures 1.png and 2.png: example conjunctive graph queries)
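As an illustrative (made-up) instance of this form, with hypothetical relations assoc and targets and fixed disease entities $d_1, d_2$, the query "which drugs target some protein associated with both $d_1$ and $d_2$?" can be written as:

$q = V_?\,.\,\exists V: \text{assoc}(d_1, V)\land \text{assoc}(d_2, V)\land \text{targets}(V_?, V)$

Here $V$ is the existentially quantified protein variable and $V_?$ is the target variable whose denotation answers the query.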

4. Proposed Approach

key idea : learn how to embed any conjunctive graph query into a low-dimensional space
achieved by representing logical query operations as geometric operators that are jointly optimized on a low-dimensional embedding space along with a set of node embeddings.

i.e., the node embeddings and the geometric query-operator parameters are optimized jointly

• algorithm:

(figure 3.png: the query-embedding algorithm)

Geometric projection operator, $\mathcal{P}$:

$\mathbf{q} \xrightarrow{\ \mathcal{P}(\mathbf{q},\tau)\ } \mathbf{q}'$

$\mathbf{q}'$: the new query embedding

The denotation $[q']$ is the set of nodes connected to (the denotation of) $q$ by an edge of type $\tau$.

$\mathcal{P}(\mathbf{q},\tau)=\mathbf{R}_{\tau}\mathbf{q}$, where $\mathbf{R}_{\tau}\in\mathbb{R}^{d\times d}$ is a trainable matrix (a minimal sketch is given below)

(Question: so each query is embedded as a single point?)
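A minimal numpy sketch of this projection step (my own illustration; the embedding dimension, relation names, and random initialization are assumptions, and in practice each $\mathbf{R}_{\tau}$ is learned):

```python
import numpy as np

d = 64                                    # embedding dimension (assumed)
rng = np.random.default_rng(0)
# One trainable d x d matrix per edge type tau (random here only for illustration).
R = {"targets": rng.normal(size=(d, d)),
     "assoc": rng.normal(size=(d, d))}

def project(q: np.ndarray, tau: str) -> np.ndarray:
    """P(q, tau) = R_tau @ q: embedding of 'nodes reachable from [q] via tau'."""
    return R[tau] @ q
```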

Geometric intersection operator, $\mathcal{I}$:

$\mathcal{I}(\{\mathbf{q}_1,\ldots,\mathbf{q}_n\})=\mathbf{W}_{\gamma}\,\Psi\big(\mathrm{NN}_k(\mathbf{q}_i),\ \forall i\big)$, where $\Psi$ is a permutation-invariant (symmetric) function such as an elementwise minimum or mean, $\mathrm{NN}_k$ is a $k$-layer feedforward network, and $\mathbf{W}_{\gamma}$ is a trainable matrix.
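A minimal numpy sketch of this intersection operator under my reading of the formula (random weights for illustration only; an elementwise minimum plays the role of $\Psi$ and a 2-layer ReLU network plays the role of $\mathrm{NN}_k$):

```python
import numpy as np

d = 64
rng = np.random.default_rng(1)
W = rng.normal(size=(d, d))                                  # trainable output matrix
W1, W2 = rng.normal(size=(d, d)), rng.normal(size=(d, d))    # NN_k weights (k = 2)

def nn_k(q: np.ndarray) -> np.ndarray:
    """A 2-layer feedforward network applied to a single query embedding."""
    return W2 @ np.maximum(W1 @ q, 0.0)

def intersect(qs: list[np.ndarray]) -> np.ndarray:
    """I({q_1, ..., q_n}): embedding of the conjunction of the input queries."""
    psi = np.min(np.stack([nn_k(q) for q in qs]), axis=0)     # permutation-invariant Psi
    return W @ psi
```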

4.1 Theoretical analysis

conjunctive queries $\iff$ sequences of geometric projection and intersection operations in an embedding space

4.2 Node embeddings

standard “bag-of-features” approach:

$\mathbf{z}_u=\frac{\mathbf{Z}_\gamma \mathbf{x}_u}{\|\mathbf{x}_u\|_1}$
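A minimal numpy sketch of this bag-of-features embedding (my own illustration; the feature count and random initialization are assumptions, and $\mathbf{Z}_\gamma$ is trainable in practice):

```python
import numpy as np

d, num_features = 64, 1000
rng = np.random.default_rng(2)
Z_gamma = rng.normal(size=(d, num_features))   # trainable embedding matrix for node type gamma

def node_embedding(x_u: np.ndarray) -> np.ndarray:
    """x_u: binary indicator vector of the features attached to node u.
    Dividing by the L1 norm averages the embeddings of the active features."""
    return (Z_gamma @ x_u) / np.abs(x_u).sum()
```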

4.3 Other variants of our framework

The concrete forms of $\mathcal{P}$ and $\mathcal{I}$ can be varied.

4.4 Model training

(figure 4.png: model training)
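Since the figure is not reproduced here, below is a minimal sketch of the training objective as I understand it: a max-margin loss with negative sampling, scoring a query embedding against candidate answer-node embeddings by cosine similarity. The margin value and the use of a single sampled negative are assumptions, not a verified reproduction of the paper's setup.

```python
import numpy as np

def score(q: np.ndarray, z: np.ndarray) -> float:
    """Cosine similarity between a query embedding and a node embedding."""
    return float(q @ z / (np.linalg.norm(q) * np.linalg.norm(z) + 1e-8))

def margin_loss(q: np.ndarray, z_pos: np.ndarray, z_neg: np.ndarray, margin: float = 1.0) -> float:
    """Hinge loss: push the correct answer node above a sampled negative by `margin`."""
    return max(0.0, margin - score(q, z_pos) + score(q, z_neg))
```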
