TensorLog

TensorLog (Cohen et al., 2017) is an implementation of probabilistic knowledge bases that (i) leverages deep learning to answer queries efficiently, (ii) restricts the first-order logic language to a class called p-tree-DKG, and (iii) learns weights on knowledge base facts.
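
To make the "deep learning over a knowledge base" part concrete, here is a minimal sketch in plain NumPy (not the actual TensorLog implementation; the entities, fact weights, and the query_grandparent helper are invented for illustration). The core idea is that each predicate becomes an entity-by-entity matrix of fact weights, and a rule like grandparent(X,Y) :- parent(X,Z), parent(Z,Y) compiles a query into a chain of differentiable matrix-vector products, so the fact weights can be trained by gradient descent.

```python
# A rough sketch of TensorLog's core trick in plain NumPy (not the official
# implementation): each predicate is an entity-by-entity matrix of fact
# weights, and a rule compiles a query into a chain of matrix-vector products.
import numpy as np

entities = ["abe", "homer", "bart", "lisa"]
idx = {e: i for i, e in enumerate(entities)}
n = len(entities)

# Weighted facts for the 'parent' predicate; these weights are what
# TensorLog learns by gradient descent.
parent = np.zeros((n, n))
parent[idx["abe"], idx["homer"]] = 1.0   # parent(abe, homer)
parent[idx["homer"], idx["bart"]] = 1.0  # parent(homer, bart)
parent[idx["homer"], idx["lisa"]] = 0.9  # parent(homer, lisa), weight 0.9

def query_grandparent(x):
    """Score every Y for grandparent(x, Y) under the rule
    grandparent(X, Y) :- parent(X, Z), parent(Z, Y),
    pushing a one-hot vector for x through one matrix per body literal."""
    v = np.zeros(n)
    v[idx[x]] = 1.0
    return v @ parent @ parent

scores = query_grandparent("abe")
for e in entities:
    if scores[idx[e]] > 0:
        print(f"grandparent(abe, {e}) score: {scores[idx[e]]:.2f}")
# grandparent(abe, bart) score: 1.00
# grandparent(abe, lisa) score: 0.90
```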

Logic Tensor Networks

Logic Tensor Networks (Serafini & Garcez, 2016) is a framework for learning and reasoning that combines neural-network capabilities with knowledge-base representations expressed in first-order logic.

It addresses (i) query answering with truth values in the unit interval [0, 1] (also known as fuzzy or real logic), and (ii) learning subsymbolic (vector) representations of constant symbols, function symbols, and predicates.
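
A minimal sketch of this "real logic" idea, in plain NumPy rather than the actual LTN framework (the Friends predicate, the constants, and the network shapes are made up for illustration): constants get vector embeddings, a predicate is grounded as a differentiable function into [0, 1], and connectives receive fuzzy semantics, so the truth degrees of formulas become differentiable objectives.

```python
# A rough sketch of LTN-style "real logic" in plain NumPy (not the LTN library):
# constants get vector embeddings, a predicate is grounded as a differentiable
# function into [0, 1], and connectives get fuzzy (product) semantics.
import numpy as np

rng = np.random.default_rng(0)

# Subsymbolic representations of constant symbols (learned jointly in an LTN).
embeddings = {"alice": rng.normal(size=4), "bob": rng.normal(size=4)}

# Parameters grounding the (invented) binary predicate Friends as a tiny network.
W = rng.normal(size=(8, 8))
w_out = rng.normal(size=8)

def Friends(x, y):
    """Truth degree in [0, 1] for Friends(x, y), computed from embeddings."""
    pair = np.concatenate([embeddings[x], embeddings[y]])
    h = np.tanh(pair @ W)
    return 1.0 / (1.0 + np.exp(-h @ w_out))  # sigmoid output = truth degree

# Fuzzy connectives (product t-norm and its dual).
def AND(a, b): return a * b
def OR(a, b):  return a + b - a * b
def NOT(a):    return 1.0 - a

# Truth degree of the formula: Friends(alice, bob) AND NOT Friends(bob, bob).
# In a full LTN, such degrees are maximised over embeddings and predicate
# parameters so that the learned representations satisfy the knowledge base.
print(AND(Friends("alice", "bob"), NOT(Friends("bob", "bob"))))
```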

Neural Theorem Provers

This paper (Rocktäschel & Riedel, 2017) presents Neural Theorem Provers (NTPs), a framework for fuzzy/real logic, i.e. query answering with truth values in [0, 1], via differentiable backward chaining and subsymbolic representation learning.
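
The following is a rough sketch of the soft-unification idea behind NTPs, in plain NumPy rather than the paper's implementation (the symbols, the single rule, and the helper functions are invented for illustration): symbols are compared by embedding similarity instead of exact matching, and backward chaining scores a proof as the minimum similarity along its steps, maximised over alternative variable bindings.

```python
# A rough sketch of soft unification and differentiable backward chaining
# (plain NumPy, not the NTP codebase). Symbols are compared by embedding
# similarity; a proof scores as the min over its steps, max over bindings.
import numpy as np

rng = np.random.default_rng(0)
symbols = ["grandpaOf", "grandfatherOf", "fatherOf", "abe", "homer", "bart"]
# Random stand-ins; in an NTP these embeddings are trained end to end so that
# related symbols (e.g. grandpaOf / grandfatherOf) end up close together.
emb = {s: rng.normal(size=4) for s in symbols}

def soft_match(a, b):
    """Soft-unification score in (0, 1] via an RBF kernel on embedding distance."""
    return float(np.exp(-np.linalg.norm(emb[a] - emb[b])))

# Knowledge base facts and one rule:
#   grandfatherOf(X, Y) :- fatherOf(X, Z), fatherOf(Z, Y)
facts = [("fatherOf", "abe", "homer"), ("fatherOf", "homer", "bart")]
entities = ["abe", "homer", "bart"]

def fact_score(pred, a, b):
    """Best soft-unification score of the atom pred(a, b) against any KB fact."""
    return max(min(soft_match(pred, fp), soft_match(a, fa), soft_match(b, fb))
               for (fp, fa, fb) in facts)

def rule_score(pred, a, b):
    """Backward-chain through the rule: match the query predicate to the head,
    try every binding of Z, take max over bindings of min over proof steps."""
    head = soft_match(pred, "grandfatherOf")
    return max(min(head, fact_score("fatherOf", a, z), fact_score("fatherOf", z, b))
               for z in entities)

# The query predicate 'grandpaOf' never appears in the KB, yet the proof still
# gets a nonzero score because unification is soft rather than symbolic.
print(rule_score("grandpaOf", "abe", "bart"))
```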
