MJ

Statistical relational learning

Statistical relational learning (Koller et al., 2007) is a branch of artificial intelligence (AI) devoted to integrating research in probability theory, statistics, logic and relational learning. Its main purpose is to develop learning models that handle uncertain information extracted from real-world scenarios and produce structured representations that describe objects, their attributes and their relations.

(Read more)

Markov Logic Networks

Markov Logic Networks (Richardson & Domingos, 2006), or MLNs, are a representation and query-answering framework that combines probability and first-order logic.
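As a toy illustration of the idea (the predicates, formula and weight below are my own hypothetical example, not from the article): an MLN attaches a weight to each first-order formula, and the probability of a possible world grows with the number of true groundings of each formula, P(x) ∝ exp(Σᵢ wᵢ nᵢ(x)).

```python
import itertools
import math

# Tiny domain: two people; predicates Smokes(x) and Cancer(x).
people = ["anna", "bob"]
atoms = [("Smokes", p) for p in people] + [("Cancer", p) for p in people]

# One weighted formula: Smokes(x) => Cancer(x), illustrative weight 1.5.
weight = 1.5

def n_true_groundings(world):
    # Count groundings of Smokes(x) => Cancer(x) that hold in this world.
    return sum((not world[("Smokes", p)]) or world[("Cancer", p)]
               for p in people)

def unnormalized(world):
    return math.exp(weight * n_true_groundings(world))

# Enumerate every possible world (truth assignment to the ground atoms).
worlds = [dict(zip(atoms, vals))
          for vals in itertools.product([False, True], repeat=len(atoms))]
Z = sum(unnormalized(w) for w in worlds)

def prob(world):
    return unnormalized(world) / Z
```

Worlds that satisfy more groundings of the weighted formula get exponentially higher probability; a hard logical constraint corresponds to an infinite weight.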

(Read more)

Bayesian and Markov networks

In this article I briefly explain and sketch the ideas behind Bayesian and Markov networks, two families of graphical models that represent joint probability distributions through graphs.
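A minimal sketch of the core idea for the Bayesian case (the variables and probabilities below are illustrative, not from the article): a directed graph A → B → C encodes the factorization P(A, B, C) = P(A) P(B|A) P(C|B), so the joint distribution is recovered by multiplying small local tables.

```python
import itertools

# Conditional probability tables for binary variables A -> B -> C.
p_a = {True: 0.3, False: 0.7}
p_b_given_a = {True: {True: 0.8, False: 0.2},
               False: {True: 0.1, False: 0.9}}
p_c_given_b = {True: {True: 0.5, False: 0.5},
               False: {True: 0.25, False: 0.75}}

def joint(a, b, c):
    # The graph structure dictates this factorization of the joint.
    return p_a[a] * p_b_given_a[a][b] * p_c_given_b[b][c]

# The product of the local tables defines a valid joint distribution.
total = sum(joint(a, b, c)
            for a, b, c in itertools.product([True, False], repeat=3))
```

Markov networks factorize the same way but over undirected cliques, with potential functions in place of conditional tables and a normalizing constant Z.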

(Read more)

Implicitly Learning to Reason in First-Order Logic

This paper (Belle & Juba, 2019) considers the problem of answering queries about formulas of infinitary first-order logic (with an infinite but countable number of constants), given background knowledge that is partially represented explicitly as other formulas and partially represented as examples drawn independently from a fixed probability distribution.

(Read more)

How Powerful are Graph Neural Networks?

Graph neural networks (GNNs) are a framework for learning representations of nodes and graphs. This paper (Xu et al., 2018) reviews previous architectures, analyses their representational power, and presents the Graph Isomorphism Network (GIN), a GNN that is maximally powerful under certain constraints.
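To make the aggregation step concrete (the graph and features below are my own toy example; the learned MLP applied on top is omitted): each layer updates a node by combining its own feature with an aggregate of its neighbours' features, and the paper argues that injective aggregators such as the sum are what make a GNN maximally discriminative.

```python
# Triangle graph (nodes 0, 1, 2) plus a pendant node 3, as an adjacency list.
adj = {0: [1, 2], 1: [0, 2], 2: [0, 1, 3], 3: [2]}
features = {0: 1.0, 1: 1.0, 2: 1.0, 3: 1.0}

def sum_aggregation_layer(features, adj, eps=0.0):
    # h'_v = (1 + eps) * h_v + sum of neighbour features
    # (the pre-MLP update used in GIN-style architectures).
    return {v: (1 + eps) * features[v] + sum(features[u] for u in adj[v])
            for v in adj}

h1 = sum_aggregation_layer(features, adj)
# With identical initial features, one layer already separates nodes
# by degree: the best-connected node ends up with the largest value.
```

Mean or max aggregation would map different multisets of neighbour features to the same value in some cases, which is exactly the loss of power the paper analyses.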

(Read more)