Inproceedings

Tractable Probabilistic Graph Representation Learning with Graph-Induced Sum-Product Networks

Proceedings of the International Conference on Learning Representations (ICLR 2024), Austria, May 7-11, 2024.
DOI: 10.48550/arXiv.2305.10544

Abstract

We introduce Graph-Induced Sum-Product Networks (GSPNs), a new probabilistic framework for graph representation learning that can tractably answer probabilistic queries. Inspired by the computational trees induced by vertices in the context of message-passing neural networks, we build hierarchies of sum-product networks (SPNs) where the parameters of a parent SPN are learnable transformations of the a-posteriori mixing probabilities of its children's sum units. Due to weight sharing and the tree-shaped computation graphs of GSPNs, we obtain the efficiency and efficacy of deep graph networks with the additional advantages of a probabilistic model. We show the model's competitiveness in scarce supervision scenarios, under missing data, and for graph classification in comparison to popular neural models. We complement the experiments with qualitative analyses on hyper-parameters and the model's ability to answer probabilistic queries.
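
To make the mechanism described in the abstract concrete, the following is a minimal, self-contained sketch, not the authors' implementation: each vertex evaluates a small SPN (reduced here to a mixture of diagonal Gaussians over its features) along the depth-limited computational tree induced by message passing, and a parent's mixing weights are a learnable transformation (the hypothetical matrix W below) of the a-posteriori mixing probabilities returned by its children. The toy graph, parameter shapes, and all function names are illustrative assumptions.

# Illustrative sketch of the GSPN idea (assumed names; not the paper's code).
import numpy as np

rng = np.random.default_rng(0)

n_nodes, n_feats, n_comp = 5, 3, 4
X = rng.normal(size=(n_nodes, n_feats))                    # node features
adj = {0: [1, 2], 1: [0, 3], 2: [0, 4], 3: [1], 4: [2]}    # toy undirected graph

# Shared parameters: Gaussian leaf means/variances and a learnable map W that
# turns the averaged child posteriors into the parent's (log-)mixing weights.
means = rng.normal(size=(n_comp, n_feats))
log_var = np.zeros((n_comp, n_feats))
W = rng.normal(scale=0.1, size=(n_comp, n_comp))

def gaussian_log_lik(x):
    # Log-likelihood of x under each diagonal-Gaussian component -> (n_comp,).
    # The sum over features corresponds to a product node over univariate leaves.
    var = np.exp(log_var)
    return -0.5 * np.sum(np.log(2 * np.pi * var) + (x - means) ** 2 / var, axis=1)

def log_softmax(z):
    z = z - z.max()
    return z - np.log(np.exp(z).sum())

def gspn_node(v, depth):
    # Evaluate the SPN rooted at vertex v over its depth-limited computational
    # tree; return (log-likelihood, posterior over mixture components) for v.
    if depth == 0:
        prior = log_softmax(np.zeros(n_comp))              # uniform leaf mixing weights
    else:
        # Recurse into the children of the computational tree (graph neighbours).
        child_posteriors = [gspn_node(u, depth - 1)[1] for u in adj[v]]
        msg = np.mean(child_posteriors, axis=0)
        prior = log_softmax(W @ msg)                        # learnable transformation
    joint = prior + gaussian_log_lik(X[v])                  # weighted product nodes
    log_lik = np.logaddexp.reduce(joint)                    # sum node, in log-space
    posterior = np.exp(joint - log_lik)                     # a-posteriori mixing probs
    return log_lik, posterior

for v in range(n_nodes):
    ll, post = gspn_node(v, depth=2)
    print(f"vertex {v}: log-likelihood {ll:.3f}, posterior {np.round(post, 2)}")

Because the recursion is bounded by the depth of the computational tree rather than the (possibly cyclic) graph, both the likelihood and the posteriors stay tractable, which is the property the abstract highlights for answering probabilistic queries.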
