Published January 21, 2020 | Version v1
Conference paper | Open Access

Scalable Structure Learning of Continuous-Time Bayesian Networks from Incomplete Data

  • Technische Universität Darmstadt

Description

Continuous-time Bayesian Networks (CTBNs) represent a compact yet powerful framework for understanding multivariate time-series data. Given complete data, parameters and structure can be estimated efficiently in closed form. However, if the data are incomplete, the latent states of the CTBN have to be estimated by laboriously simulating the intractable dynamics of the assumed CTBN. This becomes a problem especially for structure learning tasks, where such simulation has to be repeated for each element of a super-exponentially growing set of possible structures. In order to circumvent this notorious bottleneck, we develop a novel gradient-based approach to structure learning. Instead of sampling and scoring all possible structures individually, we assume the generator of the CTBN to be composed as a mixture of generators stemming from different structures. In this framework, structure learning can be performed via a gradient-based optimization of mixture weights. We combine this approach with a new variational method that allows for a closed-form calculation of this mixture marginal likelihood. We show the scalability of our method by learning structures of previously inaccessible sizes from synthetic and real-world data.
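
To make the mixture-of-generators idea concrete, the following is a minimal sketch (not the authors' implementation): for a single node, each candidate parent structure induces its own exit rates, the effective generator is a convex mixture of these candidates, and structure learning reduces to gradient-based optimization of the mixture weights. All rates, sufficient statistics, and variable names below are hypothetical toy values, and the paper's variational treatment of incomplete data is not reproduced here; the sketch only uses the standard complete-data CTBN likelihood.

# Toy illustration of gradient-based structure learning via a mixture of generators.
# Hypothetical example; not the method's actual objective or data.
import jax
import jax.numpy as jnp

# Two candidate structures for a binary node: exit rates q_candidates[k][state]
# k = 0: node has no parent; k = 1: node depends on a (fixed) parent state.
q_candidates = jnp.array([[1.0, 1.0],    # structure 0: uniform rates
                          [0.2, 3.0]])   # structure 1: state-dependent rates

# Toy sufficient statistics (with incomplete data these would come from the
# latent-state posterior): M[state] = number of transitions out of `state`,
# T[state] = total dwell time in `state`.
M = jnp.array([4.0, 25.0])
T = jnp.array([18.0, 9.0])

def log_likelihood(logits):
    """Complete-data CTBN log-likelihood of the mixture generator."""
    pi = jax.nn.softmax(logits)                 # mixture weights over structures
    q_mix = pi @ q_candidates                   # effective exit rate per state
    return jnp.sum(M * jnp.log(q_mix) - q_mix * T)

# Gradient-based structure learning: ascend the likelihood in the mixture weights.
logits = jnp.zeros(2)
grad_fn = jax.grad(log_likelihood)
for _ in range(200):
    logits = logits + 0.1 * grad_fn(logits)

print("learned mixture weights:", jax.nn.softmax(logits))
# The weights concentrate on the structure whose rates best explain the statistics
# (here structure 1, since M / T is close to its state-dependent rates).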

Files

8631-scalable-structure-learning-of-continuous-time-bayesian-networks-from-incomplete-data.pdf

Additional details

Funding

iPC – individualizedPaediatricCure: Cloud-based virtual-patient models for precision paediatric oncology (Grant No. 826121)
European Commission