Conference paper Open Access

Scalable Structure Learning of Continuous-Time Bayesian Networks from Incomplete Data

Linzner, Dominik; Schmidt, Michael; Koeppl, Heinz

Continuous-time Bayesian Networks (CTBNs) represent a compact yet powerful framework for understanding multivariate time-series data. Given complete data, parameters and structure can be estimated efficiently in closed form. However, if data are incomplete, the latent states of the CTBN have to be estimated by laboriously simulating the intractable dynamics of the assumed CTBN. This is a problem especially for structure learning tasks, where it has to be done for each element of a super-exponentially growing set of possible structures. In order to circumvent this notorious bottleneck, we develop a novel gradient-based approach to structure learning. Instead of sampling and scoring all possible structures individually, we assume the generator of the CTBN to be composed as a mixture of generators stemming from different structures. In this framework, structure learning can be performed via gradient-based optimization of the mixture weights. We combine this approach with a new variational method that allows for a closed-form calculation of this mixture marginal likelihood. We demonstrate the scalability of our method by learning structures of previously inaccessible sizes from synthetic and real-world data.
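The central idea of the abstract — replacing a discrete search over structures by gradient-based optimization of mixture weights over candidate generators — can be illustrated with a deliberately simplified sketch. The example below is not the paper's variational method: it uses a plain two-state Markov jump process with fully observed sufficient statistics, two hand-picked candidate generators `Q1` and `Q2` (all names and numbers are invented), and a numerical gradient for brevity. It only shows how a mixture weight over generators can be fit by gradient ascent on a complete-data log-likelihood.

```python
import numpy as np

# Illustrative sketch only (not the paper's method): fit a mixture
# weight over two candidate CTMC generators by gradient ascent on the
# complete-data log-likelihood. All matrices and statistics are made up.

# Two candidate 2-state generators (rows sum to zero).
Q1 = np.array([[-1.0, 1.0], [2.0, -2.0]])
Q2 = np.array([[-0.2, 0.2], [0.5, -0.5]])

# Synthetic complete-data sufficient statistics:
# T[i]   = total dwell time spent in state i
# M[i,j] = number of observed i -> j transitions (i != j)
T = np.array([3.0, 1.5])
M = np.array([[0.0, 4.0], [3.0, 0.0]])

def log_lik(theta):
    """Complete-data log-likelihood under the mixture generator."""
    w = 1.0 / (1.0 + np.exp(-theta))           # sigmoid mixture weight
    Q = w * Q1 + (1.0 - w) * Q2                # mixture of generators
    off = M * np.log(np.where(Q > 0, Q, 1.0))  # transition terms
    dwell = T * np.diag(Q)                     # -(exit rate) * dwell time
    return off.sum() + dwell.sum()

# Gradient ascent on the unconstrained weight parameter theta,
# using a central finite-difference gradient for brevity.
theta, lr, eps = 0.0, 0.1, 1e-6
for _ in range(200):
    g = (log_lik(theta + eps) - log_lik(theta - eps)) / (2 * eps)
    theta += lr * g

w_hat = 1.0 / (1.0 + np.exp(-theta))
print(f"learned mixture weight: {w_hat:.3f}")
```

Because the mixture weight is parameterized through a sigmoid, the optimization is unconstrained, and the same pattern extends to a softmax over many candidate structures; the paper's contribution is making the corresponding marginal likelihood tractable under incomplete data.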

