In Dynamic Bayesian Networks, time is considered discrete: in medical applications, for example, a time step can correspond to one day. Existing temporal inference algorithms process each time step sequentially, making long-term predictions computationally expensive. We present an exact, GPU-optimizable approach to prediction queries that exploits symmetries over time: in a preprocessing step, it constructs a matrix for the underlying temporal process. Additionally, for each query we construct a vector capturing the probability distribution at the current time step. Then, we time-warp into the future by matrix exponentiation. In our empirical evaluation, we show an order-of-magnitude speedup over the interface algorithm. The work-heavy preprocessing step can be done offline, and the runtime of prediction queries is significantly reduced. Therefore, we can handle application problems that could not be handled efficiently before.
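To illustrate the core idea behind the time-warp, the following minimal NumPy sketch predicts a state distribution k steps ahead by matrix exponentiation instead of stepping through each time step. This is not the paper's implementation: the toy transition matrix, the state space, and the helper names are illustrative assumptions only.

import numpy as np

# Hypothetical 3-state temporal process.
# T[i, j] = P(X_{t+1} = i | X_t = j); each column sums to 1.
T = np.array([
    [0.90, 0.20, 0.05],
    [0.08, 0.70, 0.15],
    [0.02, 0.10, 0.80],
])

# Belief (probability distribution over states) at the current time step.
p_now = np.array([1.0, 0.0, 0.0])

def predict(T, p, k):
    # p_{t+k} = T^k @ p_t; matrix_power uses exponentiation by squaring,
    # so only O(log k) matrix products are needed instead of k steps.
    return np.linalg.matrix_power(T, k) @ p

def predict_sequential(T, p, k):
    # Sequential baseline: apply the transition matrix one step at a time.
    for _ in range(k):
        p = T @ p
    return p

k = 1000
print(predict(T, p_now, k))
print(predict_sequential(T, p_now, k))  # agrees up to floating-point error

In this sketch, the matrix T plays the role of the precomputed (offline) representation of the temporal process, while p_now corresponds to the per-query vector; answering a prediction query then reduces to a matrix power applied to a vector.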