# Dynamic Bayesian networks: representation, inference and learning

@phdthesis{Murphy2002DynamicBN, title={Dynamic Bayesian networks: representation, inference and learning}, author={Kevin P. Murphy}, school={University of California, Berkeley}, year={2002} }

Dynamic Bayesian Networks: Representation, Inference and Learning, by Kevin Patrick Murphy. Doctor of Philosophy in Computer Science, University of California, Berkeley. Professor Stuart Russell, Chair.

Modelling sequential data is important in many areas of science and engineering. Hidden Markov models (HMMs) and Kalman filter models (KFMs) are popular for this because they are simple and flexible. For example, HMMs have been used for speech recognition and bio-sequence analysis, and KFMs have been…
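The abstract names HMMs and KFMs as the two baseline models the thesis generalizes. As a minimal sketch of what a KFM computes, the predict/update cycle for a 1-D random-walk state fits in a few lines; the model and the noise variances `q` and `r` below are illustrative assumptions, not values from the thesis.

```python
def kalman_step(mu, var, y, q=0.1, r=0.5):
    """One predict/update step of a 1-D Kalman filter.

    State model:   x_t = x_{t-1} + N(0, q)   (random walk)
    Observation:   y_t = x_t     + N(0, r)
    q and r are illustrative noise variances.
    """
    # Predict: propagate the Gaussian belief through the transition model.
    mu_pred, var_pred = mu, var + q
    # Update: correct the prediction with the observation via the Kalman gain.
    gain = var_pred / (var_pred + r)
    mu_new = mu_pred + gain * (y - mu_pred)
    var_new = (1 - gain) * var_pred
    return mu_new, var_new

# Filter a short synthetic observation sequence drawn near x = 1.
mu, var = 0.0, 1.0
for y in [0.9, 1.1, 1.0, 1.2]:
    mu, var = kalman_step(mu, var, y)
```

Each step shrinks the posterior variance while the mean tracks the observations; chaining such time slices together is exactly the kind of structure a DBN makes explicit.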

#### Figures, Tables, and Topics from this paper

[List of figure and table references from chapters 1–6 and appendices A–D of the thesis; captions are not reproduced here.]

#### 2,727 Citations

Statistical Inference in Graphical Models

- Computer Science
- 2008

The mathematical foundations of graphical models and statistical inference are described, focusing on the concepts and techniques that are most useful to the problem of decision making in dynamic systems under uncertainty.

Efficient Inference For Hybrid Bayesian Networks

- Computer Science
- 2007

This dissertation focuses on the hybrid Bayesian networks containing both discrete and continuous random variables and presents an approximate analytical method to estimate the performance bound, which can help the decision maker to understand the prediction performance of a BN model without extensive simulation.

Supervised Learning in Dynamic Bayesian Networks

- Computer Science
- NIPS
- 2014

This work derives supervised learning algorithms for parameter estimation and inference of latent variables in two commonly used DBN models for such time series, namely the switching vector autoregressive (SVAR) model and the switching Kalman filter (SKF).

Bayesian networks for mathematical models: Techniques for automatic construction and efficient inference

- Computer Science
- Int. J. Approx. Reason.
- 2013

By incorporating knowledge in the form of an existing ODE model, a DBN framework is built for efficiently predicting individualised patient responses using the available bedside and lab data.

Approximate inference for dynamic Bayesian networks: sliding window approach

- Computer Science
- Applied Intelligence
- 2013

A sliding-window framework for approximate inference in DBNs reduces the computational burden by introducing a window that moves forward as time progresses, so that inference at any time is restricted to a narrow region of the network.
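The idea of restricting inference to a moving region can be illustrated, in its simplest degenerate form, by recursive filtering in a toy two-state HMM: each step touches only the current belief and the newest observation, so cost and memory stay constant in the sequence length. The matrices below are assumed toy parameters, not values from the cited paper.

```python
import numpy as np

# A two-state HMM stands in for one DBN slice; parameters are toy values.
A = np.array([[0.9, 0.1],
              [0.2, 0.8]])        # A[i, j] = P(next state = j | current = i)
B = np.array([[0.8, 0.2],
              [0.3, 0.7]])        # B[i, o] = P(observation = o | state = i)

def filter_step(belief, obs):
    """Advance the belief by one time slice: predict, weight, renormalize."""
    belief = belief @ A            # predict through the transition model
    belief = belief * B[:, obs]    # weight by the observation likelihood
    return belief / belief.sum()   # renormalize to a distribution

belief = np.array([0.5, 0.5])
for obs in [0, 0, 1, 0]:           # a short synthetic observation sequence
    belief = filter_step(belief, obs)
```

A genuine sliding-window scheme generalizes this by retaining a window of several recent slices rather than a single belief, trading some accuracy for the same constant-space guarantee.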

Directly Learning Tractable Models for Sequential Inference and Decision Making

- Computer Science
- 2016

Two new probabilistic graphical models are presented: Dynamic Sum-Product Networks (DynamicSPNs) and Decision Sum-Product-Max Networks (DecisionSPMNs); the former is suitable for problems with sequence data of varying length, and the latter for problems with random, decision, and utility variables.

Learning the structure of dynamic Bayesian networks from time series and steady state measurements

- Computer Science
- Machine Learning
- 2008

Simulation results demonstrate that dynamic network structures can be learned to an extent from steady state measurements alone, and that inference from a combination of steady state and time series data has the potential to improve learning performance relative to inference from time series data alone.

Approximated Probabilistic Inference on a Dynamic Bayesian Network Using a Multistate Neural Network

- Computer Science
- 2014

This work proposes a new heuristic algorithm for probabilistic inference on the DBN using a multistate neural network, which supports a bottom-up error-reporting mechanism against top-down predictions.

A New Algorithm for Modeling and Inferring User's Knowledge by Using Dynamic Bayesian Network

- Computer Science
- 2014

A new algorithm is proposed in which both the size of the DBN and the number of its conditional probability tables (CPTs) remain unchanged as the process continues over long periods, and which solves the problems of temporary slips and lucky guesses.

Dynamic Bayesian Networks

- Computer Science
- 2002

This chapter considers more complex models of sequential data, and focuses on dynamic Bayesian networks, which can be applied to temporal models, but can also be used for sequential learning of static models, which is useful if the data is non-stationary or too large for batch methods.

#### References

Showing 1–10 of 444 references

Constant-space reasoning in dynamic Bayesian networks

- Computer Science, Mathematics
- Int. J. Approx. Reason.
- 2001

One of the main algorithms for achieving constant-space complexity in dynamic Bayesian networks, based on “slice-by-slice” elimination orders, is studied, and improvements on it are suggested based on new classes of elimination orders.

Speech Recognition with Dynamic Bayesian Networks

- Computer Science
- AAAI/IAAI
- 1998

This thesis shows that dynamic Bayesian networks can be used effectively in the field of automatic speech recognition, and presents inference routines that are especially tailored to the requirements of speech recognition: efficient inference with deterministic constraints, variable-length utterances, and online inference.

Probabilistic Independence Networks for Hidden Markov Probability Models

- Computer Science, Medicine
- Neural Computation
- 1997

It is shown that the well-known forward-backward and Viterbi algorithms for HMMs are special cases of more general inference algorithms for arbitrary PINs, and that the existence of inference and estimation algorithms for more general graphical models provides a set of analysis tools for HMM practitioners who wish to explore a richer class of HMM structures.
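As a reminder of what the HMM special case looks like, forward-backward smoothing can be written compactly as two message passes; the parameters in the usage lines are toy assumptions. (An unnormalized version like this underflows on long sequences; per-step scaling is the standard fix.)

```python
import numpy as np

def forward_backward(A, B, pi, obs):
    """Smoothed marginals P(state_t | all observations) for an HMM via a
    forward pass (alpha) and a backward pass (beta)."""
    T, S = len(obs), len(pi)
    alpha = np.zeros((T, S))
    beta = np.ones((T, S))
    alpha[0] = pi * B[:, obs[0]]
    for t in range(1, T):                        # forward messages
        alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t]]
    for t in range(T - 2, -1, -1):               # backward messages
        beta[t] = A @ (B[:, obs[t + 1]] * beta[t + 1])
    gamma = alpha * beta                         # combine and normalize
    return gamma / gamma.sum(axis=1, keepdims=True)

# Toy two-state chain: sticky transitions, moderately informative emissions.
A = np.array([[0.9, 0.1], [0.2, 0.8]])
B = np.array([[0.8, 0.2], [0.3, 0.7]])
gamma = forward_backward(A, B, np.array([0.5, 0.5]), [0, 1, 0])
```

In the PIN view, the alpha and beta recursions are instances of generic message passing on a chain-structured graph, which is what licenses the generalization to richer structures.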

Approximate Learning of Dynamic Models

- Computer Science
- NIPS
- 1998

It is shown empirically that, for a real-life domain, EM using the authors' inference algorithm is much faster than EM using exact inference, with almost no degradation in quality of the learned model.

Adaptive Probabilistic Networks with Hidden Variables

- Mathematics, Computer Science
- Machine Learning
- 2004

This paper presents a gradient-based algorithm and shows that the gradient can be computed locally, using information that is available as a byproduct of standard inference algorithms for probabilistic networks.

Factorial Hidden Markov Models

- Mathematics, Computer Science
- Machine Learning
- 2004

A generalization of HMMs is presented in which the state is factored into multiple state variables and is therefore represented in a distributed manner, along with a structured approximation in which the state variables are decoupled, yielding a tractable algorithm for learning the parameters of the model.
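The appeal of factoring the state can be seen in a simple parameter count: with M chains of K states each, a flat HMM over the joint state needs a transition table of size K^M by K^M, while the factorial model needs only one K by K matrix per chain. The numbers below are illustrative.

```python
# Transition-parameter counts: flat HMM vs. factorial HMM with M chains
# of K states each (illustrative sizes).
M, K = 3, 10
flat_states = K ** M                 # the flat model enumerates the joint state
flat_transition = flat_states ** 2   # 1000 x 1000 = 1,000,000 entries
factorial_transition = M * K ** 2    # three 10 x 10 matrices = 300 entries
```

The chains still interact through the shared observations, so exact inference remains costly, which is why the paper's structured approximation decouples them.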

Gaussian Process Networks

- Computer Science, Mathematics
- UAI
- 2000

The Bayesian score of Gaussian Process Networks is developed, it is described how to learn them from data, and empirical results are presented on artificial data as well as on real-life domains with non-linear dependencies.

Rao-Blackwellised Particle Filtering for Dynamic Bayesian Networks

- Computer Science, Mathematics
- UAI
- 2000

It is shown that Rao-Blackwellised particle filters (RBPFs) lead to more accurate estimates than standard PFs, and are demonstrated on two problems, namely non-stationary online regression with radial basis function networks and robot localization and map building.

Inference in Hybrid Networks: Theoretical Limits and Practical Algorithms

- Computer Science, Mathematics
- UAI
- 2001

This paper proves that even if the CLG is restricted to the extremely simple structure of a polytree, the inference task is NP-hard, and provides complexity results for an important class of CLGs, which includes Switching Kalman Filters.

Learning Bayesian Networks: The Combination of Knowledge and Statistical Data

- Computer Science, Mathematics
- Machine Learning
- 2004

A methodology is developed for assessing the informative priors needed for learning Bayesian networks from a combination of prior knowledge and statistical data, and it is shown how to compute the relative posterior probabilities of network structures given data.