Dynamic Bayesian Networks:
Representation, Inference and Learning
by
Kevin Patrick Murphy
B.A. Hon. (Cambridge University) 1992
M.S. (University of Pennsylvania) 1994
A dissertation submitted in partial satisfaction
of the requirements for the degree of
Doctor of Philosophy
in
Computer Science
in the
GRADUATE DIVISION
of the
UNIVERSITY OF CALIFORNIA, BERKELEY
Committee in charge:
Professor Stuart Russell, Chair
Professor Michael Jordan
Professor Peter Bickel
Professor Jeffrey Bilmes
Fall 2002
The dissertation of Kevin Patrick Murphy is approved:
University of California, Berkeley
Fall 2002
Dynamic Bayesian Networks:
Representation, Inference and Learning
Copyright 2002
by
Kevin Patrick Murphy
ABSTRACT
Dynamic Bayesian Networks:
Representation, Inference and Learning
by
Kevin Patrick Murphy
Doctor of Philosophy in Computer Science
University of California, Berkeley
Professor Stuart Russell, Chair
Modelling sequential data is important in many areas of science and engineering. Hidden Markov models
(HMMs) and Kalman filter models (KFMs) are popular for this because they are simple and flexible. For
example, HMMs have been used for speech recognition and bio-sequence analysis, and KFMs have been
used for problems ranging from tracking planes and missiles to predicting the economy. However, HMMs
and KFMs are limited in their “expressive power”. Dynamic Bayesian Networks (DBNs) generalize HMMs
by allowing the state space to be represented in factored form, instead of as a single discrete random variable.
DBNs generalize KFMs by allowing arbitrary probability distributions, not just (unimodal) linear-Gaussian.
In this thesis, I will discuss how to represent many different kinds of models as DBNs, how to perform exact
and approximate inference in DBNs, and how to learn DBN models from sequential data.
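To make the contrast concrete, the sketch below shows the standard forward (filtering) recursion for a flat HMM in pure Python. The transition and emission numbers are made up for illustration and are not from the thesis; in a DBN the single discrete state below would instead be factored into several variables, and the distributions need not be tabular.

```python
# Minimal illustrative sketch: forward filtering in a flat HMM.
# In a DBN, the hidden state would be factored into several variables
# (e.g. two binary factors instead of one 2-valued state), and the
# conditional distributions could be arbitrary, not just tables.

def hmm_filter(trans, emit, init, observations):
    """Compute the filtered belief P(X_t | y_1..t) for each t."""
    belief = list(init)
    beliefs = []
    for y in observations:
        # Predict: push the current belief through the transition model.
        n = len(belief)
        predicted = [sum(belief[i] * trans[i][j] for i in range(n))
                     for j in range(n)]
        # Update: weight each state by the likelihood of observing y.
        unnorm = [predicted[j] * emit[j][y] for j in range(n)]
        z = sum(unnorm)
        belief = [p / z for p in unnorm]
        beliefs.append(belief)
    return beliefs

# Hypothetical 2-state, 2-symbol model (numbers are illustrative only).
trans = [[0.9, 0.1], [0.2, 0.8]]  # trans[i][j] = P(X_t = j | X_{t-1} = i)
emit = [[0.7, 0.3], [0.1, 0.9]]   # emit[state][symbol]
init = [0.5, 0.5]
result = hmm_filter(trans, emit, init, [0, 0, 1])
```

Each step of the recursion costs O(K²) for K states; when the state is factored into n variables, K is exponential in n, which is what motivates the structured inference algorithms developed in the thesis.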
In particular, the main novel technical contributions of this thesis are as follows: a way of representing
Hierarchical HMMs as DBNs, which enables inference to be done in O(T) time instead of O(T^3), where
T is the length of the sequence; an exact smoothing algorithm that takes O(log T) space instead of O(T);
a simple way of using the junction tree algorithm for online inference in DBNs; new complexity bounds
on exact online inference in DBNs; a new deterministic approximate inference algorithm called factored
frontier; an analysis of the relationship between the BK algorithm and loopy belief propagation; a way of
applying Rao-Blackwellised particle filtering to DBNs in general, and the SLAM (simultaneous localization
and mapping) problem in particular; a way of extending the structural EM algorithm to DBNs; and a variety
of different applications of DBNs. However, perhaps the main value of the thesis is its catholic presentation
of the field of sequential data modelling.
DEDICATION
To my parents
for letting me pursue my dream
for so long
so far away from home
&
To my wife
for giving me
new dreams to pursue