Title page for ETD etd-12212004-191037

Type of Document Dissertation
Author Moscovich, Luis G.
Author's Email Address lmoscov@lsu.edu
URN etd-12212004-191037
Title Learning Discrete Hidden Markov Models from State Distribution Vectors
Degree Doctor of Philosophy (Ph.D.)
Department Computer Science
Advisory Committee
Jianhua Chen, Committee Chair
Daniel C. Cohen, Committee Member
Donald H. Kraft, Committee Member
John M. Tyler, Committee Member
Sukhamay Kundu, Committee Member
Ahmed A. El-Amawy, Dean's Representative
Keywords
  • hidden Markov models
  • computational learning theory
  • machine learning
Date of Defense 2004-12-03
Availability unrestricted
Hidden Markov Models (HMMs) are probabilistic models that have been widely applied to a number of fields since their inception in the late 1960s. Computational biology, image processing, and signal processing are but a few of the application areas of HMMs.

In this dissertation, we develop several new efficient algorithms for learning HMM parameters.

First, we propose a new polynomial-time algorithm for supervised learning of the parameters of a first order HMM from a state probability distribution (SD) oracle.

The SD oracle provides the learner with the state distribution vector corresponding to a query string. We prove the correctness of the algorithm and establish the conditions under which it is guaranteed to construct a model that exactly matches the oracle's target HMM. We also conduct a simulation experiment to test the viability of the algorithm. Furthermore, the SD oracle is proven to be necessary for polynomial-time learning in the following sense: the consistency problem for HMMs, in which the learner receives a fixed training set of state distribution vectors (such as those the SD oracle would provide) but cannot query on arbitrary strings, is NP-complete.
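To make the oracle's output concrete, one plausible reading of "the state distribution vector corresponding to a query string" is the filtered (normalized forward) distribution over hidden states after the string has been observed. The sketch below computes that vector for a small discrete HMM; the parameter values, the function name, and the filtering interpretation are illustrative assumptions, not the dissertation's exact formulation.

```python
import numpy as np

# Hypothetical discrete HMM parameters, chosen only for illustration:
# pi - initial state distribution, shape (N,)
# A  - transition matrix, A[i, j] = P(next state j | current state i)
# B  - emission matrix,   B[i, k] = P(symbol k | state i)
pi = np.array([0.6, 0.4])
A = np.array([[0.7, 0.3],
              [0.2, 0.8]])
B = np.array([[0.9, 0.1],
              [0.3, 0.7]])

def state_distribution(query, pi, A, B):
    """Return the state distribution vector after observing `query`
    (a sequence of symbol indices), via the normalized forward recursion."""
    alpha = pi * B[:, query[0]]          # joint P(state, first symbol)
    alpha /= alpha.sum()                 # condition on the observation
    for sym in query[1:]:
        alpha = (alpha @ A) * B[:, sym]  # propagate one step, weight by emission
        alpha /= alpha.sum()             # renormalize to a distribution
    return alpha

vec = state_distribution([0, 1, 1], pi, A, B)
print(vec)  # a probability vector over the two hidden states
```

An SD oracle, in this reading, would answer each query string with such a vector for its hidden target HMM, and the learner's task is to recover parameters consistent with those answers.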

Next, we define helpful distributions on an instance set of strings under which polynomial-time HMM learning from state distribution vectors is feasible in the absence of an SD oracle, and we propose a new PAC-learning algorithm for HMM parameters under such distributions. The PAC-learning algorithm ensures, with high probability, that the HMM parameters can be learned from training examples alone, without asking queries.

Finally, we propose a hybrid learning algorithm for approximating HMM parameters from a dataset composed of strings and their corresponding state distribution vectors, and we provide supporting experimental data indicating that our hybrid algorithm produces more accurate approximations than the existing method.

Filename: Moscovich_dis.pdf (504.74 Kb)

