
Type of Document: Dissertation
Author: Moscovich, Luis G.
Author's Email Address: lmoscov@lsu.edu
URN: etd-12212004-191037
Title: Learning Discrete Hidden Markov Models from State Distribution Vectors
Degree: Doctor of Philosophy (Ph.D.)
Department: Computer Science
Advisory Committee:
 Jianhua Chen, Committee Chair
 Daniel C. Cohen, Committee Member
 Donald H. Kraft, Committee Member
 John M. Tyler, Committee Member
 Sukhamay Kundu, Committee Member
 Ahmed A. ElAmawy, Dean's Representative
Keywords:
 hidden Markov models
 computational learning theory
 machine learning
Date of Defense: 2004-12-03
Availability: unrestricted
Abstract:
Hidden Markov Models (HMMs) are probabilistic models that have been widely applied to a number of fields since their inception in the late 1960s. Computational Biology, Image Processing, and Signal Processing are but a few of the application areas of HMMs. In this dissertation, we develop several new efficient learning algorithms for learning HMM parameters.
First, we propose a new polynomial-time algorithm for supervised learning of the parameters of a first-order HMM from a state probability distribution (SD) oracle.
The SD oracle provides the learner with the state distribution vector corresponding to a query string. We prove the correctness of the algorithm and establish the conditions under which it is guaranteed to construct a model that exactly matches the oracle's target HMM. We also conduct a simulation experiment to test the viability of the algorithm. Furthermore, we prove that the SD oracle is necessary for polynomial-time learning in the following sense: the consistency problem for HMMs, in which the learner is given a training set of state distribution vectors such as those provided by the SD oracle but cannot query on arbitrary strings, is NP-complete.
Next, we define helpful distributions on an instance set of strings for which polynomial-time HMM learning from state distribution vectors is feasible in the absence of an SD oracle, and we propose a new PAC-learning algorithm for HMM parameters under helpful distributions. The PAC-learning algorithm ensures with high probability that HMM parameters can be learned from training examples without asking queries.
Furthermore, we propose a hybrid learning algorithm for approximating HMM parameters from a dataset composed of strings and their corresponding state distribution vectors, and we provide experimental evidence that our hybrid algorithm produces more accurate approximations than the existing method.
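The state distribution vector that the abstract refers to can be illustrated with a small example. Below is a minimal sketch of how such a vector might be computed for a query string from a known HMM, assuming it is the normalized forward-probability vector over hidden states after the final symbol; the 2-state, 2-symbol model parameters are purely hypothetical and not taken from the dissertation.

```python
import numpy as np

# Hypothetical 2-state, 2-symbol HMM (illustrative parameters only).
pi = np.array([0.6, 0.4])        # initial state distribution
A = np.array([[0.7, 0.3],        # transition matrix: A[i, j] = P(j | i)
              [0.2, 0.8]])
B = np.array([[0.9, 0.1],        # emission matrix: B[state, symbol]
              [0.3, 0.7]])

def state_distribution(query, pi, A, B):
    """Normalized forward-probability vector over hidden states
    after observing `query` (a sequence of symbol indices)."""
    alpha = pi * B[:, query[0]]          # forward probabilities at t = 0
    for sym in query[1:]:
        alpha = (alpha @ A) * B[:, sym]  # standard forward recursion
    return alpha / alpha.sum()           # normalize to a distribution

vec = state_distribution([0, 1, 1], pi, A, B)
print(vec)  # entries are non-negative and sum to 1
```

An SD oracle, in this picture, would answer each query string with such a vector for its hidden target HMM; the learning problem is to recover pi, A, and B from those answers.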
Files
Filename: Moscovich_dis.pdf
Size: 504.74 Kb
Approximate Download Time (Hours:Minutes:Seconds):
 28.8 Modem: 00:02:20
 56K Modem: 00:01:12
 ISDN (64 Kb): 00:01:03
 ISDN (128 Kb): 00:00:31
 Higher-speed Access: 00:00:02