HMM hidden Markov model - for learning time-dependent structures
A=HMM(H) returns a hidden Markov model object. It is based on code from a
lecture by Sam Roweis.
The data is assumed to be a cell array of sequences, or a matrix with
positive integers as entries and zeros padding the positions where a
sequence has no entry.
Example:
These representations are equivalent:
Y{1} = [1,2,3,1,4,1]
Y{2} = [1,1,1,3]
and
Y = [1,2,3,1,4,1;1,1,1,3,0,0]
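The two representations above can be converted into each other, with zeros acting as padding. A small sketch (Python is used here for illustration; the helper names are not part of the toolbox):

```python
def to_matrix(seqs):
    """Pack a list of symbol sequences into one matrix,
    padding short rows with 0 (0 marks 'no entry')."""
    width = max(len(s) for s in seqs)
    return [list(s) + [0] * (width - len(s)) for s in seqs]

def to_cells(mat):
    """Unpack the padded matrix back into a list of sequences,
    dropping the trailing zero padding (symbols are positive)."""
    return [[x for x in row if x != 0] for row in mat]

Y = [[1, 2, 3, 1, 4, 1], [1, 1, 1, 3]]
M = to_matrix(Y)           # [[1,2,3,1,4,1], [1,1,1,3,0,0]]
assert to_cells(M) == Y
```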
The test method computes the most probable sequence of states for a
given symbol sequence, together with its loglikelihood.
Hyperparameters and their defaults
states = 2 -- number of states
alphabet = [] -- indices which refer to cell array components that are
the characters of the alphabet.
The training sequences have to be vectors of indices.
tol = 1e-5 -- convergence criterion: the fractional change of the
loglikelihood below which the iteration stops
A = 2 -- number of states, or an initial matrix of transition probabilities
updateflags = [1,1,1] -- controls the update of parameters
it is a three-vector whose elements
control the updating of [A,pi,B]
nonzero elements mean update that parameter
maxiter = 100 -- maximum number of iterations for the EM algorithm
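The interaction of tol and maxiter is the usual EM stopping rule: iterate until the fractional change of the loglikelihood falls below tol, or maxiter is reached. A generic sketch (hypothetical names, not the toolbox code; the dummy step below just stands in for one EM pass):

```python
def run_em(em_step, loglik0, tol=1e-5, maxiter=100):
    """Generic EM driver: collect the loglikelihood curve LL and stop
    when the fractional change drops below tol or maxiter is hit."""
    LL = [loglik0]
    for _ in range(maxiter):
        LL.append(em_step())
        if abs(LL[-1] - LL[-2]) < tol * abs(LL[-2]):
            break
    return LL

# Dummy EM step: loglikelihood climbs geometrically toward -10
# (stands in for one E+M pass over the training sequences).
state = {"ll": -40.0}
def step():
    state["ll"] = -10.0 + 0.5 * (state["ll"] + 10.0)  # halve the gap to -10
    return state["ll"]

curve = run_em(step, -40.0)  # monotonically increasing, stops before maxiter
```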
Model
alpha.Y -- observation emission probabilities
alpha.X -- state transition probabilities
pi -- initial state prior probabilities
LL -- log likelihood curve
M -- number of symbols
Methods:
train, test
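What the test method computes can be sketched as standard Viterbi decoding in log space (Python for illustration; the function and variable names are not the toolbox API, and symbols here are 0-based):

```python
import math

def viterbi(y, pi, A, B):
    """Most probable state path for symbol sequence y, plus its
    log-probability.  pi: initial state probs, A[i][j]: transition
    prob i->j, B[i][k]: prob of emitting symbol k in state i."""
    n = len(pi)
    lp = [math.log(pi[i]) + math.log(B[i][y[0]]) for i in range(n)]
    back = []
    for t in range(1, len(y)):
        ptr, new = [], []
        for j in range(n):
            best = max(range(n), key=lambda i: lp[i] + math.log(A[i][j]))
            ptr.append(best)
            new.append(lp[best] + math.log(A[best][j]) + math.log(B[j][y[t]]))
        lp = new
        back.append(ptr)
    path = [max(range(n), key=lambda j: lp[j])]
    for ptr in reversed(back):       # follow back-pointers to the start
        path.append(ptr[path[-1]])
    path.reverse()
    return path, max(lp)

# toy two-state model
pi = [0.6, 0.4]
A = [[0.7, 0.3], [0.4, 0.6]]
B = [[0.9, 0.1], [0.2, 0.8]]
states, ll = viterbi([0, 0, 1, 1], pi, A, B)  # states -> [0, 0, 1, 1]
```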
Reference : A tutorial on Hidden Markov Models and Selected Applications in Speech Recognition
Author    : Lawrence R. Rabiner