Continuous-time Hidden Markov Models (CHMMs) have been applied in many technical areas, including network performance evaluation, risk assessment, flash-memory workload analysis, and many other settings. A CHMM has a finite number of states and events, but the state and/or the observed event can change at any time; it should not be confused with a continuous-state Hidden Markov Model (where the states form a continuum) or a continuous-event Hidden Markov Model (where the events form a continuum).
Baum-Welch (BW) training is an expectation-maximization (EM) algorithm for discrete Hidden Markov Models. Given a sequence of observed events, Baum-Welch training finds Hidden Markov Model parameters that locally maximize the likelihood of those events under the model. No tractable (polynomial-time) algorithm is known for the global maximization problem.
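As a rough sketch of how one Baum-Welch iteration works on a discrete HMM, the following NumPy code alternates an E-step (scaled forward-backward passes producing expected state and transition counts) with an M-step (re-estimating the initial, transition, and emission probabilities from those counts). The function name `baum_welch`, the random initialization, and the toy observation sequence are illustrative assumptions, not taken from the text above.

```python
import numpy as np

def baum_welch(obs, n_states, n_symbols, n_iter=20, seed=0):
    """Illustrative Baum-Welch (EM) training for a discrete HMM.

    obs: sequence of observed symbol indices in range(n_symbols).
    Returns (pi, A, B): initial, transition, and emission probabilities.
    """
    obs = np.asarray(obs)
    T = len(obs)
    rng = np.random.default_rng(seed)
    # Random row-stochastic initialization (an assumed starting point).
    pi = rng.random(n_states); pi /= pi.sum()
    A = rng.random((n_states, n_states)); A /= A.sum(axis=1, keepdims=True)
    B = rng.random((n_states, n_symbols)); B /= B.sum(axis=1, keepdims=True)

    for _ in range(n_iter):
        # E-step: scaled forward pass (alpha) to avoid numerical underflow.
        alpha = np.zeros((T, n_states)); c = np.zeros(T)
        alpha[0] = pi * B[:, obs[0]]
        c[0] = alpha[0].sum(); alpha[0] /= c[0]
        for t in range(1, T):
            alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t]]
            c[t] = alpha[t].sum(); alpha[t] /= c[t]
        # Scaled backward pass (beta), reusing the forward scaling factors.
        beta = np.zeros((T, n_states)); beta[-1] = 1.0
        for t in range(T - 2, -1, -1):
            beta[t] = (A @ (B[:, obs[t + 1]] * beta[t + 1])) / c[t + 1]
        # Expected state occupancies gamma_t(i) and transition counts xi(i, j).
        gamma = alpha * beta
        gamma /= gamma.sum(axis=1, keepdims=True)
        xi = np.zeros((n_states, n_states))
        for t in range(T - 1):
            x = alpha[t][:, None] * A * (B[:, obs[t + 1]] * beta[t + 1])[None, :]
            xi += x / x.sum()
        # M-step: re-estimate parameters from the expected counts.
        pi = gamma[0]
        A = xi / gamma[:-1].sum(axis=0)[:, None]
        B = np.zeros_like(B)
        for k in range(n_symbols):
            B[:, k] = gamma[obs == k].sum(axis=0)
        B /= gamma.sum(axis=0)[:, None]
    return pi, A, B
```

Because each iteration only climbs toward a local optimum of the likelihood, the result depends on the random initialization; in practice several restarts with different seeds are common.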
Also associated with Hidden Markov Models are the Baldi-Chauvin algorithm, an online smoothing expectation-maximization algorithm, and the Viterbi algorithm, which finds the most likely sequence of hidden states given a Hidden Markov Model and a sequence of events generated from it.
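A minimal sketch of Viterbi decoding for a discrete HMM is shown below: dynamic programming in log space tracks, for each time step and state, the probability of the best path ending in that state, then backtracks to recover the most likely hidden-state sequence. The function name `viterbi` and the two-state toy model in the usage note are illustrative assumptions.

```python
import numpy as np

def viterbi(obs, pi, A, B):
    """Most likely hidden-state path for a discrete HMM, in log space.

    obs: sequence of observed symbol indices.
    pi: initial state probabilities (N,); A: transitions (N, N);
    B: emission probabilities (N, n_symbols). Returns a list of state indices.
    """
    T, N = len(obs), len(pi)
    logpi, logA, logB = np.log(pi), np.log(A), np.log(B)
    delta = np.zeros((T, N))          # best log-probability ending in each state
    psi = np.zeros((T, N), dtype=int)  # backpointers to the previous state
    delta[0] = logpi + logB[:, obs[0]]
    for t in range(1, T):
        scores = delta[t - 1][:, None] + logA  # scores[i, j]: come from i, go to j
        psi[t] = scores.argmax(axis=0)
        delta[t] = scores.max(axis=0) + logB[:, obs[t]]
    # Backtrack from the best final state.
    path = [int(delta[-1].argmax())]
    for t in range(T - 1, 0, -1):
        path.append(int(psi[t][path[-1]]))
    return path[::-1]
```

For example, with two "sticky" states that each strongly prefer emitting their own symbol (self-transition 0.9, emission of the matching symbol 0.9), the observation sequence `[0, 0, 0, 1, 1, 1]` decodes to the state path `[0, 0, 0, 1, 1, 1]`.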