The paper introduces the Hierarchical Hidden Markov Model (HHMM), a generalization of Hidden Markov Models (HMMs) that captures the complex multi-scale structure found in natural sequences such as language, handwriting, and speech. The authors motivate HHMMs by the need to model correlations at different length scales and by the recursive nature of such sequences. They generalize the standard Baum-Welch algorithm to estimate HHMM parameters from unlabeled data and show how the trained model can automatically produce hierarchical parses of observation sequences. Two applications are discussed: building hierarchical models of natural English text and identifying repeated strokes in cursive handwriting. The paper concludes by highlighting the potential of HHMMs for modeling complex sequences and handling their statistical inhomogeneities.
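
To make the hierarchical state structure concrete, here is a minimal generative sketch in Python. All names and probabilities are illustrative assumptions, not taken from the paper; it only mirrors the core HHMM idea that internal states activate a sub-HMM over their children, production (leaf) states emit symbols, and control returns to the parent when a sub-HMM terminates. Termination is modeled here as a per-child Bernoulli end probability, a simplification of the paper's dedicated end states.

```python
import random

class ProductionState:
    """Leaf state: emits one observation symbol per visit."""
    def __init__(self, emissions):
        self.emissions = emissions  # dict: symbol -> probability

    def emit(self):
        symbols, probs = zip(*self.emissions.items())
        return random.choices(symbols, probs)[0]

class InternalState:
    """Non-leaf state: runs a sub-HMM over its child states."""
    def __init__(self, children, initial, transitions, end_probs):
        self.children = children        # child states (leaf or internal)
        self.initial = initial          # P(enter via child i)
        self.transitions = transitions  # transitions[i][j] = P(child j | child i)
        self.end_probs = end_probs      # end_probs[i] = P(return to parent | child i)

    def generate(self, out):
        """Run the sub-HMM until it ends, appending emissions to out."""
        i = random.choices(range(len(self.children)), self.initial)[0]
        while True:
            child = self.children[i]
            if isinstance(child, ProductionState):
                out.append(child.emit())
            else:
                child.generate(out)  # recurse into the deeper level
            if random.random() < self.end_probs[i]:
                return               # sub-HMM terminates; control returns to parent
            i = random.choices(range(len(self.children)), self.transitions[i])[0]

# Hypothetical two-level hierarchy generating short letter sequences.
a = ProductionState({"a": 0.9, "b": 0.1})
b = ProductionState({"b": 0.8, "a": 0.2})
word = InternalState([a, b], [1.0, 0.0], [[0.0, 1.0], [1.0, 0.0]], [0.1, 0.3])
root = InternalState([word], [1.0], [[1.0]], [0.5])

seq = []
root.generate(seq)
print("".join(seq))
```

Because activation is recursive, each call stack of `generate` corresponds to one root-to-leaf path in the hierarchy, which is exactly the kind of multi-scale structure the generalized Baum-Welch procedure is designed to recover from flat, unlabeled observation sequences.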