We propose a hierarchical EM algorithm for simplifying a finite mixture model into a reduced mixture model with fewer mixture components. The reduced model is obtained by maximizing a variational lower bound on the expected log-likelihood of a set of virtual samples. We develop three applications for our mixture simplification algorithm: recursive Bayesian filtering using Gaussian mixture model posteriors, kernel density estimate (KDE) reduction, and belief propagation without sampling. For recursive Bayesian filtering, we propose an efficient algorithm for approximating an arbitrary likelihood function as a sum of scaled Gaussians. Experiments on synthetic data, human location modeling, visual tracking, and vehicle self-localization show that our algorithm is broadly applicable to probabilistic data analysis and is more accurate than other mixture simplification methods.
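To make the reduction step concrete, the following is a minimal sketch of hierarchical EM for compressing a Gaussian mixture, written under stated assumptions: the E-step uses the closed-form expected log-likelihood of each base component's virtual samples under each reduced component, and the M-step re-estimates the reduced mixture from the resulting soft assignments. Function names, the initialization strategy, and the choice of `n_virtual` are illustrative, not the paper's exact implementation.

```python
# Hedged sketch: hierarchical EM reduction of a Gaussian mixture with M
# components (pi, mus, Sigmas) into K components (w, m, S).
import numpy as np

def log_gauss(mu, m, S):
    """log N(mu | m, S) for a single point mu."""
    d = mu.shape[0]
    diff = mu - m
    _, logdet = np.linalg.slogdet(S)
    maha = diff @ np.linalg.solve(S, diff)
    return -0.5 * (d * np.log(2 * np.pi) + logdet + maha)

def hem_reduce(pi, mus, Sigmas, K, n_virtual=1000, iters=50, seed=0):
    """Reduce a GMM with M components to a GMM with K components."""
    rng = np.random.default_rng(seed)
    M, d = mus.shape
    # Initialize the reduced model from K randomly chosen base components.
    idx = rng.choice(M, size=K, replace=False)
    w, m, S = np.full(K, 1.0 / K), mus[idx].copy(), Sigmas[idx].copy()
    n_i = n_virtual * pi                      # virtual samples per base component
    for _ in range(iters):
        # E-step: expected log-likelihood of component i's virtual samples
        # under reduced component j,
        # E[log N(x | m_j, S_j)] = log N(mu_i | m_j, S_j) - 0.5 tr(S_j^{-1} Sigma_i)
        ell = np.empty((M, K))
        for j in range(K):
            Sj_inv = np.linalg.inv(S[j])
            for i in range(M):
                ell[i, j] = (log_gauss(mus[i], m[j], S[j])
                             - 0.5 * np.trace(Sj_inv @ Sigmas[i]))
        logit = np.log(w)[None, :] + n_i[:, None] * ell
        logit -= logit.max(axis=1, keepdims=True)        # numerical stability
        h = np.exp(logit)
        h /= h.sum(axis=1, keepdims=True)                # responsibilities h_ij
        # M-step: re-estimate weights, means, and covariances of the reduced mixture.
        wgt = h * pi[:, None]
        w = wgt.sum(axis=0) / wgt.sum()
        for j in range(K):
            z = wgt[:, j] / wgt[:, j].sum()
            m[j] = z @ mus
            diff = mus - m[j]
            S[j] = np.einsum('i,ijk->jk', z, Sigmas) + (z[:, None] * diff).T @ diff
    return w, m, S
```

Operating on the base components' sufficient statistics rather than on raw data is what keeps the reduction cheap: the cost per iteration is O(MK) Gaussian evaluations, independent of the number of virtual samples.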
We next present an application of the hierarchical EM algorithm to analyzing eye movement data in psychological research, namely Eye Movement analysis with Hidden Markov Models (EMHMM). This approach provides data-driven, quantitative measures of eye movement pattern similarity based on both the temporal and spatial dimensions of eye movements. More specifically, each individual's eye movement pattern is summarized using a hidden Markov model (HMM), including person-specific regions of interest (ROIs) and transition probabilities among these ROIs. Individual HMMs are clustered to discover common patterns, and differences among individual patterns are quantified through similarity measures between each individual's data and the common patterns. We will first summarize how we applied EMHMM to face recognition research and the discoveries it has enabled that other methods had not revealed. We will then introduce new methodologies for tasks involving cognitive state changes and stimuli with different feature layouts.
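As an illustration of the per-participant modelling and similarity steps, the sketch below uses the hmmlearn library as a stand-in: an HMM with Gaussian emissions is fit to a participant's fixation sequences (hidden states play the role of person-specific ROIs), and the participant's data are placed on a scale between two common-pattern HMMs via their log-likelihoods. The names `fixations`, `n_rois`, `hmm_A`, and `hmm_B` are illustrative assumptions; the clustering of individual HMMs into common patterns in EMHMM uses variational hierarchical EM and is not reproduced here.

```python
# Hedged sketch of EMHMM-style per-participant modelling and similarity scoring.
from hmmlearn import hmm

def fit_participant_hmm(fixations, lengths, n_rois=3, seed=0):
    """Fit an HMM whose hidden states act as person-specific ROIs.

    fixations: (T, 2) array of x/y fixation locations, concatenated over trials.
    lengths:   list with the number of fixations in each trial.
    """
    model = hmm.GaussianHMM(n_components=n_rois, covariance_type="full",
                            n_iter=100, random_state=seed)
    model.fit(fixations, lengths)
    return model

def ab_scale(fixations, lengths, hmm_A, hmm_B):
    """Place one participant's data on a scale between two common patterns.

    Returns a value in [-1, 1]: positive means the data are better explained
    by pattern A, negative means pattern B.
    """
    ll_A = hmm_A.score(fixations, lengths)
    ll_B = hmm_B.score(fixations, lengths)
    return (ll_A - ll_B) / (abs(ll_A) + abs(ll_B))
```

A scalar of this kind is what allows eye movement patterns to be correlated with behavioral or cognitive measures across participants.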