Haiping Li, Fangxin Chen, et al.
ICASSP 2003
This paper presents a new probability-table compression method based on mixture models, applied to N-tuple recognizers. Joint probability tables are modeled by lower-dimensional probability mixtures and their mixture coefficients. The maximum-likelihood parameters of the mixture models are trained with the Expectation-Maximization (EM) algorithm and quantized to one-byte integers. Probability elements that the mixture models do not estimate reliably are stored separately. Experimental results on on-line handwritten UNIPEN digits show that the total memory size of an N-tuple recognizer is reduced from 11.8 MB to 0.55 MB, while the recognition rate drops only from 97.7% to 97.5%.
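A minimal sketch of the core idea (not the paper's exact formulation): a 2-D joint probability table is approximated by a K-component mixture of products of 1-D distributions, the mixture is fit to the table by EM, and the resulting parameters are quantized to one-byte integers. The table shape, K, and iteration count here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy joint probability table standing in for an N-tuple recognizer's table.
P = rng.random((16, 16))
P /= P.sum()

K = 4  # number of mixture components (assumption for illustration)

# Parameters: mixture weights w[k] and 1-D component distributions a[k, x], b[k, y].
w = np.full(K, 1.0 / K)
a = rng.random((K, P.shape[0])); a /= a.sum(axis=1, keepdims=True)
b = rng.random((K, P.shape[1])); b /= b.sum(axis=1, keepdims=True)

for _ in range(200):
    # E-step: responsibility of component k for each (x, y) cell.
    q = w[:, None, None] * a[:, :, None] * b[:, None, :]   # shape (K, X, Y)
    q /= q.sum(axis=0, keepdims=True)
    # M-step: re-estimate parameters from the table weighted by responsibilities.
    Nk = (q * P[None]).sum(axis=(1, 2))
    w = Nk
    a = (q * P[None]).sum(axis=2) / Nk[:, None]
    b = (q * P[None]).sum(axis=1) / Nk[:, None]

# Reconstructed (approximate) joint table from the mixture.
approx = np.einsum('k,kx,ky->xy', w, a, b)

# Quantize the component distributions to one-byte integers for storage.
qa = np.round(a * 255).astype(np.uint8)
qb = np.round(b * 255).astype(np.uint8)
```

Storage drops from X*Y probabilities to K*(1 + X + Y) one-byte parameters; cells the mixture fits poorly would, per the paper, be kept separately.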
Juan M. Huerta, David Lubensky
ICASSP 2003
G. Iyengar, H.J. Nock, et al.
ICASSP 2003
Fabrizio Petrini, Gordon Fossum, et al.
IPDPS 2007