This paper presents a class of statistical models that integrate two statistical modeling paradigms in the literature: (I) descriptive methods, such as Markov random fields and minimax entropy learning (Zhu, S.C., Wu, Y.N., and Mumford, D. 1997. Neural Computation, 9(8)), and (II) generative methods, such as principal component analysis, independent component analysis (Bell, A.J. and Sejnowski, T.J. 1997. Vision Research, 37:3327-3338), transformed component analysis (Frey, B. and Jojic, N. 1999. ICCV), wavelet coding (Mallat, S. and Zhang, Z. 1993. IEEE Trans. on Signal Processing, 41:3397-3415; Chen, S., Donoho, D., and Saunders, M.A. 1999. Journal on Scientific Computing, 20(1):33-61), and sparse coding (Olshausen, B.A. and Field, D.J. 1996. Nature, 381:607-609; Lewicki, M.S. and Olshausen, B.A. 1999. JOSA A, 16(7):1587-1601). We demonstrate the integrated framework by constructing a class of hierarchical models for texton patterns (the term "texton" was coined by the psychologist Julesz in the early 1980s). At the bottom level of the model, an observed texture image is assumed to be generated by multiple hidden "texton maps", where the textons on each map are translated, scaled, stretched, and oriented versions of a window function, akin to mini-templates or wavelet bases. The texton maps generate the observed image by occlusion or linear superposition, so this bottom level of the model is generative in nature. At the top level of the model, the spatial arrangements of the textons within the texton maps are characterized by the minimax entropy principle, which leads to embellished versions of Gibbs point process models (Stoyan, D., Kendall, W.S., and Mecke, J. 1985. Stochastic Geometry and its Applications); this top level of the model is descriptive in nature. We demonstrate the integrated model with a set of experiments.
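For concreteness, the following is a minimal sketch of the bottom-level generative process: textons are placed on an image lattice as translated, scaled, stretched, and oriented copies of a window function, then combined by linear superposition or a simple occlusion rule. The Gaussian window, the texton parameterization, and the thresholded occlusion rule are illustrative assumptions, not the paper's exact choices.

```python
# Illustrative sketch of the bottom (generative) level; the Gaussian window
# and the crude occlusion rule below are assumptions for demonstration only.
import numpy as np

def gaussian_window(size, sigma_x, sigma_y, theta):
    """Oriented, anisotropic Gaussian template (one hypothetical window choice)."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    # Rotate coordinates by theta, then scale each axis (stretch/orientation).
    xr = x * np.cos(theta) + y * np.sin(theta)
    yr = -x * np.sin(theta) + y * np.cos(theta)
    return np.exp(-0.5 * ((xr / sigma_x) ** 2 + (yr / sigma_y) ** 2))

def render(textons, shape, mode="superpose"):
    """Render an image from one texton map.

    textons: list of (row, col, size, sigma_x, sigma_y, theta, amplitude),
             with size odd. mode: 'superpose' or 'occlude'.
    """
    img = np.zeros(shape)
    for (r, c, size, sx, sy, th, a) in textons:
        w = a * gaussian_window(size, sx, sy, th)
        half = size // 2
        # Clip the template's support to the image boundary.
        r0, r1 = max(r - half, 0), min(r + half + 1, shape[0])
        c0, c1 = max(c - half, 0), min(c + half + 1, shape[1])
        wr0, wc0 = r0 - (r - half), c0 - (c - half)
        patch = w[wr0:wr0 + (r1 - r0), wc0:wc0 + (c1 - c0)]
        if mode == "superpose":
            img[r0:r1, c0:c1] += patch  # linear superposition
        else:
            # Crude occlusion: a later texton overwrites pixels it covers.
            img[r0:r1, c0:c1] = np.where(patch > 0.1, patch, img[r0:r1, c0:c1])
    return img

# Example usage: 40 randomly placed, randomly oriented elongated textons.
rng = np.random.default_rng(0)
textons = [(int(r), int(c), 15, 4.0, 1.5, th, 1.0)
           for r, c, th in zip(rng.integers(0, 128, 40),
                               rng.integers(0, 128, 40),
                               rng.uniform(0, np.pi, 40))]
image = render(textons, (128, 128), mode="superpose")
```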
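At the top level, a sketch of the Gibbs point process view: a texton configuration is assigned an energy by summing pairwise potentials over nearby texton centers, and its probability is proportional to exp(-energy). The soft-core potential and the interaction radius r_max below are hypothetical stand-ins for the potentials that minimax entropy learning would select.

```python
# Illustrative Gibbs point process energy; the specific potential and cutoff
# are assumptions, standing in for learned minimax-entropy potentials.
import numpy as np

def gibbs_energy(points, potential, r_max=30.0):
    """Sum a pairwise potential over texton pairs closer than r_max.

    points: (n, 2) array of texton centers; potential: callable on distances.
    p(points) is proportional to exp(-gibbs_energy(points)).
    """
    pts = np.asarray(points, dtype=float)
    diff = pts[:, None, :] - pts[None, :, :]
    dist = np.hypot(diff[..., 0], diff[..., 1])
    iu = np.triu_indices(len(pts), k=1)  # each unordered pair counted once
    d = dist[iu]
    return potential(d[d < r_max]).sum()

# Example: a soft-core potential penalizing textons closer than ~5 pixels,
# which makes tightly clustered configurations improbable.
rng = np.random.default_rng(1)
energy = gibbs_energy(rng.uniform(0, 100, size=(50, 2)),
                      potential=lambda d: np.exp(-(d / 5.0) ** 2))
```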