
HONG AND SCHONFELD: MAXIMUM-ENTROPY EXPECTATION-MAXIMIZATION ALGORITHM


The gradient is given by (62).

APPENDIX C

CONCAVE FUNCTION INEQUALITY

Let us consider a monotonically decreasing and concave function f(·). Since the transformed expression is the same monotonically decreasing function of its argument, maximizing the one is equivalent to maximizing the other. Thus, the entropy term can be rewritten in terms of f. The argument of f has a finite range [x_l, x_u], since it is bounded below by x_l and above by x_u.

The function f satisfies a linear lower bound of the form

f(x) ≥ a x + b    (63)

within the range [x_l, x_u], if f(x_l) ≥ a x_l + b and f(x_u) ≥ a x_u + b. It can easily be shown that the function a x + b is convex; on [x_l, x_u] it therefore coincides with the chord through its endpoint values. Meanwhile, the function f is concave; it therefore lies above the chord through f(x_l) and f(x_u) for x ∈ [x_l, x_u]. Under the endpoint conditions, the chord through f(x_l) and f(x_u) dominates the line a x + b over the entire interval. Finally, if f(x_l) ≥ a x_l + b and f(x_u) ≥ a x_u + b, then the conditions required for (63) are satisfied.
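The endpoint argument above lends itself to a quick numerical check. The sketch below is illustrative only: the concave function (log), the interval [0.1, 1], and the linear bound (the chord of log) are choices of ours, not quantities from the paper. It verifies that a concave function dominating a linear function at both endpoints of a finite range dominates it throughout the range, and that violating an endpoint condition voids the guarantee.

```python
import numpy as np

def dominates_on_interval(f, g, lo, hi, n=1001, tol=1e-9):
    """Return True if f(x) >= g(x) (up to tol) on a dense grid over [lo, hi]."""
    x = np.linspace(lo, hi, n)
    return bool(np.all(f(x) >= g(x) - tol))

# Concave f = log on [0.1, 1]; linear g = the chord of log through its
# endpoint values, so the endpoint conditions hold with equality.
lo, hi = 0.1, 1.0
f = np.log
a = (np.log(hi) - np.log(lo)) / (hi - lo)   # chord slope
b = np.log(lo) - a * lo                     # chord intercept
g = lambda x: a * x + b

print(dominates_on_interval(f, g, lo, hi))  # True: concave f lies above its chord

# Raising the line breaks the endpoint conditions, and with them the bound.
g_up = lambda x: a * x + b + 0.1
print(dominates_on_interval(f, g_up, lo, hi))  # False: f < g_up near the endpoints
```

Checking only the two endpoints is what makes the bound cheap to enforce: concavity carries the inequality from the endpoints to the interior for free.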


Hunsop Hong (S’08) received the B.S. and M.S. degrees in electronic engineering from Yonsei University, Seoul, Korea, in 2000 and 2002, respectively. He is currently pursuing the Ph.D. degree at the Department of Electrical and Computer Engineering, University of Illinois at Chicago.

He was a Research Engineer at the Electronics and Telecommunications Research Institute (ETRI), Daejeon, Korea, until 2003. His research interests include image processing and density estimation.

Dan Schonfeld (M’90–SM’05) was born in Westchester, PA, in 1964. He received the B.S. degree in electrical engineering and computer science from the University of California, Berkeley, and the M.S. and Ph.D. degrees in electrical and computer engineering from the Johns Hopkins University, Baltimore, MD, in 1986, 1988, and 1990, respectively.

In 1990, he joined the University of Illinois at Chicago, where he is currently an Associate Professor in the Department of Electrical and Computer Engineering. He has authored over 100 technical papers in various journals and conferences. His current research interests are in signal, image, and video processing; video communications; video retrieval; video networks; image analysis and computer vision; pattern recognition; and genomic signal processing.

Dr. Schonfeld was coauthor of a paper that won the Best Student Paper Award at Visual Communication and Image Processing 2006, and coauthor of a paper that was a finalist for the Best Student Paper Award at Image and Video Communication and Processing 2005. He has served as an Associate Editor of the IEEE TRANSACTIONS ON IMAGE PROCESSING (Nonlinear Filtering) and as an Associate Editor of the IEEE TRANSACTIONS ON SIGNAL PROCESSING (Multidimensional Signal Processing and Multimedia Signal Processing).
