
Derivation of Mixture Distribution and Weighted Likelihood as minimizers of KL-divergence subject to constraints


Date

2005

Authors

Wang, Xiaogang
Zidek, James V.

Publisher

Annals of the Institute of Statistical Mathematics

Abstract

In this article, mixture distributions and weighted likelihoods are derived within an information-theoretic framework and shown to be closely related. This surprising relationship obtains in spite of the arithmetic form of the former and the geometric form of the latter. Mixture distributions are shown to be optima that minimize the entropy loss under certain constraints. The same framework yields the weighted likelihood when the distributions in the mixture are unknown and information from independent samples generated by them has to be used instead. The likelihood weights thus trade bias for precision and yield inferential procedures, such as estimates, that can be more reliable than their classical counterparts.
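The arithmetic/geometric contrast described above can be sketched in a standard weighted Kullback-Leibler setup; the notation below (weights λ_i, component densities f_i, samples x_{ij}, parameter θ) is illustrative, and the paper's exact constraint set may differ. Minimizing the weighted forward divergence over densities g yields the arithmetic mixture, minimizing the reverse direction yields a geometric mixture, and substituting independent samples for the unknown f_i turns the geometric form into a weighted log-likelihood:

\[
\arg\min_{g}\ \sum_{i=1}^{m} \lambda_i\, D\!\left(f_i \,\|\, g\right) \;=\; \sum_{i=1}^{m} \lambda_i f_i,
\qquad \lambda_i \ge 0,\quad \sum_{i=1}^{m} \lambda_i = 1,
\]
\[
\arg\min_{g}\ \sum_{i=1}^{m} \lambda_i\, D\!\left(g \,\|\, f_i\right) \;\propto\; \prod_{i=1}^{m} f_i^{\lambda_i},
\]
\[
\ell_\lambda(\theta) \;=\; \sum_{i=1}^{m} \lambda_i \sum_{j=1}^{n_i} \log f(x_{ij};\theta),
\]
where \(x_{i1},\dots,x_{in_i}\) are independent observations from the \(i\)-th population and \(f(\cdot\,;\theta)\) is the working parametric model.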

Keywords

Euler-Lagrange equations, relative entropy, mixture distributions, weighted likelihood

Citation

Wang, X. and Zidek, J.V. (2005). Derivation of Mixture Distribution and Weighted Likelihood as minimizers of KL-divergence subject to constraints. Annals of the Institute of Statistical Mathematics, 57, 687-701.