Entropy 2005, 7[1], 15-37
Entropy
ISSN 1099-4300
http://www.mdpi.org/entropy/

The entropy of a mixture of probability distributions

Alexis De Vos

Imec v.z.w., Vakgroep voor elektronika en informatiesystemen,
Universiteit Gent, Sint-Pietersnieuwstraat 41, B-9000 Gent, Belgium
E-mail: [email protected]

Received: 13 September 2004 / Accepted: 20 January 2005 / Published: 20 January 2005

Abstract: If a message can have n different values and all values are equally probable, then the entropy of the message is $\log(n)$. In the present paper, we investigate the expectation value of the entropy for an arbitrary probability distribution. For that purpose, we apply mixed probability distributions. The mixing distribution is represented by a point on an infinite-dimensional hypersphere in Hilbert space. During an 'arbitrary' calculation, this mixing distribution tends to become uniform over a flat probability space of ever-decreasing dimensionality. Once such a smeared-out mixing distribution is established, subsequent computing steps introduce an entropy loss expected to equal $\frac{1}{m+1} + \frac{1}{m+2} + \cdots + \frac{1}{n}$, where n is the number of possible inputs and m the number of possible outcomes of the computation.
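As a quick illustration of the closing formula (not part of the original paper), the Python sketch below evaluates the partial harmonic sum $\frac{1}{m+1} + \cdots + \frac{1}{n}$ for a hypothetical computation; the values n = 1024 and m = 16 are arbitrary choices for illustration. For large n and m the sum approaches ln(n/m), the entropy gap between uniform distributions over n and over m values, consistent with the $\log(n)$ baseline quoted at the start of the abstract.

    import math

    def expected_entropy_loss(n: int, m: int) -> float:
        # Partial harmonic sum 1/(m+1) + 1/(m+2) + ... + 1/n:
        # the expected entropy loss quoted in the abstract
        # (natural-log units, i.e. nats).
        return sum(1.0 / k for k in range(m + 1, n + 1))

    n, m = 1024, 16          # hypothetical input/outcome counts
    loss = expected_entropy_loss(n, m)
    print(f"expected loss = {loss:.4f} nats")
    print(f"ln(n/m)       = {math.log(n / m):.4f}")  # large-n,m approximation

Running this prints an expected loss of about 4.13 nats against ln(64) = 4.16, showing how quickly the partial harmonic sum approaches its logarithmic limit.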

Keywords: probability distribution; mixture distribution; Bhattacharyya space; Hilbert space.

PACS codes: 02.50


© 2005 by MDPI (http://www.mdpi.org). Reproduction for noncommercial purposes permitted.