Entropy 2001, 3(1), 1-11
Entropy
ISSN 1099-4300
http://www.mdpi.org/entropy/

Some Observations on the Concepts of Information-Theoretic Entropy and Randomness

Jonathan D.H. Smith

Department of Mathematics, Iowa State University, Ames, IA 50011, USA.
E-mail: [email protected]
URL: http://www.math.iastate.edu/jdhsmith/

Received: 15 February 2000 / Accepted: 11 January 2001 / Published: 1 February 2001

Abstract: Certain aspects of the history, derivation, and physical application of the information-theoretic entropy concept are discussed. Pre-dating Shannon, the concept is traced back to Pauli. A derivation from first principles is given, without use of approximations. The concept depends on the underlying degree of randomness; in physical applications, this translates to a dependence on the experimental apparatus available. An example illustrates how this dependence affects Prigogine's proposal to use the Second Law of Thermodynamics as a selection principle for the breaking of time symmetry. The dependence also yields a resolution of the so-called "Gibbs Paradox." Extension of the concept from the discrete to the continuous case is discussed. The usual extension is shown to be dimensionally incorrect; correcting it introduces a reference density, leading to the concept of Kullback entropy. Practical relativistic considerations suggest a possible proper reference density.
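(A minimal sketch of the dimensional point summarized above, in notation of our own choosing rather than the paper's: $p_i$ denotes a discrete probability distribution, $p(x)$ a probability density, and $m(x)$ a reference density.)

    H = -\sum_{i} p_i \log p_i                    % discrete Shannon entropy

    H = -\int p(x) \log p(x) \, dx                % naive continuous extension

Because a density $p(x)$ carries the dimensions of $1/x$, $\log p(x)$ is not the logarithm of a pure number, so the naive extension is dimensionally ill-defined. Dividing by a reference density of the same dimensions restores a dimensionless argument:

    H = -\int p(x) \log\frac{p(x)}{m(x)} \, dx    % Kullback (relative) form, up to sign convention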

Keywords: information-theoretic entropy; Shannon entropy; Martin-Löf randomness; self-delimiting algorithmic complexity; thermodynamic entropy; second law of thermodynamics; selection principle; wave equation; Gibbs paradox; dimensional analysis; Kullback entropy; cross-entropy; reference density; improper prior.


© 2001 by MDPI (http://www.mdpi.org). Reproduction for noncommercial purposes permitted.