Entropy 2002, 4(1), 32-34
Entropy
ISSN 1099-4300
http://www.mdpi.org/entropy/

Communication

Comments on "A Law of Information Growth"

Richard L. Coren

Emeritus Professor, Electrical and Computer Engineering Department, Drexel University, Philadelphia, PA 19104, USA.
Tel: 215-646-1024, FAX: 215-646-5523, E-mail: [email protected]

Received: 11 January 2002 / Accepted: 11 January 2002 / Published: 30 January 2002

Abstract: Notes added in proof to an earlier paper on an empirical "law" of information growth in evolution, and in response to questions raised therein.

Keywords: information growth; system stability; evolution.



A recent paper [1] in this journal presented evidence for a systematic growth of information in the history of the cosmos, of life on earth, and of the technology developed there. This evidence is empirical, guided by a cybernetic model of evolution that has been successfully applied to several other physical and sociological phenomena [2]. Because the scale of the data offered is so great, and because the relation between information and entropy is so close [3], the question was raised of whether this growth is an unrecognized aspect of the second law of thermodynamics. In particular, it was asked "whether a theoretical foundation can be found for this phenomenology...."

That paper also points out that:

"From the Second Law of Thermodynamics, we know that the increase of entropy is a consequence of the dynamics of system change, not their source, although we often use it as such. However, 'Processes that generate order are in no sense driven by the growth of entropy [4].' The relation between entropy and information means that this is true of the latter as well as the former." This implies that indirect mechanisms must be involved in the information development described.

In this regard, Schneider and Kay [5] discuss the behavior of systems in dynamic equilibrium under a set of external fluxes and forces. When these influences change, the systems react in ways that resist being moved from their equilibrium. If a threshold is crossed, they may develop a new state to oppose further movement from equilibrium. A common example is the Bénard cell, a container of fluid heated from below. Heat rises through the liquid by conduction, i.e., molecular collisions. If the temperature gradient and heat flux increase too much, the cell suddenly displays an emergent, coherent organization. This involves a switch to heat flow by convection, in which columns of liquid move up, carrying the heat, and then, cooled, return to the base. It is a more ordered state, with lower configurational entropy, but the greater heat flow entails an entropy increase that more than compensates.
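As a schematic piece of entropy bookkeeping (the symbols J, T_h, and T_c are notation introduced here for illustration, not part of the original discussion): a steady heat current J carried from a hot plate at temperature T_h to a cold plate at T_c produces entropy in the surroundings at the rate

\sigma = J \left( \frac{1}{T_c} - \frac{1}{T_h} \right) \ge 0.

The onset of convection sharply increases J, and hence \sigma, by more than enough to offset the one-time decrease in the fluid's configurational entropy, so the second law is satisfied even as the cell becomes more ordered.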

Consider an even simpler system: a gas of hydrogen and oxygen that can be completely described by stating that there are twice as many hydrogen (H) atoms as oxygen (O) atoms and that each atom occupies an equivalent volume. This is the greatest entropy state of the system, since its components are uniformly distributed. If the gas is cooled, these atoms combine to form water molecules (H-O-H). A new description must include information about the chemical bonds, e.g., their angles, how the molecules rotate about those bonds, and how the molecules twist and vibrate when they interact. Further cooling produces ice, where the molecules become more firmly interconnected. Now the behavior is different and more complex. Its description must include the crystalline structure with its fluctuations and imperfections, the crystalline and molecular electronic states, the phonon spectrum, etc.
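To make the "greatest entropy" claim concrete, a back-of-envelope illustration (this figure is computed here and does not appear in the original): for the stated 2:1 composition, the per-atom configurational (Shannon) entropy of the uniformly mixed gas is

\frac{S}{k_B} = -\tfrac{2}{3}\ln\tfrac{2}{3} - \tfrac{1}{3}\ln\tfrac{1}{3} = \ln 3 - \tfrac{2}{3}\ln 2 \approx 0.64 \text{ nats} \; (\approx 0.92 \text{ bits}),

the largest value compatible with that composition; any bonding or ordering that correlates the atoms' positions can only lower this configurational term.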

Each change resulted in a more ordered state with lower entropy than its predecessor. All the changes liberated heat (of reaction, condensation, or fusion), so that the net entropy of the local universe increased. In addition, each successive condition required more information to formulate its state of being as its internal entropy decreased. This information is the difference between the maximum entropy and that of the ordered state [3]. The same conclusion can be drawn for the ordered states of formation of the chemical elements after the Big Bang, of the stars and galaxies, of the Bénard cell, and of the living cell. We expect, therefore, that the totality of changes of ordered systems under the influence of changing environmental forces is toward greater order and information.
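In symbols (S_max and S_ordered are notation introduced here to render the statement above), the information content of an ordered state is

I = S_{max} - S_{ordered},

so each successive ordering, from gas to water to ice, lowers S_ordered and thereby raises I, even while the heat released at each step increases the entropy of the local universe.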

Schneider and Kay extend these considerations to speciation: "When a new living system is generated after the demise of an earlier one, it would make the self-organization process more efficient if it were constrained to variations which have a high probability of success. Genes play this role in constraining the self-organization process to those options which have a high probability of success.... This is the role of the gene and, at a larger scale, biodiversity: to act as information databases of self-organization strategies that work."

Once genes are involved, the nature of order/information change can take different forms, leading to vast complexification of structure and, ultimately, of a neural net and brain, as illustrated by those events on the information trajectory of reference [1]. For example, there seems to have been an impetus for the evolutionary development of information communication: first through the appearance of coherent speech, even at the cost of some physical disadvantage [6], and then through various inventive technologies.

We are the conscious originators of the last of these, the technological, informational inventions on the information trajectory of reference [1]. We can therefore appreciate the motivation they represent for information expansion, even as we inquire about its connection to the law of entropy.

References

[1] Coren, R. L. Empirical Evidence for a Law of Information Growth. Entropy 2001, 3, 259-273. <http://www.mdpi.org/entropy/papers/e3040259.pdf>
[2] Coren, R. L. The Evolutionary Trajectory: The Growth of Information in the History and Future of Earth; Gordon and Breach Publishers, 1998.
[3] Machta, J. Entropy, Information, and Computation. Am. J. Phys. 1999, 67 (12), 1074-1077.
[4] Layzer, D. In Entropy, Information, and Evolution; Weber, B. H., Depew, D. J., Eds.; MIT Press, 1990; p. 23.
[5] Schneider, E. D.; Kay, J. J. In What is Life? The Next Fifty Years; Murphy, M. P., O'Neill, L., Eds.; Cambridge Univ. Press, 1995; p. 161.
[6] (a) Lieberman, P. The Biology and Evolution of Language; Harvard Univ. Press, 1984. (b) Lieberman, P. On Human Speech, Syntax, and Language. Human Evolution 1988, 3, 3-18.


© 2002 by MDPI (http://www.mdpi.org). Reproduction for noncommercial purposes permitted.