International Journal of
**Molecular Sciences**
**ISSN 1422-0067**
**© 2001 by MDPI**
**www.mdpi.org/ijms/**

**The Nature of the Chemical Process.
1. Symmetry Evolution - Revised Information Theory, Similarity Principle and Ugly Symmetry**

**Shu-Kun Lin**

Molecular Diversity Preservation International (MDPI), Sangergasse 25, Basel CH-4054 Switzerland. Tel. ++41 79 322 3379, Fax: ++41 61 302 8918, E-mail: lin@mdpi.org, http://www.mdpi.org/lin

*Received: 16 December 1999 / Accepted: 15 March 2001 / Published:
25 March 2001*

**Key words:** continuous symmetry, static entropy, similarity principle, symmetry principle, the second law of information theory, structural stability, complementarity, order, disorder

**1. Introduction**

Symmetry has mainly been regarded as a mathematical attribute [1-3]. The Curie-Rosen symmetry principle [2] is a higher symmetry-higher stability relation that has seldom, if ever, been accepted for considering structural stability and process spontaneity (or process irreversibility). Most people accept the higher symmetry-lower entropy relation instead, because entropy is a degree of disorder and symmetry has been erroneously regarded as order [4]. To prove the symmetry principle, it is necessary to revise information theory, in which the second law of thermodynamics becomes a special case of the second law of information theory.

Many authors have realized that, to investigate the processes involving molecular self-organization and molecular recognition in chemistry and molecular biology, and to make a breakthrough in solving the outstanding problems in physics involving critical phenomena and the spontaneity of symmetry breaking processes [5], it is necessary to consider information and its conversion, in addition to matter, energy and their conversions [6]. It is also of direct significance to substantially modify information theory: three laws of information theory will be given, and the similarity principle (entropy increases monotonically with the similarity of the considered property among the components (figure 1) [7]) will be proved.

**2. Definitions**

*2.1. Symmetry and Nonsymmetry*

Symmetry as a Greek word means *same measure* [1]. In other words, it is a measure of indistinguishability. A number *w*_{S} can be used to denote the measure of the indistinguishability and can be called the symmetry number [7d]. In some cases it is the number of invariant transformations. Clausius proposed to name the quantity *S* entropy, from the Greek word for transformation [9].

Only perfect symmetry can be described mathematically by group theory [2]. Perfect symmetry is a special case of imperfect or continuous symmetry, which can be measured by similarity instead of indistinguishability [7d].

**Figure 2.** Our earth is not of perfect symmetry.

On the contrary, we define nonsymmetry as a measure of difference or distinguishability. Similarity can be defined as a continuous index of imperfect symmetry between the two limits: distinguishability (the lowest similarity) or nonsymmetry and indistinguishability (the highest similarity) or symmetry.

*2.2. Entropy and Information*

In statistical mechanics, entropy is a function of the distribution of the energy levels. The entropy concept in information theory is much more general. However, information theory, which has been used mainly in communication and computer science [10,11], is about the *process* of the communication channel. We will revise it to become a theory regarding the *structure* of a considered system. A process will be considered as a series of structures. Instead of defining an *ensemble* (see section 2.1 of ref. [10]), we directly define the *macroscopic structure* with regard to a certain kind of property *X* of a considered system as a triple (*x*, *M*_{X}, *P*_{X}), where *M*_{X} = {*m*_{1}, *m*_{2}, ..., *m*_{w}} is the set of the *w* microstates and *P*_{X} = {*p*_{1}, *p*_{2}, ..., *p*_{w}} is the set of their probabilities.

**Figure 3.** Dynamic motion of a spin-up and spin-down binary system. (a) A microstate: a typical microstate is a freeze-frame of an array of spins undergoing up-and-down tumbling. (b) The mixing of all *w* microstates (here 729) defines a symmetric macroscopic state.

The entropy of this macroscopic structure is

$$ S = -\sum_{i=1}^{w} p_i \ln p_i \qquad (1) $$

with the understanding that $p_i \ln p_i = 0$ for $p_i = 0$ and that $\sum_{i=1}^{w} p_i = 1$. Because $0 \le p_i \le 1$ and hence $\ln p_i \le 0$, entropy is non-negative ($S \ge 0$). If the *w* microstates have the *same* value of the considered property, hence the *same* probability ($p_i = 1/w$), then

$$ S = \ln w \qquad (2) $$

The maximal entropy is denoted as *L*, because

$$ S = -\sum_{i=1}^{w} p_i \ln p_i \le \ln w = L \qquad (3) $$

(This is the Gibbs inequality; see [7d] and the relevant citations.) Entropy is a logarithmic function of *w*_{S}, the (apparent) symmetry number, i.e., the apparent number of microstates of indistinguishable property (the apparent number of *equivalent* microstates which are of the *same* value of the considered property [7d]):

$$ S = \ln w_S \qquad (4) $$

Equation 4 is the familiar Boltzmann entropy expression. Combining equations 1 and 4 leads to

$$ \ln w_S = -\sum_{i=1}^{w} p_i \ln p_i \qquad (5) $$
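As a minimal numerical sketch of equations 1-5 (an illustration added here, not part of the original paper), the following Python snippet computes the entropy of a four-microstate system in nats and checks the Gibbs inequality; the two distributions are arbitrary illustrative values:

```python
import math

def entropy(p):
    """Shannon entropy (equation 1) in nats; p*ln(p) is taken as 0 when p = 0."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

w = 4
uniform = [1.0 / w] * w           # all microstates equally probable (p_i = 1/w)
skewed  = [0.7, 0.1, 0.1, 0.1]    # one microstate dominates

S_max = entropy(uniform)          # equals ln(w) = L (equations 2 and 3)
print(S_max, math.log(w))         # 1.3862..., 1.3862...
print(entropy(skewed) <= S_max)   # True: the Gibbs inequality (equation 3)
```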

Obviously

$$ 0 \le S \le L \qquad (6) $$

$$ L = S + I \qquad (7) $$

*L* is mnemonic for Logarithmic function or the "Largest possible value" of either the logarithmic function *S* (equation 2) or *I*:

$$ 0 \le I \le L \qquad (8) $$

Then, entropy is expressed as information loss [12]

$$ S = L - I \qquad (9) $$

or, in certain cases when the absolute values are unknown,

$$ \Delta S = -\Delta I \qquad (10) $$

for a change between two structures. Correspondingly, the information

$$ I = L - S = \ln w - \ln w_S \qquad (11) $$

can be given, and

$$ I = \ln w_N \qquad (12) $$

where *w*_{N} is called the nonsymmetry number, the number of microstates of distinguishable property, whereas *w*_{S} in equation 4 is called the symmetry number (vide infra, section 6).

The logarithmic functions *S*, *I* and
*L* are all nonnegative in value, dimensionless and they are all macroscopic
properties.
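As a small numerical check of this bookkeeping (an added illustration, assuming a hypothetical system whose microstate count factors as $w = w_S \cdot w_N$; see equation 26, vide infra):

```python
import math

w   = 64          # total number of microstates (maximum symmetry number)
w_S = 16          # symmetry number: microstates of indistinguishable property
w_N = w // w_S    # nonsymmetry number, assuming w = w_S * w_N (equation 26)

L = math.log(w)    # maximum entropy (equation 8)
S = math.log(w_S)  # entropy (equation 4)
I = math.log(w_N)  # information (equation 12)

print(math.isclose(L, S + I))  # True: L = S + I (equations 7 and 9)
```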

From equation 9 and the fact that in practice information can be recorded only in a static structure, we may define a static entropy [7d]. A macroscopic static structure is described by a set of *w* microstates, which are the *w* possible rearrangements or *w* possible transformations (recall the operations in the group theory treatment of a static structure, which can be either an individual molecule or an assemblage) [2,8].

In the entropy and information expressions, the unit is the nat (natural logarithmic unit). For a binary system, the unit is the bit. Normally there is a positive constant in these expressions (e.g., the Boltzmann constant *k*_{B} in thermodynamics); here we set the constant to 1. In thermodynamics, we may denote the traditionally defined thermodynamic entropy as

$$ S_{th} = k_B \ln w \qquad (13) $$

and

$$ F = E - T S_{th} \qquad (14) $$

where *E* is the total energy, *T* the temperature, and *F* the Helmholtz potential.

*2.3. Labeling and Formatting*

In this paper similarity and its two limits (distinguishability
and indistinguishability) will be frequently considered. Therefore, similarity
and its significance should be clearly defined. Entropy or information
(symmetry or nonsymmetry, vide infra) can be defined regarding a property
*X*. Suppose there are *n* kinds of property *X*, *Y*,
*Z*, ..., etc. which are independent. In order to enumerate the number
of possible microstates regarding *X*, particularly in the cases of
symmetry study, where we may encounter the real problem of indistinguishability,
labeling with some of the other *n*-1 kinds of property *Y*,
*Z*, ..., etc. is necessary. Microscopically the components in the
considered system may appear as several kinds of property *X*, *Y*,
*Z*, ..., etc. If only one kind of property is detected by the instrument
(e.g., a chemical sensor to detect the existence of certain gas molecule)
or by a machine (e.g., a heat engine which detects the average kinetic
energy of the gas molecules [6c]), the others can be used for labeling
purposes to distinguish the otherwise indistinguishable individual components.
We have the following postulate (or definition) of labeling:

*Labeling: the use of one or more of the other independent properties to distinguish the otherwise indistinguishable components does not influence the entropy (the indistinguishability) regarding the considered property X.* (15)

*Example 5.* Suppose information is recorded by a set of individual items, for example, the symbols 0 and 1 in a binary system used in computer science. The color, the font size, or the position index (1, 2, ..., 7) does not influence the indistinguishability of a string of seven zeros: 0000000. Therefore these properties can be used for labeling.

*Example 6.* DNA is the genetic substance
due to the distinguishability of the four organic bases (two pyrimidines
- cytosine and thymine, and two purines -
adenine and guanine). This kind of distinguishability does not influence
the symmetric backbone structure (the periodicity of the sugar-phosphate
backbone) along the axis that can be detected by X-ray diffraction [13].
Therefore the different organic bases can be used for labeling.

*Example 7.* The information retrieval problem.
The distinguishable ID number or phone number belonging to an individual
person does not influence the similarity of characters of the individuals
or the indistinguishability of family names or given names.

*Example 8.* For an *ideal* gas model
in thermodynamics, all the monatomic ideal gases are the same (e.g., He
and Ar). All the diatomic ideal gases are also the same. The difference
(e.g., nitrogen N_{2} and oxygen O_{2} gases, or the ordinary
water H_{2}O and deuterated water D_{2}O) can be regarded
solely as labeling which does not influence the indistinguishability of
the gases the heat engine experiences [6c].

*Theorem: To a heat engine, all the ideal gases
are indistinguishable.*

All ideal gases have the following state equation as detected in a heat engine:

$$ PV = nRT \qquad (16) $$

where *P* is the pressure, *V* the volume, *n* the amount of gas, *R* the gas constant, and *T* the temperature. The definition of labeling is very important for a final resolution of the Gibbs paradox of entropy of mixing (see section 8.1.2).

**3. The Three Laws and the Stability Criteria**

Parallel to the first and the second laws of thermodynamics, we have:

*The first law of information theory:* the logarithmic function *L* (*L* = *S* + *I*) of an isolated system remains unchanged.

*The second law of information theory:* the information *I* of an isolated system decreases to a minimum at equilibrium.

For a system that is not isolated, we may define

universe = system + surroundings (17)

and treat the universe formally as an isolated system (actually we have always done this in thermodynamics, following Clausius [9]). Then, these two laws are expressed as follows: *the function L of the universe is a constant; the entropy S of the universe tends toward a maximum.* Therefore, the second law of information theory can be used as the criterion of structural stability and process spontaneity (or process irreversibility) in all cases, whether the systems are isolated or not. If the entropy of system + surroundings increases from structure A to structure B, B is more stable than A. The higher the value of the entropy for the final structure, the more spontaneous (or more irreversible) the process will be. For an isolated system the surroundings remain unchanged.

At this point we may recall the definition of equilibrium. Equilibrium means the state of indistinguishability (a symmetry, the highest similarity) [7d], e.g., as shown in figure 5, the thermal equilibrium between parts A and B means that their temperatures are the same (See also the so-called zeroth law of thermodynamics [14]).

We have two reasons to revise information theory.
Firstly, the entropy concept has been confusing in information theory [15]
which can be illustrated by von Neumann’s private communication with Shannon
regarding the terms entropy and information (Shannon told the following
story behind the choice of the term "entropy" in his information theory:
"My greatest concern was what to call it. I thought of calling it ‘uncertainty’.
When I discussed it with John von Neumann, he had a better idea: ‘You should
call it entropy, for two reasons. In the first place your uncertainty function
has been used in statistical mechanics under that name, so it already has
a name. In the second place, and more important, no one knows what entropy
really is, so in a debate you will always have the advantage’ " [15a]).
Many authors use the two concepts entropy and information interchangeably
(see also ref. 7d and citations therein). Therefore a meaningful discussion
on the conversion between the logarithmic functions *S* and *I*
is impossible according to the old information theory. Secondly, to the
present author's knowledge, none of several versions of information theory
has been applied to physics to characterize structural stability. For example,
Jaynes' information theory [16] should have been readily applicable to
chemical physics but only a parameter similar to temperature is discussed
to deduce that at the highest possible value of that temperature-like parameter,
entropy is the maximum. Jaynes' information theory or the so-called maximal
entropy principle has been useful for statistics and data reduction (See
the papers presented at the annual conference MaxEnt). Brillouin's negentropy
concept [17] has also never been used to characterize structural stability.

To be complete, let us add the third law here also:

*The third law of information theory:* for a perfect symmetric static structure, the information is zero and the static entropy is at the maximum.

The second law of thermodynamics might be regarded
as a special case of the second law of information theory because thermodynamics
treats only the energy levels as the considered property *X* and only
the dynamic aspects. In thermodynamics, the symmetry is the energy degeneracy
[18,19]. We can consider spin orientations and molecular orientations or
chirality as relevant properties which can be used to define a static entropy.
What kinds of property besides the energy level and their similarities
are relevant to structural stability? Are they the so-called observables
[18] in physics? Should these properties be related to energy (i.e., the
entropy is related to energy by temperature-like intensive parameters as
differential equations where Jaynes' maximal entropy principle might be
useful) so that a temperature (either positive or negative [7a]) can be
defined? These problems should be addressed very carefully in future studies.

As with the laws of thermodynamics, the validity of the three laws of information theory can only be supported by experimental findings. It is worth remembering that the thermodynamic laws are actually postulates, because they cannot be mathematically proved.

**4. Similarity Principle and Its Proof**

Traditionally symmetry is regarded as a discrete or a "yes-or-no" property. According to Gibbs, the properties are either the same (indistinguishability) or not the same (figure 1a). A continuous
measure of static symmetry has been elegantly discussed by Avnir and coworkers
[20]. Because the maximum similarity is indistinguishability (the sameness)
and the minimum is distinguishability, corresponding to the symmetry and
nonsymmetry, respectively, naturally similarity can be used as a continuous
measure of symmetry (section 2.1). The similarity refers to the considered
property *X* of the components (see the definition of labeling in
section 2.3) which affects the similarity of the probability values of
all the *w* microstates and eventually the value of entropy (equation
1).

The Gibbs paradox statement [21] is a higher similarity-lower entropy relation (figure 1a), which has been accepted in almost all the standard textbooks of statistical mechanics and thermodynamics. The resolution of the Gibbs paradox of the entropy-similarity relation has been very controversial; some recent debates are listed in reference [21]. The von Neumann continuous relation of similarity and entropy (the higher the similarity among the components is, the lower the value of entropy will be, according to his resolution of the Gibbs paradox) is shown in figure 1b [21j]. Because neither Gibbs' discontinuous higher similarity-lower entropy relation (where symmetry is regarded as a discrete or "yes-or-no" property) nor von Neumann's continuous relation has been proved, they can at most be regarded as postulates or assumptions. Based on all the observed experimental facts [7d], we must abandon their postulates and accept the following postulate as the most plausible one (figure 1c):

*Similarity principle:* entropy increases monotonically with the similarity of the considered property among the components.

The Gibbs inequality

$$ -\sum_{i=1}^{w} p_i \ln p_i \le \ln w \qquad (18) $$

has been proved in geometry (see the citations in [7d]). As $p_i = 1/w$ represents the maximum similarity among the considered *w* microstates [18], the general expression of entropy (equation 1) must increase continuously with the increase in the property similarity among the *w* microstates; the maximum value $S = \ln w$ in equation 2 corresponds to the highest similarity. Finally, based on the second law of the revised information theory regarding structural stability, which says that *the entropy of an isolated system (or system + environment) either remains unchanged or increases*, the similarity principle has been proved.
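A minimal numerical illustration (added here, not part of the original paper): interpolating a probability distribution from a skewed one toward the uniform one (the maximum similarity of the microstates), the entropy of equation 1 grows monotonically, as the similarity principle states. The distributions are arbitrary illustrative values:

```python
import math

def entropy(p):
    """Shannon entropy (equation 1) in nats."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

skewed  = [0.85, 0.05, 0.05, 0.05]   # low similarity among microstates
uniform = [0.25, 0.25, 0.25, 0.25]   # maximum similarity (p_i = 1/w)

# Entropy increases monotonically along the path toward uniformity,
# because entropy is concave and is maximized at the uniform distribution.
for t in (0.0, 0.25, 0.5, 0.75, 1.0):
    p = [(1 - t) * a + t * b for a, b in zip(skewed, uniform)]
    print(round(t, 2), round(entropy(p), 4))
```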

Following the convention of defining similarities [7d,7e] as an index in the range [0,1], we may simply define

$$ Z = \frac{S}{L} \qquad (19) $$

as a similarity index (figure 1c). As mentioned above, *Z* can be properly defined in many different expressions, and the relation that entropy increases continuously with the increase in the similarity will still be valid (however, the relation will not necessarily be a straight line if the similarity is defined in another way). For example, the pairwise similarities collected in a table

(20)

can be used to define a similarity value of a system of *N* kinds of molecule [7c,7e].

**5. Curie-Rosen Symmetry Principle and Its Proof**

It is straightforward to prove the higher symmetry-higher stability relation (the Curie-Rosen symmetry principle [2]) as a special case of the similarity principle. Because higher similarity is correlated with a higher degree of symmetry, the similarity principle also implies that entropy can be used to measure the degree of symmetry. We can conclude: *the higher the symmetry (indistinguishability) of the structure is, the higher the value of entropy will be.* From the second law of information theory, the higher symmetry-higher stability relation is then proved.

The proof of the higher symmetry-higher entropy relation can also be performed by contradiction. A higher symmetry number-lower entropy relation can be found in many textbooks (see citations in [7d] and p. 596 of [19]), where the existence of a symmetry would result in a decrease in the entropy value:

$$ S' = \ln\frac{w}{\sigma} = \ln w - \ln\sigma \qquad (21) $$

where $\sigma$ denotes the symmetry number and $\sigma \ge 1$. Let the entropy change from *S'* to *S* be

$$ \Delta S = S - S' \qquad (22) $$

Then the change in symmetry number would be a factor

$$ \frac{\sigma_2}{\sigma_1} = e^{-\Delta S}, \qquad (23) $$

and

$$ \Delta S = -\ln\frac{\sigma_2}{\sigma_1} = -(\ln\sigma_2 - \ln\sigma_1) \qquad (24) $$

where $\sigma_1$ and $\sigma_2$ are the two symmetry numbers. This leads to an entropy expression $S = -\ln\sigma$. However, because any structure has a symmetry number $\sigma \ge 1$, entropy would then be negative, which contradicts the definition that entropy is always non-negative. Therefore neither $S' = \ln w - \ln\sigma$ nor $S = -\ln\sigma$ is valid. The correct form should be equation 4 ($S = \ln w_S$, with $w_S = \sigma$). In combination with the second law of information theory, the higher symmetry-higher stability relation (the symmetry principle) is also proved.
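A tiny numerical check of the contradiction described above (an added illustration, using benzene's rotational symmetry number $\sigma = 12$ as the example value):

```python
import math

sigma = 12  # e.g., the rotational symmetry number of benzene

S_textbook = -math.log(sigma)  # the conventional -ln(sigma) correction
S_revised  =  math.log(sigma)  # equation 4 with w_S = sigma

print(S_textbook)       # -2.4849...: negative, contradicting S >= 0
print(S_revised >= 0)   # True: the form argued for in the text
```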

Rosen discussed several forms of the symmetry principle [2]. Curie's causality form of the symmetry principle is that *the effects are more symmetric than the causes*. The higher symmetry-higher stability relation has been clearly expressed by Rosen [2].

Because entropy as defined in the revised information theory is broader, symmetry can include both static (the third law of information theory) and dynamic symmetries. Therefore, we can predict the symmetry evolution when a fluid system gradually becomes a static system as the temperature is reduced: the most symmetric static structure will be preferred (vide infra).

**6. A Comparison: Information Minimization and
Potential Energy Minimization**

A complete structural characterization requires the evaluation of both the degree of symmetry (or indistinguishability) and the degree of nonsymmetry (or distinguishability). Based on the above discussion, we can define entropy as the logarithmic function of the symmetry number *w*_{S} in equation 4. Similarly, information is the logarithmic function of the number of microstates of distinguishable property, the nonsymmetry number *w*_{N} (equation 12).

Other authors have briefly discussed the higher symmetry-higher entropy relation previously (see the citation in [2]). Rosen suggested that a possible scheme for symmetry quantification is to take for the degree of symmetry of a system the order of its symmetry group (or its logarithm) (p. 87, reference [2a]). The degree of symmetry of a considered system can be considered in the following increasingly simplified manner: in the language of group theory, the group corresponding to *L* is a direct product of the two groups corresponding to the values of the entropy *S* and the information *I*:

$$ G_L = G_S \otimes G_I \qquad (25) $$

where $G_S$ is the group representing the observed symmetry of the system, $G_I$ represents the nonsymmetric (distinguishable) part that potentially can become symmetric according to the second law of information theory, and $G_L$ is the group representing the maximum symmetry. In the language of the numbers of microstates,

$$ w = w_S \cdot w_N \qquad (26) $$

where the three numbers are called the maximum symmetry number, the symmetry number, and the nonsymmetry number, respectively. These numbers of microstates could be the orders of the three groups if they are symmetry groups of finite order. However, there are many groups of infinite order, such as those of rotational symmetry, which should be considered in detail in further studies.

The behavior of the logarithmic functions *L*, *S* and *I* and their relation

$$ L = S + I \qquad (7) $$

can be compared with that of the total energy *E*, the kinetic energy *K* and the potential energy *P*, which are the eigenvalues of the operators *H*, *K* and *P* in the Hamiltonian, expressed conventionally in either classical mechanics or quantum mechanics as two parts:

$$ H = K + P \qquad (27) $$

The energy conservation law and the energy minimization law regarding a spontaneous (or irreversible) process in physics (in thermodynamics they are the first and the second laws of thermodynamics) say that

$$ \Delta E = \Delta(K + P) = 0 \qquad (28) $$

(e.g., for a linear harmonic oscillator, the sum of the potential energy and the kinetic energy remains unchanged) and that $\Delta P \le 0$ for an isolated system (or system + environment). In thermodynamics, the Gibbs free energy *G* or the Helmholtz potential *F* (equation 14) is such a kind of potential energy. It is well known that the minimization of the Helmholtz potential or the Gibbs free energy is an alternative expression of the second law of thermodynamics. Similarly,

$$ \Delta L = \Delta(S + I) = 0 \qquad (29) $$

For a spontaneous process of an isolated system, $\Delta L = 0$ (the first law of information theory) and $\Delta I \le 0$ (the second law of information theory, or the minimization of the degree of nonsymmetry), or equivalently $\Delta S \ge 0$ (the maximization of the degree of symmetry). For an isolated system,

$$ \Delta S = -\Delta I \ge 0 \qquad (30) $$

For systems that cannot be isolated, spontaneous processes with either $\Delta S < 0$ or $\Delta I > 0$ for the system are possible, provided that

$$ \Delta S_{\text{universe}} \ge 0 \qquad (31) $$

for the universe ($\Delta S_{\text{universe}} = 0$ for a reversible process; processes with $\Delta S_{\text{universe}} < 0$ are impossible). The maximum symmetry number, the symmetry number and the nonsymmetry number can be calculated as the exponentials of the maximum entropy, the entropy and the information, respectively. These relations will be illustrated by some examples and will be studied in more detail in the future.
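A compact sketch of the bookkeeping in equations 29-31 (an added illustration with made-up entropy changes, in nats) for a non-isolated system that exchanges entropy with its surroundings:

```python
# Hypothetical entropy changes (in nats) for a process of a non-isolated system.
dS_system       = -0.3   # the system's entropy may decrease (dI_system = +0.3)
dS_surroundings = +0.5   # provided the surroundings compensate

dS_universe = dS_system + dS_surroundings
print(dS_universe >= 0)  # True: the process is allowed (equation 31)
print(dS_universe == 0)  # False: it is irreversible rather than reversible
```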

The revised information theory provides a new approach to understanding the nature of energy (or energy-matter) conservation and conversion. For example the available potential energy due to distinguishability or nonsymmetry can be calculated. This can be illustrated in a system undergoing spontaneous mass or heat transfer between two parts of a chamber (figure 5). The distinguishability or nonsymmetry is the cause of the phenomena. Many processes are irreversible because a process does not proceed from a symmetric structure to a nonsymmetric structure.

From the second law of information theory, any system with interactions among its components will evolve towards an equilibrium with the minimum information (or maximum symmetry). This general principle leads to the criteria of equilibrium for many kinds of system. For example, in a mechanical system the equilibrium is the balance of forces and the balance of torques.

We may consider symmetry evolution generally in two steps. Step one: bring several components (e.g., parts A and B in figure 5) to the vicinity as individually isolated parts. We may treat this step as if these parts have no interaction. Step two: let these parts interact by removing electromagnetic insulation, thermal insulation, or mechanical barrier, etc. (e.g., the right side of figure 5).

For step one, a similarity analysis will be satisfactory. For the example shown in figure 5, we check whether the two parts A and B are of the same temperature (if they are, there will be no heat transfer), the same pressure (if they are, there will be no mass transfer) and the same substances (if they are, there will be no chemical reactions).

We may calculate the total logarithmic function *L*_{total}, the total entropy *S*_{total} and the total information *I*_{total} of an isolated system of many parts for the first step. In the same way, we may calculate a system of complicated, hierarchical structures. Suppose there are *M* hierarchical levels (*i* = 1, 2, ..., *M*; e.g., one level is a galaxy, the other levels are the solar system, a planet, a box of gas, a molecule, an atom, electronic motion and nuclear motion inside the atom, subatomic structures, etc.) and *N* parts (*j* = 1, 2, ..., *N*; e.g., different cells in a crystal, individual molecules, atoms or electrons, or spatially different locations). The total entropy should be greater than the sum over the individual parts at all the hierarchical levels,

$$ S_{\text{total}} > \sum_{i=1}^{M} \sum_{j=1}^{N} S_{ij} \qquad (32) $$

because any assembling and any interaction (coupling, etc.) among the parts, or any interaction between the different hierarchical levels, will lead to information loss and entropy increase. Many terms of entropy due to interaction should be included in the entropy expression.

*Example 11.* Interaction of the two complementary strands of a polymer to form DNA. The combination leads to a more stable structure. The system of these two components (part 1 and part 2) has an entropy greater than the sum of those of the individual parts ($S_{\text{total}} > S_1 + S_2$), because the interaction of the two strands contributes additional entropy terms (equation 32).

Finally, we claim that due to interactions, the universe evolves towards maximum symmetry or minimum information.

**8. Dynamic and Static Symmetries**

There are two types of symmetries: *dynamic*
symmetry and *static* symmetry. Both are related to information loss
as schematically illustrated in figures 6 and 7. Dynamic entropy is the
logarithm of the number of the microstates of identical property (or identical
energy level in thermodynamics).


*8.1. Dynamic Symmetry*

8.1.1. Fluid Systems

Because information is recorded in static structures, systems of molecular dynamic motion have a zero value of information and, therefore, the logarithmic functions *L* and *S* have equal values at the equilibrium state. In order to accommodate the kinetic energy of the system, the structure of the system cannot stay in only one of the many accessible microstates. The macroscopic properties, entropy and symmetry (homogeneity, isotropicity), of an ideal gas used in a heat engine can be considered first, because the ideal gas model is the most important one in thermodynamics.

8.1.2. The Gibbs Paradox of Entropy of Mixing

In this context let us resolve the Gibbs paradox of entropy of mixing [21]. It says that the entropy of mixing *decreases discontinuously* with an increase in similarity: it has a zero value for the mixing of indistinguishable subsystems (figure 1a). The isobaric and isothermal mixing of one mole of an ideal fluid A and one mole of a different ideal fluid B (figures 5 and 8) has the entropy increment

$$ \Delta S = 2R \ln 2 \qquad (33) $$

where *R* is the gas constant, while

$$ \Delta S = 0 \qquad (34) $$

for the mixing of indistinguishable fluids [21]. It is assumed that the two equations (33) and (34) are also applicable to the formation of solid and liquid mixtures (citations in [7d]). The Gibbs paradox statement of entropy of mixing has been regarded as a theoretical foundation of statistical mechanics [23], quantum theory [24] and biophysics [21e]. It is certainly a most important problem in information theory if one intends to apply information theory to physics: e.g., Jaynes, the father of the maximal entropy principle [16], also considered this problem [21h]. The resolutions of this paradox have been very controversial for over one hundred years. Many famous physicists confidently claim to have resolved this paradox, in very diverse and different ways; a sustaining problem is that they do not agree with one another. For example, besides numerous other resolutions [21], von Neumann [21j] provided a well-known quantum mechanical resolution of this paradox with which not many others are in full agreement [21d].
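As a numerical aside (added here as an illustration, not from the original paper), the conventional textbook formula behind equation 33 gives, for one mole of A mixed with one mole of B:

```python
import math

R = 8.314  # gas constant, J/(mol K)

def mixing_entropy(n1, n2):
    """Conventional isobaric-isothermal entropy of mixing of two different ideal gases."""
    n = n1 + n2
    x1, x2 = n1 / n, n2 / n
    return -R * (n1 * math.log(x1) + n2 * math.log(x2))

print(mixing_entropy(1.0, 1.0))   # 11.526... J/K
print(2 * R * math.log(2))        # the same value, 2R ln 2 (equation 33)
```

This is the conventional value that the resolution below disputes: by the labeling argument, the entropy of mixing in a rigid container is zero whether the gases are different or not (equations 34 and 35).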

**Figure 8.** Ideal gas mixing in a rigid chamber.

The Gibbs paradox is a problem regarding the relation of indistinguishability (symmetry) and entropy. It can be easily resolved if we recall the definition of labeling (section 2.3). Mixing two ideal gases is an identical process whether they are indistinguishable or distinguishable ideal gases, provided that the two gas chambers are parts of a rigid container (because deformation changes the shape symmetry and entropy [7a], we suppose that the container is rigid; the interesting topic of shape symmetry evolution or deformation [7a] will be discussed in detail elsewhere): $\Delta E = 0$, $\Delta G = 0$, $Q = 0$, and $\Delta S = 0$. As has actually been shown in experiment, there is no change in the total energy or in the Gibbs free energy, and there is no heat effect, for the isobaric, isothermal mixing process. Therefore, the entropy change must be zero in both cases, whether it is a mixing of the same ideal gases or of different ideal gases (equation 35):

$$ \Delta S_{\text{mixing}} = 0 \qquad (35) $$

8.1.3. Local Dynamic Motion in Solids (Crystals)

Another excellent example of local dynamic motion in a static structure is Pauling’s assessment of residual entropy of ice [7d, 24]. The spin-up and spin-down binary system is shown in figure 3. The information loss and symmetry increase due to local dynamic motion can be further illustrated by typewriting different fonts at one location, as shown in figure 4. In many cases, the formation of certain static periodic structures can be regarded as formatting (section 2.3).

*8.2. Static Symmetry*

If the temperature is gradually reduced, a system
of many molecules may become a static structure. There are many possible
static structures. In principle, for a condensed phase, the static structure
can be any of the *w* microstates accessible before the phase transition.
Symmetric static structure (crystal) and nonsymmetric static structure
must have different information (figures 6 and 7). Our theory predicts
that the most symmetric microstate will be taken as the most stable static
structure (the third law of information theory and the symmetry principle).
Before the phase transition at higher temperature, the system should be
of the highest possible dynamic symmetry (figure 9, for example). After
the phase transition to form a solid phase, the system should evolve to
the highest possible static symmetry (figure 10).

**Figure 9.** Before the phase transition: the spin system in dynamic motion (the highest possible dynamic symmetry).

**Figure 10.** After the phase transition: the most symmetric static structure, with all spins aligned in parallel (the highest possible static symmetry).

The expression of the entropy of mixing has been applied in the same way to the mixing processes that form gaseous, fluid or solid mixtures (see any statistical mechanics textbook). Therefore, in the Ising model, a noncrystal structure would have a higher entropy due to the entropy of mixing *different* species according to the traditional expression (figure 1a). However, the most symmetric static structure (figure 10, for instance) has the highest static entropy according to our third law. The present author believes that the traditional way of calculating the entropy of mixing is the largest flaw in solid state physics in particular and in physics in general. Prigogine's dissipative structure theory (which has been claimed to solve this kind of symmetry breaking problem [4,25]), Wilson's renormalization group method [5,26] and many other theories have been proposed; the symmetry breaking problem remains to be solved. According to our theory, the static symmetry (and the static entropy) should contribute dominantly to the stability of a ferromagnetic system in the spin-parallel static state below the Curie temperature. Due to its static symmetry, a perfect crystal has the highest static entropy (the third law of information theory). This conforms perfectly to the observed highest stability of the crystal structure among all the possible static (solid) structures. The perfect crystal structure is equivalent to a newly formatted disk (hard disk or floppy disk), which has the highest symmetry and the least (or zero) information (figure 2b). Noncrystal solid structures are less stable (this prediction has already been confirmed by an abundance of experimental observations [7]; ask any experimental chemist or materials scientist!).

**9. Phase Separation and Phase Transition (Symmetry
Breaking)**

When the thermodynamic temperature decreases, there will be a phase separation where different substances separate as a result of the spontaneous assembling of the indistinguishable substances.

*9.1. Similarity and Temperature*

Let us illustrate the relation between similarity in thermodynamics and the thermodynamic temperature *T* with the simplest example. The similarity between two energy levels $E_1$ and $E_2$ ($E_2 > E_1$) is calculated from their Boltzmann factors $e^{-E_1/k_B T}$ and $e^{-E_2/k_B T}$. The similarity will approach the maximum if the temperature increases and the minimum if the temperature approaches zero:

$$ \lim_{T \to \infty} \frac{e^{-E_2/k_B T}}{e^{-E_1/k_B T}} = 1 \qquad (36) $$

$$ \lim_{T \to 0} \frac{e^{-E_2/k_B T}}{e^{-E_1/k_B T}} = 0 \qquad (37) $$

Jaynes' maximal entropy principle [15a,16] may be applied to discuss the relation of similarity and a temperature-like parameter. Generally speaking, similarity increases with the increase in the absolute value of the temperature (for a system of negative temperature, where *T* < 0, similarity increases with the increase in |*T*|; see ref. 7a).
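A short numerical sketch of equations 36 and 37 (an added illustration; the two energy levels are hypothetical values chosen only to show the limits):

```python
import math

kB = 1.380649e-23          # Boltzmann constant, J/K
E1, E2 = 1.0e-21, 2.0e-21  # two hypothetical energy levels, E2 > E1, in joules

def similarity(T):
    """Ratio of the Boltzmann factors of E2 and E1 at temperature T (equations 36, 37)."""
    return math.exp(-(E2 - E1) / (kB * T))

for T in (1.0, 100.0, 1e4, 1e6):
    print(T, similarity(T))   # tends to 0 as T -> 0 and to 1 as T -> infinity
```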

*9.2. Phase Separation, Condensation, and the
Densest Packing*

The similarity among components will decrease
at a reduced temperature (equation 37). Because a heterogeneous structure
with components of very different properties has high information and is
unstable, phase separation for a system of multiple components will follow
the similarity principle: *different components spontaneously separate.
The components of the same (or very similar) properties mix or merge to
form a homogeneous phase. The different components separate as a consequence
of the assembling of components of the most similar (or the same) properties.
*Spontaneous phase separation and its information loss (or symmetry
increase) as well as the opposite process can be illustrated in figure
11 [7d]. Therefore, if phase separation is desirable, we decrease the temperature
(equation 37); if the mixing of components is required, we increase the
temperature (equation 36).

Condensation from a gaseous phase, and many examples of the densest molecular packing in the formation of crystal structures and other solid-state structures, can be explained by our revised information theory as a phase separation of a binary system of two "species": substance and vacuum. In this approximation, all molecules (not necessarily the same kind of molecule) are taken as the species "1" and all the parts of free space are taken as the species "0". At a reduced thermodynamic temperature, substance and vacuum both occupy the space but are otherwise so different that they will separate, as a consequence of assembling the same species together, into two "phases": one is the condensed substance phase, the other a bulky "phase" of vacuum (figure 11).

*9.3. Phase transition and Evolution of the
Universe and Evolution of Life*

The evolution of the Universe is a series of phase separations and phase transitions. At extremely high temperature, even matter and antimatter can apparently coexist. At high temperature, the similarity is high (equation 36). When the temperature is reduced, the increase of the entropy can be achieved either by phase separation or by phase transition as spontaneous processes. At reduced temperature, matter aggregates with matter; a mixture of matter and antimatter is then not stable because their similarity is extremely low. The biosphere has only L-amino acids and D-sugars [27]. Our theory can shed light on the solution of the symmetry breaking phenomena during the evolution of the universe in general and the molecular evolution of life in particular. It is clear that at every stage of symmetry breaking, the critical phenomenon is characterized by the system's tendency towards the highest possible dynamic symmetry at higher temperature and the highest possible static symmetry at lower temperature.

**10. Similarity Rule and Complementarity Rule**

Generally speaking, *all* intermolecular
processes (molecular recognition and molecular assembling or the formation
of any kinds of chemical bond) and intramolecular processes (protein folding
[28], etc.) between molecular moieties are governed either by the similarity
rule or by the complementarity rule or both.

The similarity rule (a component in a molecular recognition process loves others of like properties, such as hydrophobic interaction, π-stacking in DNA molecules, and similarity in softness in the well-known hard-soft-acid-base rules) predicts the affinity of individuals of *similar* properties. On the contrary, the complementarity rule predicts the affinity of individuals of certain *different* properties. Both types of rule still remain strictly empirical. The similarity rule can be given a theoretical foundation by the similarity principle (figure 1c) [7a] after rejection of Gibbs' (figure 1a) and von Neumann's revised (figure 1b) entropy-similarity relations.

All kinds of donor-acceptor interaction, such as the enzyme-substrate combination, which may involve hydrogen bonding, electrostatic interaction and stereochemical key-and-lock docking [30] (e.g., a template and the imprinted molecular cavity [6b]), follow the complementarity rule. For the significance of the complementarity concept in chemistry, see the chapter on Pauling in the book [1b].

**Figure 12.** The print and the imprint [6b] are complementary.

Firstly, a stereochemical interaction (key-and-lock docking) following the complementarity rule can be treated as a special kind of phase separation in which the substance and the vacuum (as a different species) separate (cf. section 9.2). The final structure is more "complete", more integral, more "solid" and more symmetric. More generally speaking, the components in the structure of the final state become more similar due to the property *offset* of the components, and the structure is more symmetric. The calculation of the changes in symmetry number, entropy and information during molecular interactions offers numerous topics for further studies.

Normally we consider the complementarity of a binary system (two partners). However, it can be an interaction among many components. The component motifs are *distinguishable* (for a binary system they are in *contrast*) in the considered property or properties.

Chemical bond formation and all other kinds of interaction following the similarity and complementarity rules will lead to more stable *static* structures. Actually, there are also similarity and complementarity rules for the formation of a stable *dynamic* system. The resonance theory is an empirical rule [29]. Von Neumann tried very hard to use the argument of the entropy of mixing, but his final conclusion regarding the entropy-similarity relation (figure 1b) was wrong and cannot be employed to explain the stability of quantum systems [21j]. Pauling's resonance theory can be supported by the two principles, the similarity principle and the complementarity principle, for a dynamic system regarding electron motion and electronic configuration [7a]. The final structure, as a hybrid or a mixture of a "complete" set (having all the microstates or all the available canonical structures of very similar energy levels, similar configuration, etc.; see 1 and 2 in figure 13a), should be more symmetric. The most complete set of microstates will lead to the highest dynamic entropy and the highest stability. Other structures (e.g., 3 in figure 13a) cannot contribute significantly to the structure because they are very different from structures 1 and 2.


Finally, we should emphasize that the similarity rule is always more significant than the complementarity rule, because most properties of the complementary components are the same or very similar (*m* > *l*). Examples in chemistry are the HSAB (hard-soft-acid-base) rule, where the two components (acid and base) should have similar softness or similar hardness (see any modern text on inorganic chemistry), and the complementary pair of the LUMO (lowest unoccupied molecular orbital) and the HOMO (highest occupied MO), where the energy levels of the two MOs are very close (see a modern textbook of organic chemistry). The formation of a chemical bond can be illustrated by a stable marriage (example 13).

The beauty of periodicity (repetition) or translational symmetry of molecular packing in crystals must be attributed to the corresponding stability. Formatting a floppy disk or a hard disk creates symmetry and erases all the information. Formatting is a necessary preparation for stable storage of information, where certain kinds of symmetry are maintained: e.g., books must be "formatted" by pages and lines, where periodicity is maintained as the background. DNA molecules are "formatted" by the sugar and phosphate backbone as periodic units.

A steady state is a special case of a dynamic system. Its stability depends on the symmetry (periodicity in time). The long period of a chemical oscillation [1b,4] happens in a steady-state system. Cars on a highway must run in the same direction at the same velocity. Otherwise, if one car goes much slower, there will be a traffic accident (Curie said "nonsymmetry leads to phenomena" [2]), and the unfortunate phenomenon is the collision. Many kinds of cycle (the heart-beating cycle, sleep-and-wake cycles, etc.) are periodicity in time, which contributes to stability, very much the same as the Carnot cycle of an ideal heat engine. However, exact repetition of everyday life must be very boring, even though it makes life simple, easy and stable (vide infra).

**12. Further Discussions: Beautiful Symmetry and Ugly Symmetry [31]**

To stress the important points of the present theory and to apply it in a relaxed manner to much more general situations, let us follow the style of discussing the symmetry concept in a recent book [1b] and clarify the relation of symmetry to other concepts. Whether symmetry is beauty is of particular interest, as the relationship of symmetry and beauty has been the subject of serious scientific research [32].

*12.1. Order and Disorder*

According to common intuition, beauty correlates with less chaos or more order, which in turn correlates with more information and less entropy. The following discussion will be based on this general correlation of beauty, orderliness and information. Therefore, a consideration of this problem may be reduced to a proper definition of order (table 1) and disorder.

**Table 1.** Two definitions of order.

| Definition | Order = periodicity or symmetry | Order = nonsymmetry and difference |
|---|---|---|
| Formation | Generated spontaneously | Not generated spontaneously |
| Examples | Chemical oscillation (symmetry and periodicity in time) or crystals (symmetry and periodicity in space) | Gas A stays in the left chamber and B in the right chamber, following the order shown in figure 5 |
| Reference | "Order Out of Chaos" [4] | The present work |
| Comment | Challenges the validity of the second law | Conforms to the second law |

Unfortunately, there are two totally different definitions of order or orderliness. Because they are opposite in meaning, a meaningful discussion in chemistry and physics is possible only if the suitable one of the two definitions is taken and consistently used. The first one regards an "ordered" pattern as "order out of chaos" through "self-organization" [4]. Because the system is allowed to evolve spontaneously [4], the organized structure must be closer to an equilibrium state. It is well known that a system closer to equilibrium has a higher value of entropy (of the universe or of an isolated system). It follows that a more "ordered" system would have more entropy [4]. This kind of misconception is due to the correlation of higher entropy with lower symmetry, because symmetry has been regarded as order. In this definition of order [4], order is the periodicity or symmetry generated spontaneously or through a "self"-action. This conclusion obviously violates the second law of thermodynamics [4], as commented on by the present author [33].

In the other, proper definition of order, order is generated and maintained by the surroundings (including the experimenter or information register and their equipment, etc.) [7d]. This order is achieved by applying confinements or constraints to the concerned system, such as containers to restrict molecules spatially, or by applying forces to produce distinguishability (and information). One example is shown in figure 5, where the two parts are confined and they are distinguishable. The social order is generated and maintained by applying discipline to its members to create distinguishability. Without discipline, if everyone in a country claims to be as powerful as the president and behaves in the same manner, the permutation symmetry is high and the society is in chaos. If two or more book locations are changed and the order of the library is unchanged, there is permutation symmetry in this library; this library either has many (say, 10000) copies of the same book, or is in a totally chaotic situation. Some years ago, when the present author did experiments using a refrigerated room where thousands of compounds prepared by the professor's former students were stored, he decided that symmetry was by no means order: if one brought any symmetry to the storeroom and accordingly reshaped the professor's nonsymmetrical arrangement of the precious samples (most of them themselves asymmetric molecules) to "make order", the professor would react angrily. Now the present author supervises a sample archive center [34] and has an even stronger impression that the archival order of the samples in a stock room is generated and maintained by confining samples to distinguishable containers and locations.

The statement that "a static structure (not necessarily
a crystal), which is a frozen dynamic structure, is more orderly than a
system of dynamic motion" is correct. However, when two *static* structures
are directly compared, the saying that "a symmetric static structure has
more order or is more orderly than a nonsymmetrical static structure" will
be completely incorrect and misleading if disorder has been defined as
entropy or information loss.

As mentioned in sections 8.2 and 9, for stability reasons, a system used for storing information always has symmetries for certain kinds of property (e.g., the periodicity of the sugar-phosphate backbone of DNA in example 6). In most cases information is recorded by using only one of several properties. The symmetric structure regarding the other properties can be taken as formatting (section 2.3). The properly defined order, i.e., the genetic information in DNA, is recorded by the nonsymmetric structural feature (the four bases), not at all by the sugar or phosphate, which are responsible for the symmetry. Strictly speaking, symmetry is not order; it is chaos.

*12.2. Symmetry and Diversity*

Intuitively, it can be easily understood that entropy and symmetry increase together if we simply use some common sense: where a system is indistinguishable (which is symmetry, see figure 6), there is no information (or there is a large amount of entropy)! Whereas the similarity (*Z*) of a system can be defined as

$$ Z = \frac{S}{L} \qquad (19) $$

the diversity of the system (*D*) can be defined as

$$ D = 1 - Z = \frac{I}{L} \qquad (38) $$
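As a small sketch (an added illustration, and assuming the similarity index is taken as *Z* = *S*/*L*, as in equation 19), the similarity and diversity of a collection with given relative abundances can be computed as:

```python
import math

def entropy(p):
    """Shannon entropy (equation 1) in nats."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

# Hypothetical collection of 4 kinds of item; p holds their relative abundances.
p = [0.4, 0.3, 0.2, 0.1]
L = math.log(len(p))    # maximum entropy (equation 8)
Z = entropy(p) / L      # similarity index (equation 19), in [0, 1]
D = 1 - Z               # diversity (equation 38)
print(round(Z, 4), round(D, 4))   # e.g. 0.9232 0.0768
```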

For more details, see ref. 7c-7e. Why are diversities
[7c], such as molecular diversity, biodiversity, cultural diversity, etc.
appreciated by many people, and why is diversity beautiful? The new theory
gives a clear answer. If a collection of 10000 stamps has extremely high
similarity, which means that all of them are of the same figure and size,
even though this kind of stamp might be by itself very interesting, the
permutation symmetry is obvious: If the positions of any two stamps are
exchanged, the album of stamps remains the same. This collection is not
beautiful and not precious because it lacks diversity. A library of 10000
copies of the same book is not nice either. Based on this idea, as a chemist,
the present author initiated MDPI [34], an international project for collecting
diverse chemical samples to build up molecular diversity. He has not collected
10000 samples of the same compound, even though it might be very interesting,
e.g., the famous C_{60} molecule. The molecule C_{60} is
itself beautiful not because of its symmetry, but because of its distinct
structure and property compared to many other molecules and its contribution
to the diversity of molecules we have studied.

For a similar reason, it is true that we may find a symmetric object beautiful if we have many other objects of less symmetry in our collection or in our experience. This symmetric object contributes to the diversity.

Diversity is beautiful; symmetry is not. Coffee with sugar and cream is an interesting drink because of its diversity in taste (sweet and bitter) and color (white and dark). Diversity in a mixture makes the so-called high-throughput screening of bioactivity testing possible [7c]. A country (e.g., the USA or Switzerland) is considered a nice place because of its tolerance of all kinds of diversity (racial diversity, cultural diversity, religious diversity, etc.). Without such appreciation of diversity, a country would become much less colorful and less beautiful. If everyone behaved the same as you and looked the same as you, there would be a lot of symmetry, and the world would be truly ugly. Democracy might be regarded as a sort of social or political diversity.

Unfortunately a system with high diversity is
less stable. Synthetic organic chemists know that samples with impurities
are less stable than highly purified samples. A mixture of thousands of
compounds from a combinatorial synthesis is stable because the properties
of the compounds are very similar and they normally belong to one type
of molecule with the same parent structure. A mixture of acid and base
is not stable. Storing a bottle of HCl and a bottle of NH_{3} together
is not a good idea.

A useful or beautiful system (a collection) should be a compromise between stability (due to indistinguishability or permutation symmetry and entropy) and diversity (due to distinguishability, nonsymmetry and information). As we have discussed before regarding complementarity (section 10), there can be many components or individuals that are complementary. A system satisfying this kind of complementarity is the most beautiful one, with the most stability due to indistinguishability (symmetry) and the most information due to distinguishability (nonsymmetry). A stable diversity is beautiful and practically useful for all kinds of diversity preservation [7c].

*12.3. Ugly Symmetry*

12.3.1. Assembling

Symmetry has been defined as beauty in many books
and research papers [3, 8, 31]. According to the *Concise Oxford Dictionary*,
symmetry is beauty resulting from the right proportion between the parts
of the body or any whole, balance, congruity, harmony, keeping (page 1
of ref. 8).

Almost every modern arts museum collects a few paintings which are very symmetric, sometimes completely indistinguishable between the parts, with the same pure white color (or gray or another color) overall (figure 6a). Symmetry is beautiful? Why not say instead that it is ugly, because the emperor is topless and bottomless, even though many great intellectuals can cheerfully enjoy his most beautiful clothes (figure 6a)? The most symmetric and most fussy paintings of those post-impressionists are the emperor's new clothes.

Crystals might be more beautiful than fluids because a solid structure has obviously lost all the dynamic symmetry such as isotropicity (see figure 6a). In a crystal, isotropicity does not exist: the properties along various directions become different (see figure 6b). This might be the reason why we like chocolate or ice cream in a certain form and may not like melted chocolate or runny ice cream soup.

12.3.2. Individual Items

High-speed motion around a fixed point or a fixed axis will create a figure of spherical or cylindrical symmetry, respectively. The electronic motion in a hydrogen atom around the proton creates the spherical shape of the hydrogen atom. There are many examples showing that the dynamic motion of a system results in an increased symmetry. These systems are stable but not necessarily beautiful. For example, a photo showing the face and figure of a girl figure skater, with no symmetry, is more beautiful than a photo that mixes the images facing many directions, or a mixture of structures at different angular displacements.

The famous molecule buckminsterfullerene
C_{60} is symmetric [1b] and very stable. We can predict that any
modified C_{60} mother structure with reduced symmetry will be
less stable than the symmetric C_{60}.

A symmetric static structure is stable but not necessarily beautiful. A beautiful model, whether she stands or sits, never poses in a symmetric style before the crowd. A fit and beautiful body differs from the more symmetric, spherical shape of an overweight body. A guard in front of a palace stands in a symmetric way to show more stability (and strength). A Chinese empress sits in a symmetric style because stability is more interesting to her. The pyramids in Egypt are stable due to their symmetry. However, they are by no means the most beautiful constructions.

Information will be lost if the ink is extremely faint or the same color as the background. A crystal is more stable than a noncrystal solid because the former has high static symmetry, a perfect symmetry without any information (figure 6b). Children understand it: on the walls of children's classrooms and bedrooms a visitor may find a lot of paintings, drawings and even scrawls done by the kids. If you do not put some colorful drawings there, children will create them themselves (e.g., figure 7b). The innocent children like to get rid of any ugly symmetry surrounding them. If your kids destroy symmetry on the walls, they are doing well.

Perfect symmetry is boring [32,1b], isn't it? The very symmetric female face of an oriental beauty [32] can be much more attractive if it is made less symmetric by a nonsymmetric hairstyle or a nonsymmetric smile. The combination of certain symmetries (contributing stability, according to our theory) and certain nonsymmetries (distinguishability, contributing interesting information) is the ideal beauty. The earth (figure 2) is beautiful because of its combination of symmetry and nonsymmetry. If it were of perfect symmetry and we could not distinguish North America, Europe or Asia, the world would be a deadly boring planet.

*12.4. "Symmetry Is Beauty" Has Been Misleading
in Science*

The relation of beauty to symmetry has been a topic of serious scientific investigation [32]. This is fine. However, it has been widely believed that if the beholder is a physicist or a chemist, beauty also means symmetry. This may have already practically misled scientific research funding, publication and recognition. The situation in chemistry may be mentioned. Experimental chemists doing organic or inorganic synthesis may have more difficulty in synthesizing a specific structure of low symmetry than a highly symmetric one. However, a large number of highly symmetric structures can be much more easily published in prestigious journals (*Angewandte Chemie*, for example), because these molecular structures have been taken as more beautiful. Consequently, other synthetic chemists have been regarded as intellectually lower achievers than those preparing highly symmetric structures. It has been shown by the history of chemical science, and demonstrated by the modern arts of chemistry, particularly organic synthesis, that chemists endeavor to seek asymmetry related to both space and time (nonsymmetry is the cause of phenomena [2]), not at all symmetry [31]. The symmetric buckminsterfullerene C_{60} is beautiful. However, many derivatives of C_{60} have been synthesized by organic chemists. These derivatives are less symmetric, more difficult to produce and might be more significant. None of the drugs (pharmaceuticals) discovered so far is very symmetric. Very few bioactive compounds are symmetric. Because all of the most important molecules of life, such as amino acids, sugars, and nucleic acids, are asymmetric, we can also attribute beauty to those objects that are practically more difficult to create and yet practically more significant. The highest symmetry means equilibrium in science and death in life [31].

Sometimes, even graphic representation and illustration can be biased by the authors to create false symmetry. One example is shown in figure 14.

The structural stability criterion of symmetry maximization can be applied to predict all kinds of symmetry evolution. We have clarified the relation of symmetry to several other concepts: higher symmetry, higher similarity, higher entropy, less information and less diversity are all related to higher stability. This lays the necessary foundation for further studies in understanding the nature of the chemical process.

*Acknowledgements:* The author is very grateful
to Dr. Kurt E. Geckeler, Dr. Peter Ramberg, Dr. S. Anand Kumar, Dr. Alexey
V. Eliseev and Dr. Jerome Karle and numerous other colleagues for their
kind invitations to give lectures on the topics of ugly symmetry and beautiful
diversity. Dr. Istvan Hargittai and Dr. J. Rosen kindly reviewed this paper.

**References and Notes**

2. (a) Rosen, J. *Symmetry in Science*; Springer: New York, 1995. (b) Rosen, J. *A Symmetry Primer for Scientists*; Wiley: New York, 1983. (c) A book review: Lin, S.-K. *Entropy* **1999**, *1*, 53-54. (http://www.mdpi.org/entropy/htm/e1030053.htm).

3. Weyl, H. *Symmetry*; Paperback Reprint
edition, Princeton University Press: Princeton, NJ, 1989.

4. Prigogine, I.; Stengers, I. *Order out of
Chaos*; Heinemann: London, 1984.

5. Ma, S. K. *Modern Theory of Critical Phenomena*;
W. A. Benjamin: London, 1976.

6. (a) Chemistry can be comprehended as an *information
science*, see: Lehn, J. M. Some Reflections on Chemistry – Molecular, Supramolecular
and Beyond. In *Chemistry for the 21st Century*; Keinan, E.; Schechter,
I., Eds.; Wiley-VCH: Weinheim, 2001; pp 1-7. (b) Wulff, G. Templated
Synthesis of Polymers – Molecularly Imprinted Materials for Recognition
and Catalysis. In *Templated Organic Synthesis*; Diederich, F.; Stang,
P. J., Eds.; Wiley-VCH: Weinheim, 2000; pp 39-73. (c) A book review: Lin,
S.-K. *Molecules* **2000**, *5*, 195-197. (http://www.mdpi.org/molecules/html/50200195.htm).

7. (a) Lin, S.-K. Understanding structural stability
and process spontaneity based on the rejection of the Gibbs paradox of
entropy of mixing. *J. Mol. Struct. (Theochem)* **1997**,
*398*, 145-153. (http://www.mdpi.org/lin/lin-rpu.htm) (b) Lin, S.-K.
Gibbs paradox of entropy of mixing: Experimental facts, its rejection,
and the theoretical consequences. *EJ. Theor. Chem.* **1996**,
*1*, 135-150. (http://www.mdpi.org/lin/lin-rpu.htm) (c) Lin, S.-K.
Molecular Diversity Assessment: Logarithmic Relations of Information and
Species Diversity and Logarithmic Relations of Entropy and Indistinguishability
After Rejection of Gibbs Paradox of Entropy of Mixing. *Molecules*
**1996**, *1*, 57-67. (http://www.mdpi.org/lin/lin-rpu.htm) (d)
Lin, S.-K. Correlation of Entropy with Similarity and Symmetry. *J. Chem.
Inf. Comp. Sci.* **1996**, *36*, 367-376. (http://www.mdpi.org/lin/lin-rpu.htm)
(e) Lin, S.-K. *Diversity Assessment Based on a Higher Similarity-Higher
Entropy Relation after Rejection of Gibbs Paradox*. Preprint, physics/9910032;
1999. (http://xxx.lanl.gov/abs/physics/9910032)

8. Elliott, J. P.; Dawber, P. G. *Symmetry in
Physics*; Macmillan: London, 1979.

9. Clausius, R. *Annalen der Physik und Chemie*
**1865**, *125*, 353-400. For its English translation, see: Magie, W. F. *A Source Book in Physics*; McGraw-Hill: New York, 1935. (http://webserver.lemoyne.edu/faculty/giunta/clausius.html)

10. MacKay’s e-book on information theory is very
readable, see: MacKay, D. J. C. *Information Theory, Inference, and Learning
Algorithms*; Draft 2.2.0, 2000. (http://wol.ra.phy.cam.ac.uk/mackay/MyBooks.html)

11. (a) Shannon, C. E. *Bell Sys. Tech. J.* **1948**, *27*, 379-423; 623-656. (b) Shannon, C. E. *A Mathematical Theory of Communication*. (http://cm.bell-labs.com/cm/ms/what/shannonday/paper.html)

12. Lewis, G. N. *Science* **1930**, *71*,
569.

13. Watson, J. D. *The Double Helix*; The
New American Library: New York, 1968.

14. Halliday, D.; Resnick, R.; Walker, J. *Fundamentals
of Physics*; 4th Ed., J. Wiley & Sons, Ltd.: New York, 1993.

15. (a) Chowdhury, D.; Stauffer, D. *Principles
of Equilibrium Statistical Mechanics*; Wiley-VCH: Weinheim, 2000; pp
158-159. (b) Lambert, F. L. *J. Chem. Educ. ***1999**, *76*,
1385. (c) Denbigh, K. G.; Denbigh, J. S. *Entropy in Relation to Incomplete
Knowledge*; Cambridge University Press: London, 1985.

16. Jaynes, E. T. *Phys. Rev.* **1957**,
*106*, 620-630.

17. Brillouin, L. *Science and Information Theory*;
2nd Ed., Academic Press: New York, 1962.

18. Wehrl, A. *Rev. Mod. Phys.* **1978**,
*50*, 221-260.

19. Atkins, P. W. *Physical Chemistry*; 4th
Ed., Oxford University Press: London, 1990; p 324.

20. (a) Zabrodsky, H.; Peleg, S.; Avnir, D. *J. Am. Chem. Soc.* **1992**, *114*, 7843-7851. (b) Zabrodsky, H.; Peleg, S.; Avnir, D. *J. Am. Chem. Soc.* **1993**, *115*, 8278-8289. (c) Zabrodsky, H.; Avnir, D. *J. Am. Chem. Soc.* **1995**, *117*, 462-473.

21. (a) Lesk, A. M. On the Gibbs paradox: what does indistinguishability really mean? *J. Phys. A: Math. Gen.* **1980**, *13*, L111-L114. (b) van Kampen, N. G. The Gibbs Paradox. In *Essays in Theoretical Physics*; Parry, W. E., Ed.; Pergamon: Oxford, 1984; pp 303-312. (c) Kemp, H. R. Gibbs' paradox: Two views on the correction term. *J. Chem. Educ.* **1986**, *63*, 735-736. (d) Dieks, D.; van Dijk, V. Another look at the quantum mechanical entropy of mixing. *Am. J. Phys.* **1988**, *56*, 430-434. (e) Richardson, I. W. The Gibbs paradox and unidirectional fluxes. *Eur. Biophys. J.* **1989**, *17*, 281-286. (f) Lin, S.-K. Gibbs paradox and its resolutions. *Ziran Zazhi* **1989**, *12*, 376-379. (g) Wantke, K.-D. A remark on the Gibbs-paradox of the entropy of mixing of ideal gases. *Ber. Bunsen-Ges. Phys. Chem.* **1991**, *94*, 537. (h) Jaynes, E. T. The Gibbs paradox. In *Maximum Entropy and Bayesian Methods*; Smith, C. R.; Erickson, G. J.; Neudorfer, P. O., Eds.; Kluwer Academic: Dordrecht, 1992; pp 1-22. (i) Blumenfeld, L. A.; Grosberg, A. Y. Gibbs paradox and the notion of construction in thermodynamics and statistical physics. *Biofizika* **1995**, *40*, 660-667. (j) von Neumann, J. *Mathematical Foundations of Quantum Mechanics*; Princeton University Press: Princeton, 1955; Ch. 5. (k) Lin, S.-K. *Gibbs Paradox of Entropy of Mixing Website*. (http://www.mdpi.org/entropy/entropyweb/gibbs-paradox.htm)

22. Schrödinger, E. *Statistical Mechanics*;
University Press: London, 1952; p 58.

23. Pauli, W. *Pauli Lectures on Physics: Vol.3,
Thermodynamics and the Kinetic Theory of Gases*; MIT Press: Cambridge,
MA, 1973; p 48.

24. Pauling, L. The Structure and Entropy of Ice
and of Other Crystals with Some Randomness of Atomic Arrangement. *J.
Am. Chem. Soc.* **1935**, *57*, 2680-2684.

25. (a) Kondepudi, D.; Prigogine, I. *Modern
Thermodynamics: From Heat Engines to Dissipative Structures*; Wiley:
Chichester, UK, 1998; chapters 19-20. (b) A book review: Lin, S.-K. *Entropy* **1999**, *1*, 148-149. (http://www.mdpi.org/entropy/htm/e1040148.htm)

26. Wilson, K. *Rev. Mod. Phys.* **1983**, *55*, 583-600.

27. Frank, P.; Bonner, W. A.; Zare, R. N. On One Hand But Not the Other:
The Challenge of the Origin and Survival of Homochirality in Prebiotic
Chemistry. In *Chemistry for the 21st Century*; Keinan, E.; Schechter,
I., Eds.; Wiley-VCH: Weinheim, 2001; pp 173-208.

28. Duan, Y.; Harvey, S. C.; Kollman, P. A. Protein Folding and Beyond.
In *Chemistry for the 21st Century*; Keinan, E.; Schechter, I., Eds.;
Wiley-VCH: Weinheim, 2001; pp 89-101.

29. (a) The resonance theory is an empirical rule because the stability of the chemical bond has never been explained. See: Pauling, L. *The Nature of the Chemical Bond*; 3rd ed., Cornell University Press: Ithaca, 1966. (b) It should be recalled that resonance is a term borrowed from physics, where the amplitude of an induced oscillation is greatest when the angular frequency of the driving force is the same as the natural frequency of the oscillator [14].

30. Fischer, E. *Ber. Dtsch. Chem. Ges.* **1894**, *27*,
2985-2993.

31. Adapted from the lectures of the present author, e.g., see: Lin,
S.-K. *Ugly Symmetry*; Lecture at the 218th ACS national meeting in
New Orleans, Louisiana, August 22-26, 1999. (http://www.mdpi.org/lin/uglysym1.htm)

32. E.g., see: (a) Perrett, D. I.; May, K. A.; Yoshikawa, S. *Nature*
**1994**, *368*, 239-241. (b) Enquist, M.; Arak, A. *Nature*
**1994**, *372*, 167-175.

33. Prigogine indeed questioned the validity of the second law of thermodynamics,
see also an editorial: Lin, S.-K. *Entropy* **1999**, *1*, 1-3. (http://www.mdpi.org/entropy/htm/e1010001.htm)

34. (a) See the http://www.mdpi.org website. (b) An editorial: Lin,
S.-K. *Molecules* **1997**, *2*, 1-2. (http://www.mdpi.org/molecules/edito197.htm)

35. (a) Lin, S.-K.; Patiny, L.; Yerin, A.; Wisniewski, J. L.; Testa,
B. One-wedge convention for stereochemical representations. *Enantiomer*
**2000**, *5*, 571-583. (http://www.mdpi.org/molecules/wedge.htm)
(b) Juaristi, E.; Welch, C. J. *Enantiomer* **1997**, *2*,
473-474; **1999**, *4*, 483-484. (c) Lin, S.-K. A proposal for
the representation of the stereochemistry of quatrivalent centres. *Chirality*,
**1992**, *4*, 274-278. (d) Gal, J. Rootworm pheromones – The Root
of a stereochemical mixup, *Chirality* **1992**, *4*, 68-70.
(e) Testa, B. On flying wedges, crashing wedges, and perspective-blind
stereochemists. *Chirality* **1991**, *3*, 159-160. (f) Simonyi,
M.; Gal, J.; Testa, B. *Trends Pharmacol. Sci.* **1989**, *10*,
349-354.
