In information theory and statistics, negentropy is used as a measure of distance to normality. The concept and phrase "negative entropy" was introduced by Erwin Schrödinger in his 1944 popular-science book What is Life?[1] Later, French physicist Léon Brillouin shortened the phrase to néguentropie (negentropy).[2][3] In 1974, Albert Szent-Györgyi proposed replacing the term negentropy with syntropy. That term may have originated in the 1940s with the Italian mathematician Luigi Fantappiè, who tried to construct a unified theory of biology and physics. Buckminster Fuller tried to popularize this usage, but negentropy remains common.

In a note to What is Life? Schrödinger explained his use of this phrase:

... if I had been catering for them [physicists] alone I should have let the discussion turn on free energy instead. It is the more familiar notion in this context. But this highly technical term seemed linguistically too near to energy for making the average reader alive to the contrast between the two things.
Information theory
In information theory and statistics, negentropy is used as a measure of distance to normality.[4][5][6] Out of all distributions with a given mean and variance, the normal or Gaussian distribution is the one with the highest entropy. Negentropy measures the difference in entropy between a given distribution and the Gaussian distribution with the same mean and variance. Thus, negentropy is always nonnegative, is invariant under any invertible linear change of coordinates, and vanishes if and only if the signal is Gaussian.
Negentropy is defined as

$J(p_x) = S(\varphi_x) - S(p_x)$

where $S(\varphi_x)$ is the differential entropy of the Gaussian density with the same mean and variance as $p_x$, and $S(p_x)$ is the differential entropy of $p_x$:

$S(p_x) = -\int p_x(u) \log p_x(u)\, du$
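As a rough numerical illustration of this definition, the following minimal Python sketch estimates negentropy from a sample by subtracting a crude histogram estimate of $S(p_x)$ from the closed-form Gaussian entropy $\tfrac{1}{2}\log(2\pi e\sigma^2)$; the function names and the histogram estimator are illustrative assumptions, not a standard implementation.

```python
import numpy as np

def gaussian_entropy(var):
    """Differential entropy (in nats) of a Gaussian with variance `var`."""
    return 0.5 * np.log(2 * np.pi * np.e * var)

def differential_entropy_hist(x, bins=64):
    """Crude histogram estimate of the differential entropy of a sample (nats)."""
    counts, edges = np.histogram(x, bins=bins)
    widths = np.diff(edges)
    p = counts / counts.sum()
    nz = p > 0
    # S(p_x) ~= -sum_i p_i * log(p_i / width_i), approximating the density per bin
    return -np.sum(p[nz] * np.log(p[nz] / widths[nz]))

def negentropy(x, bins=64):
    """J(p_x) = S(phi_x) - S(p_x), estimated from the sample x."""
    return gaussian_entropy(np.var(x)) - differential_entropy_hist(x, bins)

rng = np.random.default_rng(0)
print(negentropy(rng.normal(size=100_000)))       # close to 0 for Gaussian data
print(negentropy(rng.exponential(size=100_000)))  # clearly positive for non-Gaussian data
```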
Negentropy is used in statistics and signal processing. It is related to network entropy, which is used in independent component analysis.[7][8]
The negentropy of a distribution is equal to the Kullback–Leibler divergence between $p_x$ and a Gaussian distribution with the same mean and variance as $p_x$ (see Differential entropy § Maximization in the normal distribution for a proof). In particular, it is always nonnegative.
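This equivalence follows in a few lines from the definitions above (a sketch of the standard argument, in the same notation):

$D_{\mathrm{KL}}(p_x \,\|\, \varphi_x) = \int p_x(u) \log \frac{p_x(u)}{\varphi_x(u)}\, du = -S(p_x) - \int p_x(u) \log \varphi_x(u)\, du$

Since $\log \varphi_x(u)$ is a quadratic polynomial in $u$, and $p_x$ has the same mean and variance as $\varphi_x$, the last integral equals $\int \varphi_x(u) \log \varphi_x(u)\, du = -S(\varphi_x)$. Hence $D_{\mathrm{KL}}(p_x \,\|\, \varphi_x) = S(\varphi_x) - S(p_x) = J(p_x) \ge 0$.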
Correlation between statistical negentropy and Gibbs' free energy
Figure: Willard Gibbs' 1873 available energy (free energy) graph, which shows a plane perpendicular to the axis of v (volume) and passing through point A, which represents the initial state of the body. MN is the section of the surface of dissipated energy. Qε and Qη are sections of the planes η = 0 and ε = 0, and therefore parallel to the axes of ε (internal energy) and η (entropy) respectively. AD and AE are the energy and entropy of the body in its initial state, AB and AC its available energy (Gibbs energy) and its capacity for entropy (the amount by which the entropy of the body can be increased without changing the energy of the body or increasing its volume) respectively.
There is a physical quantity closely linked to free energy (free enthalpy), with a unit of entropy, that is isomorphic to the negentropy known in statistics and information theory. In 1873, Willard Gibbs created a diagram illustrating the concept of free energy corresponding to free enthalpy. On the diagram one can see the quantity called capacity for entropy. This quantity is the amount by which the entropy of the body can be increased without changing its internal energy or increasing its volume.[9] In other words, it is the difference between the maximum possible entropy, under the assumed conditions, and the actual entropy. It corresponds exactly to the definition of negentropy adopted in statistics and information theory. A similar physical quantity was introduced in 1869 by Massieu for the isothermal process[10][11][12] (the two quantities differ only in sign) and then by Planck for the isothermal-isobaric process.[13] More recently, the Massieu–Planck thermodynamic potential, also known as free entropy, has been shown to play a great role in the so-called entropic formulation of statistical mechanics,[14] applied among others in molecular biology[15] and thermodynamic non-equilibrium processes.[16]
$J = S_{\max} - S = -\Phi = -k \ln Z$

where:
- $S$ is entropy
- $J$ is negentropy (Gibbs' "capacity for entropy")
- $\Phi$ is the Massieu potential
- $Z$ is the partition function
- $k$ is the Boltzmann constant
In particular, mathematically the negentropy (the negative entropy function, in physics interpreted as free entropy) is the convex conjugate of LogSumExp (in physics interpreted as the free energy).
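A brief sketch of this duality (the notation $f$ and $\Delta$ is introduced here only for illustration): for $p$ in the probability simplex $\Delta$, let $f(p) = \sum_i p_i \ln p_i$ be the negative entropy. Its convex conjugate is

$f^*(x) = \sup_{p \in \Delta} \Big( \langle x, p \rangle - \sum_i p_i \ln p_i \Big) = \ln \sum_i e^{x_i}$

with the supremum attained at the Gibbs distribution $p_i = e^{x_i} / \sum_j e^{x_j}$. Identifying $x_i$ with $-\varepsilon_i / kT$ for energy levels $\varepsilon_i$ makes $f^*(x) = \ln Z$, the logarithm of the partition function above.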
Brillouin's negentropy principle of information
In 1953, Léon Brillouin derived a general equation[17] stating that changing an information bit value requires at least $kT \ln 2$ of energy. This is the same energy as the work Leó Szilárd's engine produces in the idealized case. In his book,[18] Brillouin further explored this problem, concluding that any cause of this bit value change (measurement, decision about a yes/no question, erasure, display, etc.) will require the same amount of energy.
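As a quick order-of-magnitude illustration of this bound (a minimal Python sketch; the choice of 300 K as room temperature is an assumption for the example, not taken from Brillouin):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant in J/K (exact value in the 2019 SI)
T = 300.0           # assumed room temperature in kelvin

# Minimum energy associated with changing one bit of information, per Brillouin's bound
E_min = k_B * T * math.log(2)
print(f"kT ln 2 at {T} K = {E_min:.3e} J")  # about 2.87e-21 J
```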
Notes
- ^ Schrödinger, Erwin, What is Life? The Physical Aspect of the Living Cell, Cambridge University Press, 1944
- ^ Brillouin, Léon (1953). "Negentropy Principle of Information", J. of Applied Physics, v. 24(9), pp. 1152–1163
- ^ Léon Brillouin, La science et la théorie de l'information, Masson, 1959
- ^ Aapo Hyvärinen, Survey on Independent Component Analysis, node32: Negentropy, Helsinki University of Technology Laboratory of Computer and Information Science
- ^ Aapo Hyvärinen and Erkki Oja, Independent Component Analysis: A Tutorial, node14: Negentropy, Helsinki University of Technology Laboratory of Computer and Information Science
- ^ Ruye Wang, Independent Component Analysis, node4: Measures of Non-Gaussianity
- ^ P. Comon, Independent Component Analysis – a new concept?, Signal Processing, 36, 287–314, 1994.
- ^ Didier G. Leibovici and Christian Beckmann, An introduction to Multiway Methods for Multi-Subject fMRI experiment, FMRIB Technical Report 2001, Oxford Centre for Functional Magnetic Resonance Imaging of the Brain (FMRIB), Department of Clinical Neurology, University of Oxford, John Radcliffe Hospital, Headley Way, Headington, Oxford, UK.
- ^ Willard Gibbs, A Method of Geometrical Representation of the Thermodynamic Properties of Substances by Means of Surfaces, Transactions of the Connecticut Academy, 382–404 (1873)
- ^ Massieu, M. F. (1869a). Sur les fonctions caractéristiques des divers fluides. C. R. Acad. Sci. LXIX:858–862.
- ^ Massieu, M. F. (1869b). Addition au précédent mémoire sur les fonctions caractéristiques. C. R. Acad. Sci. LXIX:1057–1061.
- ^ Massieu, M. F. (1869), Compt. Rend. 69 (858): 1057.
- ^ Planck, M. (1945). Treatise on Thermodynamics. Dover, New York.
- ^ Antoni Planes and Eduard Vives, Entropic Formulation of Statistical Mechanics (archived 2008-10-11 at the Wayback Machine), Entropic variables and Massieu–Planck functions, 2000-10-24, Universitat de Barcelona
- ^ John A. Schellman, Temperature, Stability, and the Hydrophobic Interaction, Biophysical Journal 73 (December 1997), 2960–2964, Institute of Molecular Biology, University of Oregon, Eugene, Oregon 97403 USA
- ^ Z. Hens and X. de Hemptinne, Non-equilibrium Thermodynamics approach to Transport Processes in Gas Mixtures, Department of Chemistry, Catholic University of Leuven, Celestijnenlaan 200 F, B-3001 Heverlee, Belgium
- ^ Léon Brillouin, The negentropy principle of information, J. Applied Physics 24, 1152–1163, 1953
- ^ Léon Brillouin, Science and Information Theory, Dover, 1956