In information theory, the Rényi entropy is a quantity that generalizes various notions of entropy, including Hartley entropy, Shannon entropy, collision entropy, and min-entropy. The Rényi entropy is named after Alfréd Rényi, who looked for the most general way to quantify information while preserving additivity for independent events.[1][2] In the context of fractal dimension estimation, the Rényi entropy forms the basis of the concept of generalized dimensions.[3]

The Rényi entropy is important in ecology and statistics as an index of diversity. It is also important in quantum information, where it can be used as a measure of entanglement. In the Heisenberg XY spin chain model, the Rényi entropy as a function of α can be calculated explicitly because it is an automorphic function with respect to a particular subgroup of the modular group.[4][5] In theoretical computer science, the min-entropy is used in the context of randomness extractors.
Definition
The Rényi entropy of order $\alpha$, where $\alpha \geq 0$ and $\alpha \neq 1$, is defined as[1]

$$H_\alpha(X) = \frac{1}{1-\alpha}\log\left(\sum_{i=1}^n p_i^\alpha\right).$$

It is further defined at $\alpha = 1$ as

$$H_1(X) = \lim_{\alpha \to 1} H_\alpha(X).$$

Here, $X$ is a discrete random variable with possible outcomes in the set $\mathcal{A} = \{x_1, x_2, \ldots, x_n\}$ and corresponding probabilities $p_i = \Pr(X = x_i)$ for $i = 1, \ldots, n$. The resulting unit of information is determined by the base of the logarithm, e.g. shannon for base 2, or nat for base $e$.

If the probabilities are $p_i = 1/n$ for all $i = 1, \ldots, n$, then all the Rényi entropies of the distribution are equal: $H_\alpha(X) = \log n$. In general, for all discrete random variables $X$, $H_\alpha(X)$ is a non-increasing function in $\alpha$.

Applications often exploit the following relation between the Rényi entropy and the $\alpha$-norm of the vector of probabilities:

$$H_\alpha(X) = \frac{\alpha}{1-\alpha}\log\left(\|P\|_\alpha\right).$$

Here, the discrete probability distribution $P = (p_1, \ldots, p_n)$ is interpreted as a vector in $\mathbb{R}^n$ with $p_i \geq 0$ and $\sum_{i=1}^n p_i = 1$.

The Rényi entropy is Schur concave for any $\alpha \geq 0$, as can be proven by the Schur–Ostrowski criterion.
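The definition above translates directly into a few lines of code. The following is a minimal sketch in Python (the function name, example distribution, and base are illustrative choices, not part of the source material):

```python
import numpy as np

def renyi_entropy(p, alpha, base=2):
    """Rényi entropy H_alpha of a discrete distribution p, in units of log(base)."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                       # outcomes with zero probability do not contribute
    if alpha == 1:                     # Shannon entropy, defined as the limit alpha -> 1
        return -np.sum(p * np.log(p)) / np.log(base)
    if np.isinf(alpha):                # min-entropy, the limit alpha -> infinity
        return -np.log(p.max()) / np.log(base)
    return np.log(np.sum(p ** alpha)) / ((1 - alpha) * np.log(base))

p = [0.5, 0.25, 0.125, 0.125]
for a in [0, 0.5, 1, 2, np.inf]:
    print(f"H_{a}(X) = {renyi_entropy(p, a):.4f} Sh")   # non-increasing in alpha

# For the uniform distribution, all orders coincide at log2(4) = 2 Sh:
print(renyi_entropy([0.25] * 4, 0), renyi_entropy([0.25] * 4, np.inf))
```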
Special cases
As $\alpha$ approaches zero, the Rényi entropy increasingly weighs all events with nonzero probability more equally, regardless of their probabilities. In the limit for $\alpha \to 0$, the Rényi entropy is just the logarithm of the size of the support of $X$. The limit for $\alpha \to 1$ is the Shannon entropy. As $\alpha$ approaches infinity, the Rényi entropy is increasingly determined by the events of highest probability.
Hartley or max-entropy
Provided the probabilities are nonzero,[6] $H_0$ is the logarithm of the cardinality of the alphabet ($\mathcal{X}$) of $X$, sometimes called the Hartley entropy of $X$:

$$H_0(X) = \log n = \log |\mathcal{X}|.$$
Shannon entropy
The limiting value of $H_\alpha$ as $\alpha \to 1$ is the Shannon entropy:[7]

$$H_1(X) = \lim_{\alpha \to 1} H_\alpha(X) = -\sum_{i=1}^n p_i \log p_i.$$
Collision entropy
Collision entropy, sometimes just called "Rényi entropy", refers to the case $\alpha = 2$:

$$H_2(X) = -\log \sum_{i=1}^n p_i^2 = -\log \Pr(X = Y),$$

where $X$ and $Y$ are independent and identically distributed. The collision entropy is related to the index of coincidence.
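The identity $H_2(X) = -\log \Pr(X = Y)$ can be checked by simulation. A minimal sketch in Python (the distribution and sample size are arbitrary choices for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
p = np.array([0.5, 0.3, 0.2])

# Exact collision entropy in nats: H_2 = -log(sum of p_i^2).
h2_exact = -np.log(np.sum(p ** 2))

# Monte Carlo estimate of the collision probability Pr(X = Y)
# for independent, identically distributed X and Y.
n = 1_000_000
x = rng.choice(len(p), size=n, p=p)
y = rng.choice(len(p), size=n, p=p)
h2_mc = -np.log(np.mean(x == y))

print(h2_exact, h2_mc)   # the two values agree up to sampling error
```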
Min-entropy
In the limit as $\alpha \to \infty$, the Rényi entropy $H_\alpha$ converges to the min-entropy $H_\infty$:

$$H_\infty(X) = \min_i(-\log p_i) = -\max_i \log p_i = -\log \max_i p_i.$$

Equivalently, the min-entropy $H_\infty(X)$ is the largest real number $b$ such that all events occur with probability at most $2^{-b}$.

The name min-entropy stems from the fact that it is the smallest entropy measure in the family of Rényi entropies. In this sense, it is the strongest way to measure the information content of a discrete random variable. In particular, the min-entropy is never larger than the Shannon entropy.
The min-entropy has important applications for randomness extractors in theoretical computer science: extractors are able to extract randomness from random sources that have a large min-entropy; merely having a large Shannon entropy does not suffice for this task.
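The gap between the two measures is easy to exhibit numerically. A sketch in Python (the skewed distribution below is chosen only to make the gap visible):

```python
import numpy as np

# A skewed source: one likely outcome plus 1000 rare ones.
p = np.array([0.5] + [0.5 / 1000] * 1000)

h_min = -np.log2(p.max())              # min-entropy: only the most likely outcome matters
h_shannon = -np.sum(p * np.log2(p))    # Shannon entropy: an average over all outcomes

print(h_min)      # 1.0 Sh -- roughly what an extractor can rely on
print(h_shannon)  # about 6.0 Sh -- much larger, yet not sufficient by itself
```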
Inequalities for different orders α
That $H_\alpha$ is non-increasing in $\alpha$ for any given distribution of probabilities $p_1, \ldots, p_n$ can be proven by differentiation,[8] as

$$-\frac{dH_\alpha}{d\alpha} = \frac{1}{(1-\alpha)^2}\sum_{i=1}^n z_i \log\frac{z_i}{p_i},$$

which is proportional to the Kullback–Leibler divergence (which is always non-negative), where $z_i = p_i^\alpha / \sum_{j=1}^n p_j^\alpha$. In particular, it is strictly positive except when the distribution is uniform.

At the $\alpha \to 1$ limit, we have $-\frac{dH_\alpha}{d\alpha} \to \frac{1}{2}\operatorname{Var}(\log p(X))$.
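The value of this limit can be recovered from a short Taylor expansion (a sketch of the computation, in nats): write $f(\alpha) = \log \sum_i p_i^\alpha$, so that $H_\alpha = f(\alpha)/(1-\alpha)$ with $f(1) = 0$, $f'(1) = \sum_i p_i \log p_i$ and $f''(1) = \operatorname{Var}(\log p(X))$. Then, with $\varepsilon = \alpha - 1$,

$$H_{1+\varepsilon} = \frac{f(1+\varepsilon)}{-\varepsilon} = -f'(1) - \frac{\varepsilon}{2}f''(1) + O(\varepsilon^2) = H_1 - \frac{\varepsilon}{2}\operatorname{Var}(\log p(X)) + O(\varepsilon^2),$$

which gives $-\frac{dH_\alpha}{d\alpha}\big|_{\alpha=1} = \frac{1}{2}\operatorname{Var}(\log p(X))$.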
In particular cases inequalities can be proven also by Jensen's inequality:[9][10]

$$\log n = H_0(X) \geq H_1(X) \geq H_2(X) \geq H_\infty(X).$$

For values of $\alpha > 1$, inequalities in the other direction also hold. In particular, we have[11][12]

$$H_2(X) \leq 2H_\infty(X).$$
On the other hand, the Shannon entropy $H_1$ can be arbitrarily high for a random variable $X$ that has a given min-entropy. An example of this is given by the sequence of random variables $X_n$ over $\{0, 1, \ldots, n\}$ for $n \geq 1$ such that $\Pr(X_n = 0) = 1/2$ and $\Pr(X_n = i) = 1/(2n)$ for $i = 1, \ldots, n$, since $H_\infty(X_n) = \log 2$ but $H_1(X_n) = (\log 2 + \log 2n)/2 \to \infty$.
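A numerical check of this example (a sketch; the values of $n$ are arbitrary):

```python
import numpy as np

for n in [10, 1000, 100000]:
    # X_n takes the value 0 with probability 1/2 and each of 1..n with probability 1/(2n).
    p = np.array([0.5] + [0.5 / n] * n)
    h_min = -np.log2(p.max())             # stays at log2(2) = 1 Sh for every n
    h_shannon = -np.sum(p * np.log2(p))   # grows like (1 + log2(2n)) / 2
    print(n, h_min, round(h_shannon, 3))
```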
Rényi divergence
As well as the absolute Rényi entropies, Rényi also defined a spectrum of divergence measures generalising the Kullback–Leibler divergence.[13]

The Rényi divergence of order $\alpha$ or alpha-divergence of a distribution $P$ from a distribution $Q$ is defined to be

$$D_\alpha(P \parallel Q) = \frac{1}{\alpha - 1}\log\left(\sum_{i=1}^n \frac{p_i^\alpha}{q_i^{\alpha - 1}}\right)$$

when $0 < \alpha < \infty$ and $\alpha \neq 1$. We can define the Rényi divergence for the special values $\alpha = 0, 1, \infty$ by taking a limit, and in particular the limit $\alpha \to 1$ gives the Kullback–Leibler divergence.
Some special cases:

- $D_0(P \parallel Q) = -\log Q(\{i : p_i > 0\})$: minus the log probability under $Q$ that $p_i > 0$;
- $D_{1/2}(P \parallel Q) = -2\log\sum_{i=1}^n \sqrt{p_i q_i}$: minus twice the logarithm of the Bhattacharyya coefficient (Nielsen & Boltz (2010));
- $D_1(P \parallel Q) = \sum_{i=1}^n p_i \log\frac{p_i}{q_i}$: the Kullback–Leibler divergence;
- $D_2(P \parallel Q) = \log\left\langle\frac{p_i}{q_i}\right\rangle$: the log of the expected ratio of the probabilities;
- $D_\infty(P \parallel Q) = \log\sup_i \frac{p_i}{q_i}$: the log of the maximum ratio of the probabilities.
The Rényi divergence is indeed a divergence, meaning simply that $D_\alpha(P \parallel Q)$ is greater than or equal to zero, and zero only when $P = Q$. For any fixed distributions $P$ and $Q$, the Rényi divergence is nondecreasing as a function of its order $\alpha$, and it is continuous on the set of $\alpha$ for which it is finite.[13] Rényi called $D_\alpha(P \parallel Q)$, for the sake of brevity, the information of order $\alpha$ obtained if the distribution $P$ is replaced by the distribution $Q$.[1]
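The definition and the monotonicity in $\alpha$ can be illustrated with a short computation. A minimal sketch in Python (the function name and the two distributions are illustrative):

```python
import numpy as np

def renyi_divergence(p, q, alpha):
    """Rényi divergence D_alpha(P || Q) in nats, for strictly positive p and q."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    if alpha == 1:                    # limit: Kullback-Leibler divergence
        return np.sum(p * np.log(p / q))
    if np.isinf(alpha):               # limit: log of the maximum probability ratio
        return np.log(np.max(p / q))
    return np.log(np.sum(p ** alpha / q ** (alpha - 1))) / (alpha - 1)

p = [0.6, 0.3, 0.1]
q = [0.2, 0.5, 0.3]
print([round(renyi_divergence(p, q, a), 4) for a in [0.5, 1, 2, np.inf]])
# The printed values are nondecreasing in alpha, and D_alpha(P || P) = 0:
print(renyi_divergence(p, p, 2))
```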
Financial interpretation
A pair of probability distributions can be viewed as a game of chance in which one of the distributions defines the official odds and the other contains the actual probabilities. Knowledge of the actual probabilities allows a player to profit from the game. The expected profit rate is connected to the Rényi divergence as follows:[14]

$$\text{ExpectedRate} = \frac{1}{R} D_1(b \parallel m) + \frac{R - 1}{R} D_{1/R}(b \parallel m),$$

where $m$ is the distribution defining the official odds (i.e. the "market") for the game, $b$ is the investor-believed distribution, and $R$ is the investor's risk aversion (the Arrow–Pratt relative risk aversion).

If the true distribution is $p$ (not necessarily coinciding with the investor's belief $b$), the long-term realized rate converges to the true expectation, which has a similar mathematical structure:[14]

$$\text{RealizedRate} = \frac{1}{R}\left(D_1(p \parallel m) - D_1(p \parallel b)\right) + \frac{R - 1}{R}\left(D_{1/R}(p \parallel m) - D_{1/R}(p \parallel b)\right).$$
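Taking the expected-rate formula above at face value, it can be evaluated directly. A sketch in Python (the market, belief, and risk-aversion values are placeholders chosen for illustration):

```python
import numpy as np

def D(p, q, alpha):
    """Rényi divergence D_alpha(p || q) in nats; alpha = 1 gives the KL divergence."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    if alpha == 1:
        return np.sum(p * np.log(p / q))
    return np.log(np.sum(p ** alpha / q ** (alpha - 1))) / (alpha - 1)

m = np.array([0.5, 0.3, 0.2])   # distribution defining the official ("market") odds
b = np.array([0.6, 0.3, 0.1])   # investor-believed distribution
R = 2.0                          # Arrow-Pratt relative risk aversion

expected_rate = D(b, m, 1) / R + (R - 1) / R * D(b, m, 1 / R)
print(expected_rate)             # expected profit rate, in nats per bet
```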
Properties specific to α = 1
The value $\alpha = 1$, which gives the Shannon entropy and the Kullback–Leibler divergence, is the only value at which the chain rule of conditional probability holds exactly:

$$H(A, X) = H(A) + H(X \mid A)$$

for the absolute entropies, and

$$D_{\mathrm{KL}}(p(x \mid a)\,p(a) \parallel m(x, a)) = D_{\mathrm{KL}}(p(a) \parallel m(a)) + \mathbb{E}_{p(a)}\left\{D_{\mathrm{KL}}(p(x \mid a) \parallel m(x \mid a))\right\}$$

for the relative entropies.
The latter in particular means that if we seek a distribution $p(x, a)$ which minimizes the divergence from some underlying prior measure $m(x, a)$, and we acquire new information which only affects the distribution of $a$, then the distribution of $p(x \mid a)$ remains $m(x \mid a)$, unchanged.
The other Rényi divergences satisfy the criteria of being positive and continuous, being invariant under 1-to-1 co-ordinate transformations, and of combining additively when $A$ and $X$ are independent, so that if $p(A, X) = p(A)\,p(X)$, then

$$H_\alpha(A, X) = H_\alpha(A) + H_\alpha(X)$$

and

$$D_\alpha(P(A)P(X) \parallel Q(A)Q(X)) = D_\alpha(P(A) \parallel Q(A)) + D_\alpha(P(X) \parallel Q(X)).$$
The stronger properties of the $\alpha = 1$ quantities allow the definition of conditional information and mutual information from communication theory.
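The additivity of the entropies under independence holds for every order and is easy to verify numerically. A sketch in Python (the two marginal distributions are arbitrary):

```python
import numpy as np

def H(p, alpha):
    """Rényi entropy in nats, for alpha != 1."""
    p = np.asarray(p, float)
    return np.log(np.sum(p ** alpha)) / (1 - alpha)

pA = np.array([0.7, 0.3])
pX = np.array([0.5, 0.25, 0.25])
joint = np.outer(pA, pX).ravel()   # p(A, X) = p(A) p(X): A and X independent

for alpha in [0.5, 2, 5]:
    print(np.isclose(H(joint, alpha), H(pA, alpha) + H(pX, alpha)))   # True for every order
```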
Exponential families
The Rényi entropies and divergences for an exponential family $p_F(x; \theta) = \exp(\langle t(x), \theta\rangle - F(\theta) + k(x))$, with log-normalizer $F$ and carrier measure $k$, admit simple expressions:[15]

$$H_\alpha(p_F(x; \theta)) = \frac{1}{1 - \alpha}\left(F(\alpha\theta) - \alpha F(\theta) + \log E_p\!\left[e^{(\alpha - 1)k(x)}\right]\right)$$

and

$$D_\alpha(p : q) = \frac{J_{F,\alpha}(\theta : \theta')}{1 - \alpha},$$

where $J_{F,\alpha}(\theta : \theta') = \alpha F(\theta) + (1 - \alpha)F(\theta') - F(\alpha\theta + (1 - \alpha)\theta')$ is a Jensen difference divergence.
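As a concrete instance (a sketch, not the cited paper's notation): for the Gaussian family $N(0, \sigma^2)$ the Rényi entropy reduces to the closed form $h_\alpha = \log(\sigma\sqrt{2\pi}) + \frac{\log\alpha}{2(\alpha - 1)}$, which can be checked against direct numerical integration:

```python
import numpy as np

sigma, alpha = 1.5, 2.0

# Closed form for the differential Rényi entropy of N(0, sigma^2), in nats.
h_closed = np.log(sigma * np.sqrt(2 * np.pi)) + np.log(alpha) / (2 * (alpha - 1))

# Numerical check: h_alpha = log(integral of p(x)^alpha dx) / (1 - alpha).
x = np.linspace(-20 * sigma, 20 * sigma, 400_001)
dx = x[1] - x[0]
p = np.exp(-x ** 2 / (2 * sigma ** 2)) / (sigma * np.sqrt(2 * np.pi))
h_numeric = np.log(np.sum(p ** alpha) * dx) / (1 - alpha)

print(h_closed, h_numeric)   # the two values agree to high precision
```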
Physical meaning
The Rényi entropy in quantum physics is not considered to be an observable, due to its nonlinear dependence on the density matrix. (This nonlinear dependence applies even in the special case of the Shannon entropy.) It can, however, be given an operational meaning through the two-time measurements (also known as full counting statistics) of energy transfers[citation needed].

The limit of the quantum-mechanical Rényi entropy as $\alpha \to 1$ is the von Neumann entropy.
Notes
- ^ a b c Rényi (1961)
- ^ Rioul (2021)
- ^ Barros, Vanessa; Rousseau, Jérôme (2021-06-01). "Shortest Distance Between Multiple Orbits and Generalized Fractal Dimensions". Annales Henri Poincaré. 22 (6): 1853–1885. arXiv:1912.07516. Bibcode:2021AnHP...22.1853B. doi:10.1007/s00023-021-01039-y. ISSN 1424-0661. S2CID 209376774.
- ^ Franchini, Its & Korepin (2008)
- ^ Its & Korepin (2010)
- ^ RFC 4086, page 6
- ^ Bromiley, Thacker & Bouhova-Thacker (2004)
- ^ Beck & Schlögl (1993)
- ^ $H_1(X) \geq H_2(X)$ holds because $\sum_{i=1}^n p_i \log p_i \leq \log \sum_{i=1}^n p_i^2$, by Jensen's inequality applied to the concave logarithm.
- ^ $H_\infty(X) \leq H_2(X)$ holds because $\sum_{i=1}^n p_i^2 \leq \max_i p_i \cdot \sum_{i=1}^n p_i = \max_i p_i$.
- ^ $H_2(X) \leq 2H_\infty(X)$ holds because $\sum_{i=1}^n p_i^2 \geq (\max_i p_i)^2$.
- ^ Devroye, Luc; Györfi, László; Lugosi, Gábor (1996-04-04). A Probabilistic Theory of Pattern Recognition (Corrected ed.). New York, NY: Springer. ISBN 978-0-387-94618-4.
- ^ a b Van Erven, Tim; Harremoës, Peter (2014). "Rényi Divergence and Kullback–Leibler Divergence". IEEE Transactions on Information Theory. 60 (7): 3797–3820. arXiv:1206.2459. doi:10.1109/TIT.2014.2320500. S2CID 17522805.
- ^ a b Soklakov (2020)
- ^ Nielsen & Nock (2011)
References
- Beck, Christian; Schlögl, Friedrich (1993). Thermodynamics of chaotic systems: an introduction. Cambridge University Press. ISBN 0521433673.
- Jizba, P.; Arimitsu, T. (2004). "The world according to Rényi: Thermodynamics of multifractal systems". Annals of Physics. 312 (1): 17–59. arXiv:cond-mat/0207707. Bibcode:2004AnPhy.312...17J. doi:10.1016/j.aop.2004.01.002. S2CID 119704502.
- Jizba, P.; Arimitsu, T. (2004). "On observability of Rényi's entropy". Physical Review E. 69 (2): 026128. arXiv:cond-mat/0307698. Bibcode:2004PhRvE..69b6128J. doi:10.1103/PhysRevE.69.026128. PMID 14995541. S2CID 39231939.
- Bromiley, P.A.; Thacker, N.A.; Bouhova-Thacker, E. (2004), Shannon Entropy, Rényi Entropy, and Information, CiteSeerX 10.1.1.330.9856
- Franchini, F.; Its, A. R.; Korepin, V. E. (2008). "Rényi entropy as a measure of entanglement in quantum spin chain". Journal of Physics A: Mathematical and Theoretical. 41 (25302): 025302. arXiv:0707.2534. Bibcode:2008JPhA...41b5302F. doi:10.1088/1751-8113/41/2/025302. S2CID 119672750.
- "Rényi test", Encyclopedia of Mathematics, EMS Press, 2001 [1994]
- Hero, A. O.; Michael, O.; Gorman, J. (2002). Alpha-divergence for Classification, Indexing and Retrieval (PDF) (Technical Report CSPL-328). Communications and Signal Processing Laboratory, University of Michigan. CiteSeerX 10.1.1.373.2763.
- Its, A. R.; Korepin, V. E. (2010). "Generalized entropy of the Heisenberg spin chain". Theoretical and Mathematical Physics. 164 (3): 1136–1139. Bibcode:2010TMP...164.1136I. doi:10.1007/s11232-010-0091-6. S2CID 119525704.
- Nielsen, F.; Boltz, S. (2010). "The Burbea-Rao and Bhattacharyya centroids". IEEE Transactions on Information Theory. 57 (8): 5455–5466. arXiv:1004.5049. doi:10.1109/TIT.2011.2159046. S2CID 14238708.
- Nielsen, Frank; Nock, Richard (2012). "A closed-form expression for the Sharma–Mittal entropy of exponential families". Journal of Physics A. 45 (3): 032003. arXiv:1112.4221. Bibcode:2012JPhA...45c2003N. doi:10.1088/1751-8113/45/3/032003. S2CID 8653096.
- Nielsen, Frank; Nock, Richard (2011). "On Rényi and Tsallis entropies and divergences for exponential families". Journal of Physics A. 45 (3): 032003. arXiv:1105.3259. Bibcode:2012JPhA...45c2003N. doi:10.1088/1751-8113/45/3/032003. S2CID 8653096.
- Rényi, Alfréd (1961). "On measures of information and entropy" (PDF). Proceedings of the Fourth Berkeley Symposium on Mathematics, Statistics and Probability 1960. pp. 547–561.
- Rosso, O. A. (2006). "EEG analysis using wavelet-based information tools". Journal of Neuroscience Methods. 153 (2): 163–182. doi:10.1016/j.jneumeth.2005.10.009. PMID 16675027. S2CID 7134638.
- Zachos, C. K. (2007). "A classical bound on quantum entropy". Journal of Physics A. 40 (21): F407–F412. arXiv:hep-th/0609148. Bibcode:2007JPhA...40..407Z. doi:10.1088/1751-8113/40/21/F02. S2CID 1619604.
- Nazarov, Y. (2011). "Flows of Rényi entropies". Physical Review B. 84 (10): 205437. arXiv:1108.3537. Bibcode:2015PhRvB..91j4303A. doi:10.1103/PhysRevB.91.104303. S2CID 40312624.
- Ansari, Mohammad H.; Nazarov, Yuli V. (2015). "Rényi entropy flows from quantum heat engines". Physical Review B. 91 (10): 104303. arXiv:1408.3910. Bibcode:2015PhRvB..91j4303A. doi:10.1103/PhysRevB.91.104303. S2CID 40312624.
- Ansari, Mohammad H.; Nazarov, Yuli V. (2015). "Exact correspondence between Rényi entropy flows and physical flows". Physical Review B. 91 (17): 174307. arXiv:1502.08020. Bibcode:2015PhRvB..91q4307A. doi:10.1103/PhysRevB.91.174307. S2CID 36847902.
- Soklakov, A. N. (2020). "Economics of Disagreement – Financial Intuition for the Rényi Divergence". Entropy. 22 (8): 860. arXiv:1811.08308. Bibcode:2020Entrp..22..860S. doi:10.3390/e22080860. PMC 7517462. PMID 33286632.
- Ansari, Mohammad H.; van Steensel, Alwin; Nazarov, Yuli V. (2019). "Entropy Production in Quantum is Different". Entropy. 21 (9): 854. arXiv:1907.09241. doi:10.3390/e21090854. S2CID 198148019.
- Rioul, Olivier (2021). "This is IT: A Primer on Shannon's Entropy and Information" (PDF). Information Theory. Progress in Mathematical Physics. Vol. 78. Birkhäuser. pp. 49–86. doi:10.1007/978-3-030-81480-9_2. ISBN 978-3-030-81479-3. S2CID 204783328.