
Extropy: complementary dual of entropy

SANFILIPPO, Giuseppe; AGRÒ, Gianna
2015-01-01

Abstract

This article provides a completion to theories of information based on entropy, resolving a longstanding question in its axiomatization as proposed by Shannon and pursued by Jaynes. We show that Shannon’s entropy function has a complementary dual function which we call “extropy”. The entropy and the extropy of a binary distribution are identical. However, the measure bifurcates into a pair of distinct measures for any quantity that is not merely an event indicator. As with entropy, the maximum extropy distribution is also the uniform distribution; and both measures are invariant with respect to permutations of their mass functions. However, they behave quite differently in their assessments of the refinement of a distribution, the axiom which concerned Shannon and Jaynes. Their duality is specified via the relationship among the entropies and extropies of coarse and fine partitions. We also analyze the extropy function for densities, showing that relative extropy constitutes a dual to the Kullback-Leibler divergence, widely recognized as the continuous entropy measure. These results are unified within the general structure of Bregman divergences. In this context they identify half the L2 metric as the extropic dual to the entropic directed distance. We describe a statistical application to the scoring of sequential forecast distributions which provoked the discovery.
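To make the bifurcation claimed in the abstract concrete, here is a minimal Python sketch, assuming the definitions used in the article (entropy H(p) = -Σ p_i log p_i and extropy J(p) = -Σ (1 - p_i) log(1 - p_i), with 0 log 0 taken as 0); the function names are illustrative only. It checks that the two measures coincide for a binary distribution, differ once there are three or more outcomes, and are both maximized by the uniform distribution.

import numpy as np

def entropy(p):
    # Shannon entropy H(p) = -sum_i p_i * log(p_i), with 0*log(0) = 0.
    p = np.asarray(p, dtype=float)
    nz = p > 0
    return -np.sum(p[nz] * np.log(p[nz]))

def extropy(p):
    # Extropy J(p) = -sum_i (1 - p_i) * log(1 - p_i), with 0*log(0) = 0.
    q = 1.0 - np.asarray(p, dtype=float)
    nz = q > 0
    return -np.sum(q[nz] * np.log(q[nz]))

# Binary distribution: the two measures are identical.
print(entropy([0.3, 0.7]), extropy([0.3, 0.7]))            # both ~0.6109

# Three or more outcomes: the measures bifurcate.
print(entropy([0.5, 0.3, 0.2]), extropy([0.5, 0.3, 0.2]))  # ~1.0297 vs ~0.7748

# Both are nonetheless maximized by the uniform distribution.
u = [0.25] * 4
print(entropy(u), extropy(u))  # log(4) ~ 1.3863 and 3*log(4/3) ~ 0.8630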
2015
Sector MAT/06 - Probability and Mathematical Statistics
Sector SECS-S/01 - Statistics
Sector ING-INF/05 - Information Processing Systems
Lad, F., Sanfilippo, G., Agrò, G. (2015). Extropy: complementary dual of entropy. Statistical Science, 30(1), 40-58. DOI: 10.1214/14-STS430.
Files in this product:

File: STS430.pdf (open access)
Description: Main article
Type: Publisher's version
Size: 2.16 MB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/10447/97512
Citations
  • PMC: not available
  • Scopus: 177
  • Web of Science: 167