Information Distances and Divergences for the Generalized Normal Distribution
Chapter 02 | Advances in Mathematics and Computer Science Vol. 3
The study of relative measures of information between two distributions that characterize an Input/Output System is important for investigating the informational ability and behaviour of that system. The most important measures of information distance and divergence are briefly presented and grouped. In Statistical Geometry, and for the study of statistical manifolds, relative measures of information are needed that are also distance metrics. The Hellinger distance metric is studied, providing a “compact” measure of informational “proximity” between two distributions. Certain formulations of the Hellinger distance between two generalized normal distributions are given and discussed. Some results for the Bhattacharyya distance are also given. Moreover, the symmetry of the Kullback-Leibler divergence between a generalized normal and a t-distribution is examined for this key measure of information divergence.
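The measures summarized above admit simple numerical cross-checks. The sketch below is not taken from the chapter; it is a minimal illustration, assuming SciPy's generalized normal (gennorm) and Student t (t) parameterizations, that approximates the Hellinger distance, the Bhattacharyya distance, and both directions of the Kullback-Leibler divergence by quadrature. The parameter values (beta, loc, scale, df) are illustrative assumptions only.

```python
# Illustrative sketch (not from the chapter): numerical evaluation of the
# information measures discussed above, for generalized normal densities.
# All parameter values below are assumptions chosen for demonstration.
import numpy as np
from scipy import integrate
from scipy.stats import gennorm, t


def bhattacharyya_coefficient(p, q):
    """BC(p, q) = integral of sqrt(p(x) q(x)) dx, computed by quadrature."""
    bc, _ = integrate.quad(lambda x: np.sqrt(p.pdf(x) * q.pdf(x)),
                           -np.inf, np.inf)
    return bc


def hellinger(p, q):
    """Hellinger distance H(p, q) = sqrt(1 - BC(p, q))."""
    # max(...) guards against tiny negative values from quadrature error.
    return np.sqrt(max(0.0, 1.0 - bhattacharyya_coefficient(p, q)))


def bhattacharyya(p, q):
    """Bhattacharyya distance D_B(p, q) = -ln BC(p, q)."""
    return -np.log(bhattacharyya_coefficient(p, q))


def kl(p, q):
    """Kullback-Leibler divergence KL(p || q) by quadrature."""
    integrand = lambda x: p.pdf(x) * (p.logpdf(x) - q.logpdf(x))
    val, _ = integrate.quad(integrand, -np.inf, np.inf)
    return val


# Two generalized normal laws (shape beta = 2 recovers the normal law).
p = gennorm(beta=2.0, loc=0.0, scale=1.0)
q = gennorm(beta=1.5, loc=0.5, scale=1.2)
print("Hellinger    H(p, q) =", hellinger(p, q))
print("Bhattacharyya D_B(p, q) =", bhattacharyya(p, q))

# KL divergence between a generalized normal and a Student t law is
# asymmetric in general: the two directions below differ.
r = t(df=5, loc=0.0, scale=1.0)
print("KL(p || t) =", kl(p, r))
print("KL(t || p) =", kl(r, p))
```

The two KL printouts differ, which is exactly the asymmetry whose behaviour the chapter examines for the generalized normal/t-distribution pair; the Hellinger value, by contrast, is a genuine distance metric and is the same in either argument order.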
Author(s) Details
Thomas L. Toulias
University of West Attica, Ag. Spyridonos Str. 28 (Campus 1), 12243 Egaleo, Athens, Greece.
Christos P. Kitsos
University of West Attica, Ag. Spyridonos Str. 28 (Campus 1), 12243 Egaleo, Athens, Greece.
Read full article: http://bp.bookpi.org/index.php/bpi/catalog/view/81/1127/802-1