Information Theory and Statistics: A Tutorial
Author(s): Imre Csiszár and Paul C. Shields
Source: Foundations and Trends® in Communications and Information Theory, Volume 1, Number 4, pp. 417-528 (112 pages)
Publisher: Now Publishers
ISSN (Print): 1567-2190, ISSN (Online): 1567-2328
DOI: 10.1561/0100000004
Abstract:
This tutorial is concerned with applications of information theory
concepts in statistics, in the finite alphabet setting. The
information measure known as information divergence or
Kullback-Leibler distance or relative entropy plays a key role,
often with a geometric flavor as an analogue of squared Euclidean
distance, as in the concepts of I-projection, I-radius and
I-centroid. The topics covered include large deviations, hypothesis
testing, maximum likelihood estimation in exponential families,
analysis of contingency tables, and iterative algorithms with an
“information geometry” background. An introduction is also
provided to the theory of universal coding, and to statistical
inference via the minimum description length principle motivated by
that theory.
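For orientation, the information divergence named above has, in the finite alphabet setting, the standard textbook definition (the notation P, Q and the alphabet symbol below are the usual ones, not quoted from this abstract):

\[
  D(P \,\|\, Q) \;=\; \sum_{x \in \mathcal{X}} P(x) \log \frac{P(x)}{Q(x)},
\]

for probability distributions P and Q on a finite alphabet \(\mathcal{X}\), with the conventions \(0 \log 0 = 0\) and \(D(P\|Q) = +\infty\) whenever \(P(x) > 0\) for some x with \(Q(x) = 0\). The analogy with squared Euclidean distance mentioned in the abstract, and the derived notions of I-projection, I-radius and I-centroid, all refer to this quantity.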