Concentration of Measure Inequalities in Information Theory, Communications, and Coding
Author(s): Maxim Raginsky and Igal Sason
Journal: Foundations and Trends® in Communications and Information Theory
ISSN Print: 1567-2190, ISSN Online: 1567-2328
Publisher: Now Publishers
Volume 10, Number 1-2
Pages: 250 (1-247)
DOI: 10.1561/0100000064
Abstract:
Concentration inequalities have been the subject of exciting developments during the last two decades, and have been intensively studied
and used as a powerful tool in various areas. These include convex geometry, functional analysis, statistical physics, mathematical
statistics, pure and applied probability theory (e.g., concentration of measure phenomena in random graphs, random matrices, and percolation),
information theory, theoretical computer science, learning theory, and dynamical systems.
This monograph focuses on some of the key modern mathematical tools that are used for the derivation of concentration inequalities, on
their links to information theory, and on their various applications to communications and coding. In addition to being a survey, this
monograph also includes several new results recently derived by the authors.
The first part of the monograph introduces classical concentration inequalities for martingales, as
well as some recent refinements and extensions. The power and versatility of the martingale approach are exemplified in the context of codes
defined on graphs and iterative decoding algorithms, as well as codes for wireless communication.
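As a point of reference (a standard textbook statement, not a new result of the monograph), the Azuma-Hoeffding inequality is representative of the martingale bounds developed in this part: if $\{X_k, \mathcal{F}_k\}_{k=0}^{n}$ is a martingale with bounded differences $|X_k - X_{k-1}| \le d_k$ almost surely, then for every $r \ge 0$,
$$ \Pr\bigl(|X_n - X_0| \ge r\bigr) \;\le\; 2\exp\!\left(-\frac{r^2}{2\sum_{k=1}^{n} d_k^2}\right). $$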
The second part of the monograph introduces the entropy method, an information-theoretic technique for deriving concentration inequalities
for functions of many independent random variables. The basic ingredients of the entropy method are discussed first in conjunction with the
closely related topic of logarithmic Sobolev inequalities, which are typical of the so-called functional approach to studying the
concentration of measure phenomenon. The discussion on logarithmic Sobolev inequalities is complemented by a related viewpoint based on
probability in metric spaces. This viewpoint centers on the so-called transportation-cost inequalities, whose roots are in information
theory. Some representative results on concentration for dependent random variables are briefly summarized, with emphasis on their connections
to the entropy method. Finally, we discuss several applications of the entropy method and related information-theoretic tools to problems in
communications and coding. These include strong converses, empirical distributions of good channel codes with non-vanishing error probability,
and an information-theoretic converse for concentration of measure.
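For illustration (standard statements quoted here for orientation, not results specific to the monograph), two representative inequalities of the kinds discussed above are the Gaussian logarithmic Sobolev inequality and Pinsker's inequality, the latter being the simplest transportation-cost inequality with information-theoretic roots: for the standard Gaussian measure $\gamma_n$ on $\mathbb{R}^n$ and every smooth $f:\mathbb{R}^n \to \mathbb{R}$,
$$ \mathbb{E}_{\gamma_n}\!\bigl[f^2 \ln f^2\bigr] - \mathbb{E}_{\gamma_n}\!\bigl[f^2\bigr] \ln \mathbb{E}_{\gamma_n}\!\bigl[f^2\bigr] \;\le\; 2\,\mathbb{E}_{\gamma_n}\!\bigl[\|\nabla f\|^2\bigr], $$
and for any two probability measures $P$ and $Q$ on a common space,
$$ \|P - Q\|_{\mathrm{TV}} \;\le\; \sqrt{\tfrac{1}{2}\, D(P\|Q)}, $$
where $D(P\|Q)$ denotes the relative entropy measured in nats.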