Asymptotic Estimates in Information Theory with Non-Vanishing Error Probabilities
Author(s): Vincent Y. F. Tan
Source: Foundations and Trends® in Communications and Information Theory, Volume 11, Numbers 1-2, pp. 1-184
Publisher: Now Publishers
ISSN (Print): 1567-2190; ISSN (Online): 1567-2328
DOI: 10.1561/0100000086
Abstract:
This monograph presents a unified treatment of single- and multi-user problems in Shannon's information theory in which we depart from the
requirement that the error probability decays to zero as the blocklength grows. Instead, the error probabilities for the various problems are
bounded above by a non-vanishing constant, and attention is focused on the achievable coding rates as functions of the growing blocklength.
This constitutes the study of asymptotic estimates with non-vanishing error probabilities.
In Part I, after reviewing the fundamentals of information theory, we discuss Strassen's seminal result for binary hypothesis testing,
in which the type-I error probability is non-vanishing and the rate of decay of the type-II error probability with the growing number of
independent observations is characterized. In Part II, we use this basic hypothesis testing result to develop second- and, in some cases,
third-order asymptotic expansions for point-to-point communication. In Part III, we consider network information theory problems
for which the second-order asymptotics are known. These include certain classes of channels with random state, the multiple-encoder
distributed lossless source coding (Slepian-Wolf) problem, and special cases of the Gaussian interference and multiple-access channels.
Finally, we discuss avenues for further research.
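As a brief illustration of the two central expansions mentioned above (a sketch in standard notation, not part of the original abstract): writing D(P||Q) and V(P||Q) for the relative entropy and relative entropy variance, C and V for channel capacity and dispersion, and Φ for the standard normal CDF, the results take the following second-order forms.

```latex
% Strassen's expansion (Part I): with n i.i.d. observations and type-I
% error probability at most \epsilon, the smallest attainable type-II
% error probability \beta^*_n(\epsilon) satisfies
-\log \beta^*_n(\epsilon)
  = n D(P\|Q) + \sqrt{n V(P\|Q)}\,\Phi^{-1}(\epsilon) + O(\log n).

% Normal approximation for channel coding (Part II): the maximum code
% size M^*(n,\epsilon) on a discrete memoryless channel with capacity C
% and dispersion V, at average error probability \epsilon, satisfies
\log M^*(n,\epsilon)
  = nC + \sqrt{nV}\,\Phi^{-1}(\epsilon) + O(\log n).
```

Since Φ⁻¹(ε) < 0 for ε < 1/2, the √n term is a backoff from the first-order quantities nD(P||Q) and nC at small error probabilities; the O(log n) remainder is what the third-order analysis refines.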