The first comprehensive introduction to information theory, this book places the work begun by Shannon and continued by McMillan, Feinstein, and Khinchin on a rigorous mathematical basis.
For the first time, mathematicians, statisticians, physicists, cyberneticists, and communications engineers are offered a lucid, comprehensive introduction to this rapidly growing field.
In his first paper, Dr. Khinchin develops the concept of entropy in probability theory as a measure of the uncertainty of a finite scheme, and discusses a simple application to coding theory.
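The entropy of a finite scheme — a complete set of mutually exclusive events with probabilities summing to one — can be sketched in a few lines of Python. This example is illustrative only and not drawn from the book; it computes Shannon's measure H = −Σ pᵢ log₂ pᵢ in bits:

```python
import math

def entropy(probs):
    """Shannon entropy (in bits) of a finite scheme: a complete set
    of mutually exclusive events whose probabilities sum to 1."""
    if not math.isclose(sum(probs), 1.0):
        raise ValueError("probabilities must sum to 1")
    # Zero-probability events contribute nothing (0 * log 0 is taken as 0).
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is the most uncertain two-event scheme:
print(entropy([0.5, 0.5]))   # 1.0 bit
# A biased coin carries less uncertainty:
print(entropy([0.9, 0.1]))   # roughly 0.469 bits
```

The value is largest when all outcomes are equally likely and falls to zero when one outcome is certain, which is exactly the behavior required of a measure of uncertainty.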
The second paper investigates the restrictions previously placed on the study of sources, channels, and codes and attempts to give a complete, detailed proof of both ... Shannon theorems, assuming any ergodic source and any stationary channel with a finite memory.
Partial Contents: I. The Entropy Concept in Probability Theory – Entropy of Finite Schemes.