Elements of Information Theory – Thomas M. Cover, Joy A. Thomas – 2nd Edition


The latest edition of this classic is updated with new problem sets and material

The Second Edition of this fundamental textbook maintains the book's tradition of clear, thought-provoking instruction. Readers are once again provided with an instructive mix of mathematics, physics, statistics, and information theory.

All the essential topics in information theory are covered in detail, including entropy, data compression, channel capacity, rate distortion, network information theory, and hypothesis testing. The authors provide readers with a solid understanding of the underlying theory and applications. Problem sets and a telegraphic summary at the end of each chapter further assist readers, and the historical notes that follow each chapter recap the main points.
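Entropy, the first of the topics listed above, has a one-line definition: H(X) = -Σ p(x) log₂ p(x), measured in bits. As a small illustrative sketch (not taken from the book's own code, which it does not include), it can be computed directly:

```python
import math

def entropy(probs):
    """Shannon entropy in bits: H(X) = -sum p * log2(p), with 0*log(0) = 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))  # fair coin: 1.0 bit
print(entropy([0.9, 0.1]))  # biased coin: about 0.469 bits
```

A fair coin yields the maximum of 1 bit per toss, while a biased coin carries less information per toss, which is what makes compression of its outcomes possible.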

The Second Edition features:
* Chapters reorganized to improve teaching
* 200 new problems
* New material on source coding, portfolio theory, and feedback capacity
* Updated references

Now current and enhanced, the Second Edition of Elements of Information Theory remains the ideal textbook for upper-level undergraduate and graduate courses in electrical engineering, statistics, and telecommunications.

Table of Contents

1. Introduction and Preview.
2. Entropy, Relative Entropy, and Mutual Information.
3. Asymptotic Equipartition Property.
4. Entropy Rates of a Stochastic Process.
5. Data Compression.
6. Gambling and Data Compression.
7. Channel Capacity.
8. Differential Entropy.
9. Gaussian Channel.
10. Rate Distortion Theory.
11. Information Theory and Statistics.
12. Maximum Entropy.
13. Universal Source Coding.
14. Kolmogorov Complexity.
15. Network Information Theory.
16. Information Theory and Portfolio Theory.
17. Inequalities in Information Theory.
