This book provides a basic introduction to both information theory and data compression. Although the two topics are related, this unique treatment allows readers to explore either topic independently. The authors' presentation of information theory is pitched at an elementary level, making the book less daunting than most other texts. The second edition includes a detailed history of information theory that provides a solid background for the quantification of information as developed by Claude Shannon. It also covers the information rate of a code and the trade-off between error correction and rate of information transmission, probabilistic finite state source automata, and wavelet methods.

An effective blend of carefully explained theory and practical applications, this text imparts the fundamentals of both information theory and data compression. Although the two topics are related, this unique text allows either topic to be presented independently, and it was specifically designed so that the data compression section requires no prior knowledge of information theory.

The treatment of information theory, while theoretical and abstract, is quite elementary, making this text less daunting than many others. After presenting the fundamental definitions and results of the theory, the authors then apply the theory to memoryless, discrete channels with zeroth-order, one-state sources.

The chapters on data compression acquaint students with a myriad of lossless compression methods and then introduce two lossy compression methods. Students emerge from this study competent in a wide range of techniques. The authors' presentation is highly practical but includes some important proofs, either in the text or in the exercises, so instructors can, if they choose, place more emphasis on the mathematics.

Introduction to Information Theory and Data Compression, Second Edition is ideally suited for an upper-level undergraduate or graduate course for students in mathematics, engineering, and computer science.

Features:

  • Expanded discussion of the historical and theoretical basis of information theory that builds a firm, intuitive grasp of the subject
  • Reorganization of theoretical results along with new exercises, ranging from the routine to the more difficult, that reinforce students' ability to apply the definitions and results in specific situations
  • Simplified treatment of the algorithm(s) of Gallager and Knuth
  • Discussion of the information rate of a code and the trade-off between error correction and information rate
  • Treatment of probabilistic finite state source automata, including basic results