Foundations and Trends® in Communications and Information Theory
Information Combining is an introduction to the principles of information combining. The concept is described, the bounds for repetition codes and for single parity-check codes are proved, and some applications are provided. As the focus is on basic principles, the treatment is restricted to a binary symmetric source, binary linear channel codes, and binary-input symmetric memoryless channels.
The authors first introduce the concept of mutual information profiles and revisit the well-known Jensen's inequality. Using these tools, the bounds on information combining are derived for single parity-check codes and for repetition codes. The application of the bounds is illustrated in four examples.
This is an excellent tutorial on this important subject for students, researchers, and professionals working in communications and information theory.
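As a rough numerical illustration of the kind of bounds the monograph derives (a minimal sketch, not taken from the text), the Python snippet below evaluates the BEC- and BSC-extremal values of the combined mutual information for a length-2 repetition code and of the extrinsic information for a length-3 single parity-check code, given the per-channel mutual informations. The function names (`h2`, `rep2_bounds`, `spc3_bounds`) and the test point (two channels with 0.531 bits each) are illustrative choices, not the authors' notation.

```python
import numpy as np

def h2(p):
    """Binary entropy function in bits; h2(0) = h2(1) = 0."""
    p = np.clip(p, 1e-12, 1 - 1e-12)
    return -p * np.log2(p) - (1 - p) * np.log2(1 - p)

def h2_inv(y):
    """Inverse of h2 on [0, 0.5], computed by bisection."""
    lo, hi = 0.0, 0.5
    for _ in range(60):
        mid = (lo + hi) / 2
        if h2(mid) < y:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

def rep2_bounds(i1, i2):
    """Bounds on I(X; Y1, Y2) for a length-2 repetition code, given the
    per-channel mutual informations i1, i2: BECs give the upper value,
    BSCs with the same mutual information give the lower value."""
    upper = i1 + i2 - i1 * i2                      # two BECs
    e1, e2 = h2_inv(1 - i1), h2_inv(1 - i2)        # matching BSC crossover probs
    conv = e1 * (1 - e2) + e2 * (1 - e1)           # binary convolution of error probs
    lower = 1 + h2(conv) - h2(e1) - h2(e2)         # two BSCs
    return lower, upper

def spc3_bounds(i2, i3):
    """Bounds on the extrinsic information I(X1; Y2, Y3) for a length-3
    single parity-check code; the roles of BEC and BSC are swapped."""
    lower = i2 * i3                                # two BECs
    e2, e3 = h2_inv(1 - i2), h2_inv(1 - i3)
    conv = e2 * (1 - e3) + e3 * (1 - e2)
    upper = 1 - h2(conv)                           # two BSCs
    return lower, upper

if __name__ == "__main__":
    I1 = I2 = 1 - h2(0.1)                          # about 0.531 bits per channel
    print("repetition code:", rep2_bounds(I1, I2)) # ~ (0.742, 0.780)
    print("parity-check code:", spc3_bounds(I1, I2))  # ~ (0.282, 0.320)
```

The sketch only evaluates the two extremal channel families; the monograph's contribution is the proof that, for fixed per-channel mutual informations, every binary-input symmetric memoryless channel yields a value between these extremes.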