Foundations and Trends® in Communications and Information Theory
2 total works
Random Matrix Theory and Wireless Communications
by Antonia Tulino and Sergio Verdú
Published 1 January 2004
Random matrix theory has found many applications in physics, statistics and engineering since its inception. Although early developments were motivated by practical experimental problems, random matrices are now used in fields as diverse as the Riemann hypothesis, stochastic differential equations, condensed matter physics, statistical physics, chaotic systems, numerical linear algebra, neural networks, multivariate statistics, information theory, signal processing and small-world networks.
This is the first tutorial on random matrices that provides an overview of the theory and brings together in one source the most significant recent results. Furthermore, the application of random matrix theory to the fundamental limits of wireless communication channels is described in depth. The authors have created a uniquely comprehensive work that gives the reader a full understanding of the foundations of random matrix theory and illustrates current trends in its application, particularly to wireless communications.
Random Matrix Theory and Wireless Communications is a valuable resource for all students and researchers working on the cutting edge of wireless communications.
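As a taste of the phenomenon the monograph analyzes, the following minimal sketch (illustrative Python, not material from the book; the function name, SNR value and trial counts are arbitrary choices) evaluates the MIMO mutual information log det(I + (SNR/n_t) H H^H) for an i.i.d. Rayleigh channel and shows it concentrating, per receive antenna, around a deterministic limit as the antenna arrays grow:

```python
# A minimal sketch (not taken from the monograph) of the random-matrix effect
# it studies: the mutual information of an i.i.d. Rayleigh MIMO channel,
#     I = log det(I + (SNR / n_t) * H H^H)   [nats],
# concentrates around a deterministic per-antenna limit as the arrays grow.
import numpy as np

def mimo_mi_per_rx_antenna(n_t: int, n_r: int, snr: float, rng) -> float:
    """Mutual information per receive antenna (nats) for one channel draw."""
    # H has i.i.d. CN(0, 1) entries (Rayleigh fading).
    h = (rng.standard_normal((n_r, n_t))
         + 1j * rng.standard_normal((n_r, n_t))) / np.sqrt(2)
    gram = (snr / n_t) * (h @ h.conj().T)
    _, logdet = np.linalg.slogdet(np.eye(n_r) + gram)  # det is real, positive
    return logdet / n_r

rng = np.random.default_rng(0)
for n in (4, 16, 64, 256):
    trials = [mimo_mi_per_rx_antenna(n, n, snr=10.0, rng=rng) for _ in range(50)]
    # The mean settles and the spread shrinks: a deterministic limit emerges.
    print(f"n={n:3d}  mean={np.mean(trials):.3f}  std={np.std(trials):.4f}")
```

This self-averaging behavior is what allows random matrix theory to replace channel-dependent performance measures with deterministic asymptotic formulas.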
Universal Estimation of Information Measures for Analog Sources
by Qing Wang, Sanjeev Kulkarni, and Sergio Verdú
Published 27 May 2009
Entropy, mutual information and divergence measure the randomness, dependence and dissimilarity, respectively, of random objects. In addition to their prominent role in information theory, they have found numerous applications in, among other fields, probability theory, statistics, physics, chemistry, molecular biology, ecology, bioinformatics, neuroscience, machine learning, linguistics, and finance. Many of these applications require a universal estimate of information measures that does not assume knowledge of the statistical properties of the observed data. Over the past few decades, several nonparametric algorithms have been proposed to estimate information measures.
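For reference, in the continuous (analog) setting treated here, the three measures have the standard forms below, where f_{XY} is a joint density with marginals f_X and f_Y, and f and g are densities on a common space:

$$h(X) = -\int f_X(x)\,\log f_X(x)\,dx$$

$$I(X;Y) = \iint f_{XY}(x,y)\,\log\frac{f_{XY}(x,y)}{f_X(x)\,f_Y(y)}\,dx\,dy$$

$$D(f\,\|\,g) = \int f(x)\,\log\frac{f(x)}{g(x)}\,dx$$

Universal estimation asks for consistent estimates of these quantities from samples alone, without knowledge of the underlying densities.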
Universal Estimation of Information Measures for Analog Sources presents a comprehensive survey of universal estimation of information measures for memoryless analog (real-valued or real vector-valued) sources, with an emphasis on the estimation of mutual information and divergence and their applications. The book reviews the consistency of the universal algorithms, the corresponding sufficient conditions, and their speed of convergence.
Universal Estimation of Information Measures for Analog Sources provides a comprehensive review of an increasingly important topic in information theory. It will be of interest to students, practitioners and researchers working in the field.
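To make the subject concrete, here is a minimal sketch (illustrative Python, not code from the book; the function and variable names are hypothetical) of one classical universal estimator in this literature, the Kozachenko-Leonenko k-nearest-neighbor estimator of differential entropy:

```python
# A minimal sketch (not code from the book) of one classical universal
# estimator of differential entropy: the Kozachenko-Leonenko k-nearest-
# neighbor estimator. It assumes only i.i.d. samples from a density.
import numpy as np
from scipy.spatial import cKDTree
from scipy.special import digamma, gammaln

def kl_entropy(samples: np.ndarray, k: int = 3) -> float:
    """Kozachenko-Leonenko k-NN estimate of differential entropy, in nats.

    samples: (n, d) array of i.i.d. draws from an unknown density.
    """
    n, d = samples.shape
    # Distance of each point to its k-th nearest neighbor; query k+1 points
    # because the closest "neighbor" of each sample is the sample itself.
    dists, _ = cKDTree(samples).query(samples, k=k + 1)
    rho = dists[:, -1]
    # log-volume of the d-dimensional unit ball, pi^(d/2) / Gamma(d/2 + 1)
    log_vd = 0.5 * d * np.log(np.pi) - gammaln(0.5 * d + 1)
    return digamma(n) - digamma(k) + log_vd + d * np.mean(np.log(rho))

# Sanity check: a standard Gaussian in d dimensions has differential
# entropy (d/2) * log(2 * pi * e), about 2.838 nats for d = 2.
rng = np.random.default_rng(0)
x = rng.standard_normal((5000, 2))
print(kl_entropy(x))  # should be close to 2.838
```

Whether such an estimator is consistent, under what conditions on the density, and how fast it converges are exactly the questions the survey addresses.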