Information Science and Statistics
This book is an outgrowth of ten years of research at the University of Florida Computational NeuroEngineering Laboratory (CNEL) in the general area of statistical signal processing and machine learning. One of the goals of writing the book is exactly to bridge the two fields that share so many common problems and techniques but are not yet effectively collaborating. Unlike other books that cover the state of the art in a given field, this book cuts across engineering (signal processing) and statistics (machine learning) with a common theme: learning seen from the point of view of information theory with an emphasis on Renyi's definition of information. The basic approach is to utilize the information theory descriptors of entropy and divergence as nonparametric cost functions for the design of adaptive systems in unsupervised or supervised training modes. Hence the title: Information-Theoretic Learning (ITL). In the course of these studies, we discovered that the main idea enabling a synergistic view as well as algorithmic implementations does not involve the conventional central moments of the data (mean and covariance). Rather, the core concept is the α-norm of the PDF, in particular its expected value (α = 2), which we call the information potential. This operator and related nonparametric estimators link information theory, optimization of adaptive systems, and reproducing kernel Hilbert spaces in a simple and unconventional way.
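To make the α = 2 case concrete: the information potential is V(X) = E[p(X)] = ∫ p²(x) dx, and Renyi's quadratic entropy is H₂(X) = −log V(X). In the ITL literature, V(X) is estimated nonparametrically with a Parzen window; with a Gaussian kernel, the convolution of two kernels of width σ is itself a Gaussian of width σ√2, so the estimate reduces to a double sum of pairwise kernel evaluations. The sketch below illustrates this estimator for 1-D data; the function names, the bandwidth σ, and the synthetic data are illustrative choices, not taken from the book.

```python
import numpy as np

def information_potential(x, sigma=1.0):
    """Parzen-window estimate of the quadratic information potential
    V(X) = E[p(X)] = integral of p(x)^2 dx, with a Gaussian kernel.

    Convolving two Gaussian kernels of width sigma gives a Gaussian of
    width sigma*sqrt(2), so the estimate is a mean of pairwise kernel
    evaluations over the samples.
    """
    x = np.asarray(x, dtype=float)
    n = x.shape[0]
    diffs = x[:, None] - x[None, :]      # all pairwise differences x_i - x_j
    s2 = 2.0 * sigma**2                  # variance of the convolved kernel
    kernel = np.exp(-diffs**2 / (2.0 * s2)) / np.sqrt(2.0 * np.pi * s2)
    return kernel.sum() / n**2

def renyi_quadratic_entropy(x, sigma=1.0):
    """Renyi's quadratic entropy H_2(X) = -log V(X)."""
    return -np.log(information_potential(x, sigma))

# Illustrative usage on synthetic data (sigma = 0.5 is a hypothetical choice).
samples = np.random.default_rng(0).normal(size=500)
print(renyi_quadratic_entropy(samples, sigma=0.5))
```

Because the estimator is a smooth function of the samples, its gradient with respect to system outputs can be computed in the same pairwise fashion, which is what allows entropy and divergence to serve as cost functions for adaptive systems as the text describes.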