This work covers the philosophy of model-based data analysis and provides an omnibus strategy for the analysis of empirical data. It introduces information-theoretic approaches and focuses critical attention on a priori modelling and the selection of a good approximating model that best represents the inference supported by the data. Kullback-Leibler information is a fundamental quantity in science and is Hirotugu Akaike's basis for model selection. The maximized log-likelihood function can be bias-corrected to yield an estimate of expected, relative Kullback-Leibler information. This leads to Akaike's Information Criterion (AIC) and various extensions. The information-theoretic approaches seek to provide a unified theory, an extension of likelihood theory, bringing model selection and parameter estimation under a common framework: optimization. The value of AIC is computed for each a priori model under consideration, and the model with the minimum AIC is used for statistical inference.
However, the paradigm described in the book goes beyond the computation and interpretation of AIC to select a parsimonious model for inference from empirical data; it refocuses attention on the range of considerations and the a priori modelling that should precede the actual analysis of the data.
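As a concrete illustration of the selection step described above, the sketch below computes AIC = -2 ln(L) + 2K for two candidate regression models and selects the one with the minimum value. The data, the model names, and the Gaussian-error least-squares setup are illustrative assumptions, not examples taken from the book.

```python
import numpy as np

def aic(log_likelihood, k):
    """Akaike's Information Criterion: AIC = -2 ln(L) + 2K,
    where K counts all estimated parameters (including sigma^2)."""
    return -2.0 * log_likelihood + 2 * k

def gaussian_loglik(residuals):
    """Maximized log-likelihood of a linear model with i.i.d. Gaussian
    errors, evaluated with sigma^2 at its MLE (RSS / n)."""
    n = residuals.size
    sigma2 = np.sum(residuals**2) / n
    return -0.5 * n * (np.log(2.0 * np.pi * sigma2) + 1.0)

# Hypothetical data: a noisy quadratic trend.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 50)
y = 1.0 + 2.0 * x - 1.5 * x**2 + rng.normal(scale=0.1, size=x.size)

# Two a priori candidate models, specified before looking at the fits.
candidates = {
    "linear": np.column_stack([np.ones_like(x), x]),
    "quadratic": np.column_stack([np.ones_like(x), x, x**2]),
}

results = {}
for name, X in candidates.items():
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)  # least-squares fit
    resid = y - X @ beta
    k = X.shape[1] + 1  # regression coefficients plus sigma^2
    results[name] = aic(gaussian_loglik(resid), k)

best = min(results, key=results.get)  # minimum-AIC model is selected
print(results, "-> selected:", best)
```

Among the extensions mentioned above is the small-sample correction AICc = AIC + 2K(K + 1)/(n - K - 1), often recommended when the sample size n is small relative to K.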
- ISBN-10 0387985042
- ISBN-13 9780387985046
- Publish Date 30 November 1998
- Publish Status Out of Print
- Out of Print Date 22 March 2008
- Publish Country US
- Imprint Springer-Verlag New York Inc.
- Format Hardcover
- Pages 320
- Language English