184 pages, with figures and tables
The abstract concept of "information" can be quantified, and this has led to many important advances in the analysis of data in the empirical sciences. This text focuses on a philosophy of science based on "multiple working hypotheses" and on statistical models to represent them. The fundamental scientific question concerns the empirical evidence for the hypotheses in this set: a formal strength of evidence. Kullback-Leibler (K-L) information is the information lost when a model is used to approximate full reality. Hirotugu Akaike found a link between K-L information (a cornerstone of information theory) and the maximized log-likelihood (a cornerstone of mathematical statistics). This combination has become the basis for a new paradigm in model-based inference. The text advocates formal inference from all the hypotheses/models in the a priori set: multimodel inference.
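The link described above can be stated compactly. As a sketch using the standard forms (not quoted from the book): K-L information measures the distance from full reality f to an approximating model g, and Akaike's criterion estimates expected relative K-L information from the maximized log-likelihood, penalized by the number of estimated parameters K.

```latex
% K-L information lost when model g (parameters \theta) approximates reality f:
I(f, g) = \int f(x) \,\log\!\left( \frac{f(x)}{g(x \mid \theta)} \right) dx

% Akaike's information criterion, built from the maximized log-likelihood:
\mathrm{AIC} = -2 \log \mathcal{L}(\hat{\theta} \mid \mathrm{data}) + 2K
```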
This compelling approach allows a simple ranking of the science hypotheses and their models. Simple methods are introduced for computing the likelihood of model i, given the data; the probability of model i, given the data; and evidence ratios. These quantities represent a formal strength of evidence and are easy to compute and understand, given the estimated model parameters and associated quantities (e.g., residual sum of squares, maximized log-likelihood, and covariance matrices). Additional forms of multimodel inference include model averaging, unconditional variances, and ways to rank the relative importance of predictor variables.
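The quantities described above can be sketched in a few lines. The snippet below uses hypothetical AIC values for a three-model candidate set; the formulas (AIC differences, model likelihoods, Akaike weights, evidence ratios) are the standard ones associated with this approach.

```python
import math

# Hypothetical AIC values, one per model in the candidate set
aic = [102.4, 103.1, 108.9]

# AIC differences relative to the best (smallest-AIC) model
delta = [a - min(aic) for a in aic]

# Likelihood of each model, given the data
rel_lik = [math.exp(-d / 2) for d in delta]

# Probability of each model, given the data (Akaike weights)
total = sum(rel_lik)
weights = [lik / total for lik in rel_lik]

# Evidence ratio of the best model versus each model j
evidence_vs_best = [weights[0] / w for w in weights]
```

Because the weights sum to one, they can be read directly as model probabilities, and the evidence ratios quantify how strongly the data favor the best model over each alternative.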
From the reviews:

"… The writing style is pragmatic and appropriate for someone without advanced statistical training. Readers looking to recommend a book on information-criteria-based modeling to colleagues who are not statisticians, or looking to locate such a book for their libraries, are likely to be satisfied with this book." (Biometrics, December 2008, Brief Reports by the Editor)

"This … book provides an introduction to this approach of evidence-based inference. It is focused on advocating and teaching the approach. It includes some history and philosophy with the methods, and each chapter ends with exercises. … For those who are already familiar with model-based inference … it provides a more in-depth account of the information-theoretic approach. For those who are new to model-based inference, it provides a good conceptual and technical introduction." (Glenn Suter, Integrated Environmental Assessment and Management, Vol. 5 (2), 2009)

"Readership: Researchers and graduate students in ecology and other life sciences. This monograph expounds ideas that the author has developed over many years with Burnham. It is heavily example-based, and aimed at working scientists. Examples are predominately from ecological studies. … This is an interesting and challenging … book." (John H. Maindonald, International Statistical Review, Vol. 77 (3), 2009)

"… Presents an information-theoretic approach to statistical inference … Well motivated, clearly written, and thought provoking for its targeted readership. …" (The American Statistician, February 2010, Vol. 64, No. 1)
Contents:
- Introduction: science hypotheses and science philosophy
- Data and models
- Information theory and entropy
- Quantifying the evidence about science hypotheses
- Multimodel inference
- Advanced topics
- Summary