From
Grand Eagle Retail, Bensenville, IL, U.S.A.
Seller rating 5 out of 5 stars
AbeBooks Seller since 12 October 2005
Paperback. The Akaike information criterion (AIC), derived as an estimator of the Kullback-Leibler information discrepancy, provides a useful tool for evaluating statistical models, and numerous successful applications of the AIC have been reported in various fields of natural sciences, social sciences and engineering. One of the main objectives of this book is to provide comprehensive explanations of the concepts and derivations of the AIC and related criteria, including Schwarz's Bayesian information criterion (BIC), together with a wide range of practical examples of model selection and evaluation criteria. A secondary objective is to provide a theoretical basis for the analysis and extension of information criteria via a statistical functional approach. A generalized information criterion (GIC) and a bootstrap information criterion are presented, which provide unified tools for modeling and model evaluation for a diverse range of models, including various types of nonlinear models and model estimation procedures such as robust estimation, the maximum penalized likelihood method and a Bayesian approach. This brilliantly structured and comprehensive volume provides exhaustive explanations of the concepts and philosophy of statistical modeling, together with a wide range of practical and numerical examples. Shipping may be from multiple locations in the US or from the UK, depending on stock availability. Seller Inventory # 9781441924568
Statistical modeling is a critical tool in scientific research. Statistical models are used to understand phenomena with uncertainty, to determine the structure of complex systems, and to control such systems, as well as to make reliable predictions in various natural and social science fields. This book provides comprehensive explanations of the concepts and philosophy of statistical modeling, together with a wide range of practical and numerical examples. We hope that this book will be of great value not just to statisticians but also to researchers and practitioners in various fields of research such as information science, computer science, engineering, bioinformatics, economics, marketing and environmental science.
From the Back Cover:
Winner of the 2009 Japan Statistical Association Publication Prize.
The Akaike information criterion (AIC) derived as an estimator of the Kullback-Leibler information discrepancy provides a useful tool for evaluating statistical models, and numerous successful applications of the AIC have been reported in various fields of natural sciences, social sciences and engineering.
One of the main objectives of this book is to provide comprehensive explanations of the concepts and derivations of the AIC and related criteria, including Schwarz’s Bayesian information criterion (BIC), together with a wide range of practical examples of model selection and evaluation criteria. A secondary objective is to provide a theoretical basis for the analysis and extension of information criteria via a statistical functional approach. A generalized information criterion (GIC) and a bootstrap information criterion are presented, which provide unified tools for modeling and model evaluation for a diverse range of models, including various types of nonlinear models and model estimation procedures such as robust estimation, the maximum penalized likelihood method and a Bayesian approach.
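As a small illustration of the two criteria described above (not code from the book), the sketch below scores Gaussian polynomial regressions of increasing degree by AIC and BIC. The data, the candidate degrees, and the helper `gaussian_aic_bic` are all invented for this example; both criteria are computed from the maximized Gaussian log-likelihood, with BIC replacing AIC's penalty of 2 per parameter by log n.

```python
import numpy as np

def gaussian_aic_bic(y, y_hat, k):
    """AIC and BIC for a Gaussian regression model with k free parameters
    (the coefficients plus the error variance), using the maximized
    log-likelihood evaluated at the MLE variance rss / n."""
    n = len(y)
    rss = np.sum((y - y_hat) ** 2)
    loglik = -0.5 * n * (np.log(2 * np.pi * rss / n) + 1)
    aic = -2 * loglik + 2 * k
    bic = -2 * loglik + np.log(n) * k
    return aic, bic

# Noisy data from a quadratic trend; compare candidate polynomial degrees.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 50)
y = 1.0 + 2.0 * x - 3.0 * x**2 + rng.normal(0.0, 0.1, x.size)

results = {}
for degree in (1, 2, 5):
    coeffs = np.polyfit(x, y, degree)
    y_hat = np.polyval(coeffs, x)
    k = degree + 2  # degree + 1 polynomial coefficients, plus the error variance
    results[degree] = gaussian_aic_bic(y, y_hat, k)
    print(f"degree {degree}: AIC = {results[degree][0]:.1f}, BIC = {results[degree][1]:.1f}")
```

On data like these, both criteria favor the true quadratic over the underfit linear model; because BIC's log n penalty exceeds AIC's fixed penalty once n > 7 or so, BIC penalizes the overparameterized degree-5 fit more heavily.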
Sadanori Konishi is Professor in the Faculty of Mathematics at Kyushu University. His primary research interests are in multivariate analysis, statistical learning, pattern recognition and nonlinear statistical modeling. He is the editor of the Bulletin of Informatics and Cybernetics and is co-author of several Japanese books. He was awarded the Japan Statistical Society Prize in 2004 and is a Fellow of the American Statistical Association.
Genshiro Kitagawa is Director-General of the Institute of Statistical Mathematics and Professor of Statistical Science at the Graduate University for Advanced Studies. His primary interests are in time series analysis, non-Gaussian nonlinear filtering and statistical modeling. He is the executive editor of the Annals of the Institute of Statistical Mathematics and co-author of Smoothness Priors Analysis of Time Series, Akaike Information Criterion Statistics, and several Japanese books. He was awarded the Japan Statistical Society Prize in 1997 and the Ishikawa Prize in 1999, and is a Fellow of the American Statistical Association.
Title: Information Criteria and Statistical ...
Publisher: Springer-Verlag New York Inc., New York, NY
Publication Date: 2010
Binding: Paperback
Condition: New
Edition: 1st Edition
Seller: moluna, Greven, Germany
Condition: New. This is a print-on-demand item and will be printed for you after your order. With the development of modeling techniques, it has become necessary to construct model selection criteria that relax the assumptions imposed by the AIC and BIC. Statistical modeling is a critical tool in scientific research. This book provides comprehensive explanations of the concepts and philosophy of statistical modeling. Seller Inventory # 4172970
Quantity: Over 20 available
Seller: Lucky's Textbooks, Dallas, TX, U.S.A.
Condition: New. Seller Inventory # ABLIING23Mar2411530294171
Quantity: Over 20 available
Seller: GreatBookPrices, Columbia, MD, U.S.A.
Condition: New. Seller Inventory # 12688350-n
Quantity: Over 20 available
Seller: GreatBookPricesUK, Woodford Green, United Kingdom
Condition: New. Seller Inventory # 12688350-n
Quantity: Over 20 available
Seller: Ria Christie Collections, Uxbridge, United Kingdom
Condition: New. In. Seller Inventory # ria9781441924568_new
Quantity: Over 20 available
Seller: BuchWeltWeit Ludwig Meier e.K., Bergisch Gladbach, Germany
Paperback. Condition: New. This item is printed on demand, which takes 3-4 days longer. New stock. Publisher's description as above. 288 pp. English. Seller Inventory # 9781441924568
Quantity: 2 available
Seller: buchversandmimpf2000, Emtmannsberg, BAYE, Germany
Paperback. Condition: New. This item is a print-on-demand title. New stock. Publisher's description as above. Springer Verlag GmbH, Tiergartenstr. 17, 69121 Heidelberg. 288 pp. English. Seller Inventory # 9781441924568
Quantity: 1 available
Seller: AHA-BUCH GmbH, Einbeck, Germany
Paperback. Condition: New. Print on demand; new stock printed after ordering. Publisher's description as above. Seller Inventory # 9781441924568
Quantity: 1 available
Seller: California Books, Miami, FL, U.S.A.
Condition: New. Seller Inventory # I-9781441924568
Quantity: Over 20 available
Seller: GreatBookPrices, Columbia, MD, U.S.A.
Condition: As New. Unread book in perfect condition. Seller Inventory # 12688350
Quantity: Over 20 available