Seller: GreatBookPrices, Columbia, MD, U.S.A.
Condition: New.
Seller: Ria Christie Collections, Uxbridge, United Kingdom
£ 187.59
Quantity: Over 20 available
Condition: New.
Seller: GreatBookPricesUK, Woodford Green, United Kingdom
£ 186.46
Quantity: Over 20 available
Condition: New.
Seller: preigu, Osnabrück, Germany
Paperback. Condition: New. The Nature of Statistical Learning Theory | Vladimir Vapnik | Paperback | Information Science and Statistics | xx | English | 2010 | Humana | EAN 9781441931603 | Responsible person for the EU: Springer Verlag GmbH, Tiergartenstr. 17, 69121 Heidelberg, juergen[dot]hartmann[at]springer[dot]com | Seller: preigu.
Seller: AHA-BUCH GmbH, Einbeck, Germany
Paperback. Condition: New. Print on demand; new stock, printed after ordering. The aim of this book is to discuss the fundamental ideas which lie behind the statistical theory of learning and generalization. It considers learning as a general problem of function estimation based on empirical data. Omitting proofs and technical details, the author concentrates on discussing the main results of learning theory and their connections to fundamental problems in statistics. These include: the setting of learning problems based on the model of minimizing the risk functional from empirical data; a comprehensive analysis of the empirical risk minimization principle, including necessary and sufficient conditions for its consistency; non-asymptotic bounds for the risk achieved using the empirical risk minimization principle; principles for controlling the generalization ability of learning machines using small sample sizes based on these bounds; and the Support Vector methods that control the generalization ability when estimating functions from small sample sizes. The second edition of the book contains three new chapters devoted to further development of the learning theory and SVM techniques. These include: the theory of direct methods of learning based on solving multidimensional integral equations for density, conditional probability, and conditional density estimation; and a new inductive principle of learning. Written in a readable and concise style, the book is intended for statisticians, mathematicians, physicists, and computer scientists. Vladimir N. Vapnik is Technology Leader at AT&T Labs-Research and Professor at London University. He is one of the founders of statistical learning theory.
Seller: GreatBookPricesUK, Woodford Green, United Kingdom
£ 293.99
Quantity: Over 20 available
Condition: As New. Unread book in perfect condition.
Seller: Mispah books, Redhill, Surrey, United Kingdom
Paperback. Condition: Like New.
Seller: GreatBookPrices, Columbia, MD, U.S.A.
Condition: As New. Unread book in perfect condition.
Condition: New. This is a print-on-demand item.
Seller: moluna, Greven, Germany
Condition: New. This is a print-on-demand item and will be printed for you after you order. The aim of this book is to discuss the fundamental ideas which lie behind the statistical theory of learning and generalization. Written in a readable and concise style and devoted to key learning problems, the book is intended for statisticians, mathematicians, physicists, and computer scientists.
Language: English
Published by Springer, Humana, Dec 2010
ISBN 10: 1441931600 ISBN 13: 9781441931603
Seller: BuchWeltWeit Ludwig Meier e.K., Bergisch Gladbach, Germany
Paperback. Condition: New. This item is printed on demand, so it takes 3-4 days longer; new stock. The aim of this book is to discuss the fundamental ideas which lie behind the statistical theory of learning and generalization. It considers learning as a general problem of function estimation based on empirical data. Omitting proofs and technical details, the author concentrates on discussing the main results of learning theory and their connections to fundamental problems in statistics. These include: the setting of learning problems based on the model of minimizing the risk functional from empirical data; a comprehensive analysis of the empirical risk minimization principle, including necessary and sufficient conditions for its consistency; non-asymptotic bounds for the risk achieved using the empirical risk minimization principle; principles for controlling the generalization ability of learning machines using small sample sizes based on these bounds; and the Support Vector methods that control the generalization ability when estimating functions from small sample sizes. The second edition of the book contains three new chapters devoted to further development of the learning theory and SVM techniques. These include: the theory of direct methods of learning based on solving multidimensional integral equations for density, conditional probability, and conditional density estimation; and a new inductive principle of learning. Written in a readable and concise style, the book is intended for statisticians, mathematicians, physicists, and computer scientists. Vladimir N. Vapnik is Technology Leader at AT&T Labs-Research and Professor at London University. He is one of the founders of statistical learning theory. 336 pp. English.
Language: English
Published by Springer, Humana, Dec 2010
ISBN 10: 1441931600 ISBN 13: 9781441931603
Seller: buchversandmimpf2000, Emtmannsberg, BAYE, Germany
Paperback. Condition: New. This item is printed on demand (print-on-demand title); new stock. The aim of this book is to discuss the fundamental ideas which lie behind the statistical theory of learning and generalization. Written in a readable and concise style and devoted to key learning problems, the book is intended for statisticians, mathematicians, physicists, and computer scientists. Springer-Verlag KG, Sachsenplatz 4-6, 1201 Vienna. 336 pp. English.