
Learning from Data: Concepts, Theory, and Methods (Adaptive and Cognitive Dynamic Systems: Signal Processing, Learning, Communications and Control) - Hardcover

 

Synopsis

An interdisciplinary framework for learning methodologies, covering statistics, neural networks, and fuzzy logic.

This book provides a unified treatment of the principles and methods for learning dependencies from data. It establishes a general conceptual framework in which various learning methods from statistics, neural networks, and fuzzy logic can be applied, showing that a few fundamental principles underlie most new methods being proposed today in statistics, engineering, and computer science. Complete with over one hundred illustrations, case studies, and examples, Learning from Data:

  • Relates the statistical formulation to the latest methodologies used in artificial neural networks, fuzzy systems, and wavelets
  • Features consistent terminology, chapter summaries, and practical research tips
  • Emphasizes the conceptual framework provided by Statistical Learning Theory (VC-theory) rather than its commonly practiced mathematical aspects
  • Provides a detailed description of the new learning methodology called Support Vector Machines (SVM)

This invaluable text/reference accommodates both beginning and advanced graduate students in engineering, computer science, and statistics. It is also indispensable for researchers and practitioners in these areas who must understand the principles and methods for learning dependencies from data.


Review

"...contains considerable information on the concept of statistical learning theory.... However, some may find its presentation difficult to follow..." -- Technometrics, February 2001

"...well readable..." -- Zentralblatt Math, Vol. 960, No. 10, 2001

From the Author

Detailed Table of Contents


1 Introduction

........1.1 Learning and Statistical Estimation
........1.2 Statistical Dependency and Causality
........1.3 Characterization of Variables
........1.4 Characterization of Uncertainty
........References

2 Problem Statement, Classical Approaches, and Adaptive Learning

........2.1 Formulation of the Learning Problem
................2.1.1 Role of the Learning Machine
................2.1.2 Common Learning Tasks
................2.1.3 Scope of the Learning Problem Formulation
........2.2 Classical Approaches
................2.2.1 Density Estimation
................2.2.2 Classification (Discriminant Analysis)
................2.2.3 Regression
................2.2.4 Stochastic Approximation
................2.2.5 Solving Problems with Finite Data
................2.2.6 Nonparametric Methods
........2.3 Adaptive Learning: Concepts and Inductive Principles
................2.3.1 Philosophy, Major Concepts, and Issues
................2.3.2 A priori Knowledge and Model Complexity
................2.3.3 Inductive Principles
........2.4 Summary
........References

3 Regularization Framework

........3.1 Curse and Complexity of Dimensionality
........3.2 Function Approximation and Characterization of Complexity
........3.3 Penalization
................3.3.1 Parametric Penalties
................3.3.2 Nonparametric Penalties
........3.4 Model Selection (Complexity Control)
................3.4.1 Analytical Model Selection Criteria
................3.4.2 Model Selection via Resampling
................3.4.3 Bias-variance Trade-off
................3.4.4 Example of Model Selection
........3.5 Summary
........References

4 Statistical Learning Theory

........4.1 Conditions for Consistency and Convergence of ERM
........4.2 Growth Function and VC-Dimension
................4.2.1 VC-Dimension of the Set of Real-Valued Functions
................4.2.2 VC-Dimension for Classification and Regression Problems
................4.2.3 Examples of Calculating VC-Dimension
........4.3 Bounds on the Generalization
................4.3.1 Classification
................4.3.2 Regression
................4.3.3 Generalization Bounds and Sampling Theorem
........4.4 Structural Risk Minimization
........4.5 Case Study: Comparison of Methods for Model Selection
........4.6 Summary
........References

5 Nonlinear Optimization Strategies

........5.1 Stochastic Approximation Methods
................5.1.1 Linear Parameter Estimation
................5.1.2 Backpropagation Training of MLP Networks
........5.2 Iterative Methods
................5.2.1 Expectation-Maximization Methods for Density Estimation
................5.2.2 Generalized Inverse Training of MLP Networks
........5.3 Greedy Optimization
................5.3.1 Neural Network Construction Algorithms
................5.3.2 Classification and Regression Trees (CART)
........5.4 Feature Selection, Optimization, and Statistical Learning Theory
........5.5 Summary
........References

6 Methods for Data Reduction and Dimensionality Reduction

........6.1 Vector Quantization
................6.1.1 Optimal Source Coding in Vector Quantization
................6.1.2 Generalized Lloyd Algorithm
................6.1.3 Clustering and Vector Quantization
................6.1.4 EM Algorithm for VQ and Clustering
........6.2 Dimensionality Reduction: Statistical Methods
................6.2.1 Linear Principal Components
................6.2.2 Principal Curves and Surfaces
........6.3 Dimensionality Reduction: Neural Network Methods
................6.3.1 Discrete Principal Curves and the Self-organizing Map Algorithm
................6.3.2 Statistical Interpretation of the SOM Method
................6.3.3 Flow-through Version of the SOM and Learning Rate Schedules
................6.3.4 SOM Applications and Modifications
................6.3.5 Self-supervised MLP
........6.4 Summary
........References

7 Methods for Regression

........7.1 Taxonomy: Dictionary versus Kernel Representation
........7.2 Linear Estimators
................7.2.1 Estimation of Linear Models and Equivalence of Representations
................7.2.2 Analytic Form of Cross-validation
................7.2.3 Estimating Complexity of Penalized Linear Models
........7.3 Nonadaptive Methods
................7.3.1 Local Polynomial Estimators and Splines
................7.3.2 Radial Basis Function Networks
................7.3.3 Orthogonal Basis Functions and Wavelets
........7.4 Adaptive Dictionary Methods
................7.4.1 Additive Methods and Projection Pursuit Regression
................7.4.2 Multilayer Perceptrons and Backpropagation
................7.4.3 Multivariate Adaptive Regression Splines
........7.5 Adaptive Kernel Methods and Local Risk Minimization
................7.5.1 Generalized Memory-Based Learning
................7.5.2 Constrained Topological Mapping
........7.6 Empirical Comparisons
................7.6.1 Experimental Setup
................7.6.2 Summary of Experimental Results
........7.7 Combining Predictive Models
........7.8 Summary
........References

8 Classification

........8.1 Statistical Learning Theory Formulation
........8.2 Classical Formulation
........8.3 Methods for Classification
................8.3.1 Regression-Based Methods
................8.3.2 Tree-Based Methods
................8.3.3 Nearest Neighbor and Prototype Methods
................8.3.4 Empirical Comparisons
........8.4 Summary
........References

9 Support Vector Machines

........9.1 Optimal Separating Hyperplanes
........9.2 High Dimensional Mapping and Inner Product Kernels
........9.3 Support Vector Machine for Classification
........9.4 Support Vector Machine for Regression
........9.5 Summary
........References

10 Fuzzy Systems

........10.1 Terminology, Fuzzy Sets, and Operations
........10.2 Fuzzy Inference Systems and Neurofuzzy Systems
................10.2.1 Fuzzy Inference Systems
................10.2.2 Equivalent Basis Function Representation
................10.2.3 Learning Fuzzy Rules from Data
........10.3 Applications in Pattern Recognition
................10.3.1 Fuzzy Input Encoding and Fuzzy Postprocessing
................10.3.2 Fuzzy Clustering
........10.4 Summary
........References

Appendix A: Review of Nonlinear Optimization

Appendix B: Eigenvalues and Singular Value Decomposition

Index


  • Publisher: Wiley-Blackwell
  • Publication date: 1998
  • ISBN 10: 0471154938
  • ISBN 13: 9780471154938
  • Binding: Hardcover
  • Language: English
  • Number of pages: 464

Cherkassky, Vladimir; Mulier, Filip M.
Published by Wiley-Interscience, 1998