Ensemble Learning for AI Developers: Learn Bagging, Stacking, and Boosting Methods with Use Cases - Softcover

Kumar, Alok; Jain, Mayank

 
9781484259412: Ensemble Learning for AI Developers: Learn Bagging, Stacking, and Boosting Methods with Use Cases


Synopsis

Chapter 1: An Introduction to Ensemble Learning

Chapter Goal: This chapter will give you a brief overview of ensemble learning.
No. of pages: 10
Sub-Topics:
- Need for ensemble techniques in machine learning
- Historical overview of ensemble learning
- A brief overview of various ensemble techniques
Chapter 2: Varying Training Data
Chapter Goal: In this chapter we will talk in detail about ensemble techniques in which the training data is varied.
No. of pages: 30
Sub-Topics:
- Using bagging (bootstrap aggregating) to build ensemble models (see the short sketch after this list)
- Code samples
- Popular library support for bagging, and best practices
- Introduction to random forest models
- Hands-on code examples using random forest models
- Introduction to cross-validation methods in machine learning
- Introduction to k-fold cross-validation ensembles, with code samples
- Other examples of ensemble techniques that vary the training data
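
As a taste of the techniques in this chapter, here is a minimal sketch of bagging, a random forest, and k-fold cross-validation using scikit-learn; the dataset and hyperparameters are illustrative assumptions rather than the book's own examples.

from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import BaggingClassifier, RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Illustrative dataset; any tabular classification task would do.
X, y = load_breast_cancer(return_X_y=True)

# Bagging: many copies of a base learner (decision trees by default),
# each trained on a bootstrap resample of the training data.
bagging = BaggingClassifier(n_estimators=100, random_state=0)

# Random forest: bagged decision trees plus a random feature subset per split.
forest = RandomForestClassifier(n_estimators=100, random_state=0)

# k-fold cross-validation gives an out-of-sample score for each ensemble.
for name, model in [("bagging", bagging), ("random forest", forest)]:
    scores = cross_val_score(model, X, y, cv=5)
    print(f"{name}: mean accuracy {scores.mean():.3f}")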

Chapter 3: Varying Combinations
Chapter Goal: In this chapter we will talk in detail about techniques in which models are used in combination with one another to get an ensemble learning boost.
No. of pages: 40
Sub-Topics:
- Boosting: we will talk in detail about various boosting techniques, with historical examples
- Introduction to AdaBoost, with code examples, industry best practices, and useful state-of-the-art libraries for AdaBoost
- Introduction to gradient boosting, with hands-on code examples, useful libraries, and industry best practices for gradient boosting
- Introduction to XGBoost, with hands-on code examples, useful libraries, and industry best practices for XGBoost
- Stacking: we will talk in detail about how various stacking techniques are used in the machine learning world
- Stacking in practice: how Kagglers use stacking to improve their winning entries (see the short sketch after this list)
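
The boosting and stacking ideas named above can be sketched in a few lines with scikit-learn; the models, dataset, and settings below are illustrative assumptions, and XGBoost exposes a similar scikit-learn-compatible interface (shown in the Chapter 5 sketch).

from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import (AdaBoostClassifier, GradientBoostingClassifier,
                              StackingClassifier)
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

X, y = load_breast_cancer(return_X_y=True)

# Boosting: each new weak learner focuses on what the current ensemble
# still gets wrong (reweighted examples in AdaBoost, residual errors in
# gradient boosting).
ada = AdaBoostClassifier(n_estimators=100, random_state=0)
gbm = GradientBoostingClassifier(n_estimators=100, random_state=0)

# Stacking: out-of-fold predictions of the base models become the input
# features of a meta-learner.
stack = StackingClassifier(
    estimators=[("ada", ada), ("gbm", gbm)],
    final_estimator=LogisticRegression(max_iter=1000),
)

for name, model in [("AdaBoost", ada), ("gradient boosting", gbm), ("stacking", stack)]:
    print(name, cross_val_score(model, X, y, cv=5).mean())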

Chapter 4: Varying Models
Chapter Goal: In this chapter we will talk about how ensemble learning models can lead to better performance in your machine learning project.
No. of pages: 30
Sub-Topics:
- Training multiple-model ensembles, with code examples (see the short sketch after this list)
- Hyperparameter-tuning ensembles, with code examples
- Horizontal voting ensembles
- Snapshot ensembles and their variants; introduction to the cyclic learning rate
- Code examples
- Use of ensembles in the deep learning world
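
A minimal sketch of the "varying models" idea: train several different model families on the same data and combine their predictions by soft voting. The specific members and parameters are illustrative assumptions.

from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

X, y = load_breast_cancer(return_X_y=True)

# Each member sees the same data but makes different kinds of errors, so
# averaging their predicted probabilities ("soft" voting) is usually more
# stable than any single member on its own.
voter = VotingClassifier(
    estimators=[
        ("logreg", LogisticRegression(max_iter=1000)),
        ("forest", RandomForestClassifier(n_estimators=200, random_state=0)),
        ("svm", SVC(probability=True, random_state=0)),
    ],
    voting="soft",
)

print("voting ensemble:", cross_val_score(voter, X, y, cv=5).mean())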

Chapter 5: Ensemble Learning Libraries and How to Use Them
Chapter Goal: In this chapter we will go into detail about some very popular libraries used by data science practitioners and Kagglers for ensemble learning.
No. of pages: 25
Sub-Topics:
- Ensembles in scikit-learn
- Learning how to use ensembles in TensorFlow
- Implementing and using ensembles in PyTorch
- Boosting using Microsoft LightGBM
- Boosting using XGBoost (see the short sketch after this list)
- Stacking using the H2O library
- Ensembles in R
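
To give a flavor of this chapter, here is a brief sketch of the scikit-learn-compatible interfaces that XGBoost and LightGBM provide; the hyperparameters are illustrative assumptions, and both packages must be installed separately (for example with pip install xgboost lightgbm).

from lightgbm import LGBMClassifier
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from xgboost import XGBClassifier

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Both libraries implement gradient-boosted trees behind the familiar
# fit/predict interface, so they drop into scikit-learn workflows directly.
for model in (XGBClassifier(n_estimators=200), LGBMClassifier(n_estimators=200)):
    model.fit(X_train, y_train)
    print(type(model).__name__, "test accuracy:", model.score(X_test, y_test))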

Chapter 6: Tips and Best Practices
Chapter Goal: In this chapter we will learn the best practices around ensemble learning, with real-world examples.
No. of pages: 25
Sub-Topics:
- How to build a state-of-the-art image classifier using ensembles
- How to use ensembles in NLP, with real-world examples
- Use of ensembles for structured data analysis
- Using ensembles for time series data
- Useful tips and pitfalls
- How to leverage ensembles

"synopsis" may belong to another edition of this title.

Other Popular Editions of the Same Title

9781484259399: Ensemble Learning for AI Developers: Learn Bagging, Stacking, and Boosting Methods with Use Cases

ISBN 10: 1484259394
ISBN 13: 9781484259399
Publisher: Apress, 2020
Softcover