Ensemble Machine Learning Cookbook [electronic resource] : Over 35 Practical Recipes to Explore Ensemble Machine Learning Techniques Using Python.

By:
Contributor(s):
Material type: Text
Publication details: Birmingham : Packt Publishing Ltd, 2019.
Description: 1 online resource (327 p.)
ISBN:
  • 1789132509
  • 9781789132502
Subject(s):
Genre/Form:
Additional physical formats: Print version: Ensemble Machine Learning Cookbook : Over 35 Practical Recipes to Explore Ensemble Machine Learning Techniques Using Python
DDC classification:
  • 006.31 23
LOC classification:
  • Q325.5
Online resources:
Contents:
Cover; Title Page; Copyright and Credits; About Packt; Foreword; Contributors; Preface; Table of Contents
Chapter 1: Get Closer to Your Data; Introduction; Data manipulation with Python; Getting ready; How to do it...; How it works...; There's more...; See also; Analyzing, visualizing, and treating missing values; How to do it...; How it works...; There's more...; See also; Exploratory data analysis; How to do it...; How it works...; There's more...; See also
Chapter 2: Getting Started with Ensemble Machine Learning; Introduction to ensemble machine learning; Max-voting; Getting ready; How to do it...; How it works...; There's more...; Averaging; Getting ready; How to do it...; How it works...; Weighted averaging; Getting ready; How to do it...; How it works...; See also
Chapter 3: Resampling Methods; Introduction to sampling; Getting ready; How to do it...; How it works...; There's more...; See also; k-fold and leave-one-out cross-validation; Getting ready; How to do it...; How it works...; There's more...; See also; Bootstrapping; Getting ready; How to do it...; How it works...; See also
Chapter 4: Statistical and Machine Learning Algorithms; Technical requirements; Multiple linear regression; Getting ready; How to do it...; How it works...; There's more...; See also; Logistic regression; Getting ready; How to do it...; How it works...; See also; Naive Bayes; Getting ready; How to do it...; How it works...; There's more...; See also; Decision trees; Getting ready; How to do it...; How it works...; There's more...; See also; Support vector machines; Getting ready; How to do it...; How it works...; There's more...; See also
Chapter 5: Bag the Models with Bagging; Introduction; Bootstrap aggregation; Getting ready; How to do it...; How it works...; See also; Ensemble meta-estimators; Bagging classifiers; How to do it...; How it works...; There's more...; See also; Bagging regressors; Getting ready; How to do it...; How it works...; See also
Chapter 6: When in Doubt, Use Random Forests; Introduction to random forests; Implementing a random forest for predicting credit card defaults using scikit-learn; Getting ready; How to do it...; How it works...; There's more...; See also; Implementing random forest for predicting credit card defaults using H2O; Getting ready; How to do it...; How it works...; There's more...; See also
Chapter 7: Boosting Model Performance with Boosting; Introduction to boosting; Implementing AdaBoost for disease risk prediction using scikit-learn; Getting ready; How to do it...; How it works...; There's more...; See also; Implementing a gradient boosting machine for disease risk prediction using scikit-learn; Getting ready; How to do it...; How it works...; There's more...; Implementing the extreme gradient boosting method for glass identification using XGBoost with scikit-learn; Getting ready; How to do it...; How it works...; There's more...; See also
Chapter 8: Blend It with Stacking; Technical requirements
Summary: This book uses a recipe-based approach to showcase the power of machine learning algorithms for building ensemble models with Python libraries. Through this book, you will be able to pick up the code, understand in depth how it works, and execute and implement it efficiently. It will serve as a desk reference for implementing a wide range of tasks and solving ...
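
To illustrate the kind of recipe listed in the contents above, the following is a minimal sketch of max-voting (Chapter 2) using scikit-learn's VotingClassifier. The Iris dataset and the choice of base estimators are illustrative assumptions, not taken from the book.

# Max-voting (hard voting) ensemble sketch with scikit-learn.
from sklearn.datasets import load_iris
from sklearn.ensemble import VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier

# Load an example dataset and split it into training and test sets.
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=42
)

# Combine three base classifiers; hard voting predicts the majority class label.
ensemble = VotingClassifier(
    estimators=[
        ("lr", LogisticRegression(max_iter=1000)),
        ("dt", DecisionTreeClassifier(random_state=42)),
        ("nb", GaussianNB()),
    ],
    voting="hard",
)
ensemble.fit(X_train, y_train)
print("Max-voting accuracy:", ensemble.score(X_test, y_test))

With voting="hard", the ensemble returns the class chosen by a majority of the base estimators; voting="soft" would average their predicted class probabilities instead.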
Holdings
Item type Current library Collection Call number Status Date due Barcode Item holds
eBook eBook e-Library EBSCO Computers Available
Total holds: 0

Description based upon print version of record.
