Ensemble Methods in Python
Learn how to build advanced and effective machine learning models in Python using ensemble techniques such as bagging, boosting, and stacking.
4 hours · 15 videos · 52 exercises · 9,853 learners · Statement of Accomplishment
Course Description
Continue your machine learning journey by diving into the wonderful world of ensemble learning methods! These exciting machine learning techniques combine multiple individual algorithms to boost performance and solve complex problems at scale across different industries. Ensemble techniques regularly win online machine learning competitions as well!
In this course, you'll learn all about these advanced ensemble techniques, such as bagging, boosting, and stacking. You'll apply them to real-world datasets using cutting-edge Python machine learning libraries such as scikit-learn, XGBoost, CatBoost, and mlxtend.
1. Combining Multiple Models
Free. Do you struggle to determine which of the models you built is the best for your problem? Stop trying to pick just one, and use them all instead! In this chapter, you'll learn how to combine multiple models into one using "Voting" and "Averaging". You'll use these techniques to predict the ratings of apps on the Google Play Store, whether or not a Pokémon is legendary, and which characters are going to die in Game of Thrones! A minimal voting example is sketched after the exercise list below.
- Introduction to ensemble methods (50 xp)
- Exploring Google apps data (50 xp)
- Predicting the rating of an app (100 xp)
- Voting (50 xp)
- Choosing the best model (100 xp)
- Assembling your first ensemble (100 xp)
- Evaluating your ensemble (100 xp)
- Averaging (50 xp)
- Journey to Westeros (50 xp)
- Predicting GoT deaths (100 xp)
- Soft vs. hard voting (100 xp)
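As a preview of the chapter, here is a minimal hard-voting sketch using scikit-learn's VotingClassifier. The course's Google Play, Pokémon, and Game of Thrones datasets are not loaded here, so a built-in toy dataset stands in, and the estimator choices are illustrative rather than the course's exact setup.

```python
# A minimal hard-voting sketch with scikit-learn's VotingClassifier.
# The course datasets are assumed unavailable here, so the built-in
# breast cancer dataset stands in.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# Three individual models with different inductive biases; the
# scale-sensitive ones are wrapped in a standardization pipeline.
estimators = [
    ("lr", make_pipeline(StandardScaler(), LogisticRegression())),
    ("dt", DecisionTreeClassifier(random_state=42)),
    ("knn", make_pipeline(StandardScaler(), KNeighborsClassifier())),
]

# voting="hard" takes the majority class label across the three models.
ensemble = VotingClassifier(estimators=estimators, voting="hard")
ensemble.fit(X_train, y_train)
print("Voting accuracy:", ensemble.score(X_test, y_test))
```

Setting voting="soft" averages the predicted class probabilities instead of counting votes, which is the distinction the "Soft vs. hard voting" exercise digs into.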
2. Bagging
Bagging is the ensemble method behind powerful machine learning algorithms such as random forests. In this chapter, you'll learn the theory behind this technique and build your own bagging models using scikit-learn. A minimal BaggingClassifier example is sketched after the exercise list below.
- The strength of “weak” models (50 xp)
- Restricted and unrestricted decision trees (100 xp)
- "Weak" decision tree (50 xp)
- Bootstrap aggregating (50 xp)
- Training with bootstrapping (100 xp)
- A first attempt at bagging (100 xp)
- BaggingClassifier: nuts and bolts (50 xp)
- Bagging: the scikit-learn way (100 xp)
- Checking the out-of-bag score (100 xp)
- Bagging parameters: tips and tricks (50 xp)
- Exploring the UCI SECOM data (50 xp)
- A more complex bagging model (100 xp)
- Tuning bagging hyperparameters (100 xp)
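Here is a minimal bagging sketch with scikit-learn's BaggingClassifier; the UCI SECOM data used in the course is not loaded here, so a built-in toy dataset and illustrative hyperparameters stand in.

```python
# A minimal bagging sketch with scikit-learn's BaggingClassifier.
# The SECOM data is assumed unavailable here, so the built-in
# breast cancer dataset stands in.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# A restricted ("weak") decision tree as the base model.
weak_tree = DecisionTreeClassifier(max_depth=3, random_state=42)

# Bootstrap-aggregate 100 such trees. oob_score=True scores each sample
# using only the trees whose bootstrap sample left it out.
# (In scikit-learn versions before 1.2 the first argument is named
# base_estimator rather than estimator.)
bagging = BaggingClassifier(
    estimator=weak_tree,
    n_estimators=100,
    oob_score=True,
    random_state=42,
)
bagging.fit(X_train, y_train)
print("Out-of-bag score:", bagging.oob_score_)
print("Test accuracy:", bagging.score(X_test, y_test))
```

The out-of-bag score evaluates each training sample using only the trees whose bootstrap sample excluded it, giving a validation-style estimate without a separate hold-out set.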
3. Boosting
Boosting is a class of ensemble learning algorithms that includes award-winning models such as AdaBoost. In this chapter, you'll learn about AdaBoost and use it to predict the revenue of award-winning movies! You'll also learn about gradient boosting algorithms such as CatBoost and XGBoost. A short AdaBoost regression example is sketched after the exercise list below.
- The effectiveness of gradual learning (50 xp)
- Introducing the movie database (50 xp)
- Exploring movie features (50 xp)
- Predicting movie revenue (100 xp)
- Boosting for predicted revenue (100 xp)
- Adaptive boosting: award winning model (50 xp)
- Your first AdaBoost model (100 xp)
- Tree-based AdaBoost regression (100 xp)
- Making the most of AdaBoost (100 xp)
- Gradient boosting (50 xp)
- Revisiting Google app reviews (50 xp)
- Sentiment analysis with GBM (100 xp)
- Gradient boosting flavors (50 xp)
- Movie revenue prediction with CatBoost (100 xp)
- Boosting contest: Light vs Extreme (100 xp)
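For orientation, here is a tree-based AdaBoost regression sketch with scikit-learn; the TMDb movie data used in the course is not loaded here, so a built-in regression toy dataset and illustrative hyperparameters stand in.

```python
# A tree-based AdaBoost regression sketch with scikit-learn.
# The TMDb data is assumed unavailable here, so the built-in
# diabetes regression dataset stands in.
from sklearn.datasets import load_diabetes
from sklearn.ensemble import AdaBoostRegressor
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeRegressor

X, y = load_diabetes(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# AdaBoost trains estimators sequentially, reweighting the training samples
# so that later trees concentrate on the examples earlier trees got wrong.
# (In scikit-learn versions before 1.2 the first argument is named
# base_estimator rather than estimator.)
ada = AdaBoostRegressor(
    estimator=DecisionTreeRegressor(max_depth=3, random_state=42),
    n_estimators=200,
    learning_rate=0.05,
    random_state=42,
)
ada.fit(X_train, y_train)
print("Test R^2:", ada.score(X_test, y_test))
```

XGBoost, LightGBM, and CatBoost expose a similar fit/predict interface, so swapping in a gradient boosting model is largely a matter of changing the estimator class and its hyperparameters.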
4. Stacking
Get ready to see how things stack up! In this final chapter, you'll learn about the stacking ensemble method and how to implement it using scikit-learn as well as the mlxtend library. You'll apply stacking to predict the edibility of North American mushrooms, and revisit the ratings of Google apps with this more advanced approach. A minimal StackingClassifier example is sketched after the exercise list below.
- The intuition behind stacking (50 xp)
- Exploring the mushroom dataset (50 xp)
- Predicting mushroom edibility (100 xp)
- K-nearest neighbors for mushrooms (100 xp)
- Build your first stacked ensemble (50 xp)
- Applying stacking to predict app ratings (100 xp)
- Building the stacking classifier (100 xp)
- Stacked predictions for app ratings (100 xp)
- Let's mlxtend it! (50 xp)
- A first attempt with mlxtend (100 xp)
- Back to regression with stacking (100 xp)
- Mushrooms: a matter of life or death (100 xp)
- Ensembling it all together (50 xp)
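To close, here is a minimal stacking sketch with scikit-learn's StackingClassifier; the mushroom data used in the course is not loaded here, so a built-in toy dataset and illustrative base models stand in.

```python
# A minimal stacking sketch with scikit-learn's StackingClassifier.
# The mushroom data is assumed unavailable here, so the built-in
# breast cancer dataset stands in.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import StackingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# First-layer (base) estimators.
estimators = [
    ("knn", make_pipeline(StandardScaler(), KNeighborsClassifier())),
    ("dt", DecisionTreeClassifier(max_depth=5, random_state=42)),
]

# The final_estimator (meta-learner) is trained on the base models'
# cross-validated predictions and learns how to combine them.
stack = StackingClassifier(
    estimators=estimators,
    final_estimator=LogisticRegression(),
    cv=5,
)
stack.fit(X_train, y_train)
print("Stacking accuracy:", stack.score(X_test, y_test))
```

mlxtend's stacking classes follow the same base-models-plus-meta-learner pattern, which is what the mlxtend exercises build up to.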
In the following tracks:
Supervised Machine Learning in Python
Datasets
App ratings, App reviews, Game of Thrones, Pokémon, SECOM (Semiconductor Manufacturing), TMDb (The Movie Database)
Collaborators
Román de las Heras
Data Scientist at Appodeal
Join 15 million learners and start Ensemble Methods in Python today!