Machine Learning with Tree-Based Models in R
Learn how to use tree-based models and ensembles to make classification and regression predictions with tidymodels.
4 hours · 16 videos · 58 exercises · 7,994 learners · Statement of Accomplishment
Course Description
Tree-based machine learning models can reveal complex non-linear relationships in data and often dominate machine learning competitions. In this course, you'll use the tidymodels package to explore and build different tree-based models—from simple decision trees to complex random forests. You’ll also learn to use boosted trees, a powerful machine learning technique that uses ensemble learning to build high-performing predictive models. Along the way, you'll work with health and credit risk data to predict the incidence of diabetes and customer churn.
In the Following Tracks:
- Machine Learning Fundamentals in R
- Machine Learning Scientist in R
- Supervised Machine Learning in R
1. Classification Trees
Free. Ready to build a real machine learning pipeline? Complete step-by-step exercises to learn how to create decision trees, split your data, and predict which patients are most likely to suffer from diabetes. Last but not least, you'll use performance measures to assess your models and judge your predictions.
- Welcome to the course! (50 xp)
- Why tree-based methods? (100 xp)
- Specify that tree (100 xp)
- Train that model (100 xp)
- How to grow your tree (50 xp)
- Train/test split (100 xp)
- Avoiding class imbalances (100 xp)
- From zero to hero (100 xp)
- Predict and evaluate (50 xp)
- Make predictions (100 xp)
- Crack the matrix (100 xp)
- Are you predicting correctly? (100 xp)
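The chapter's workflow (a stratified train/test split, a decision tree specification, training, prediction, and a confusion matrix) might be sketched in tidymodels roughly as follows; `diabetes_data` and its `outcome` column are placeholder names, not the course's actual dataset:

```r
library(tidymodels)

# Stratified split to avoid class imbalances between train and test
split <- initial_split(diabetes_data, prop = 0.8, strata = outcome)
train <- training(split)
test  <- testing(split)

# Specify a classification tree and train it
tree_spec <- decision_tree() %>%
  set_engine("rpart") %>%
  set_mode("classification")

tree_fit <- fit(tree_spec, outcome ~ ., data = train)

# Predict on the test set and evaluate
preds <- predict(tree_fit, new_data = test) %>%
  bind_cols(test)

conf_mat(preds, truth = outcome, estimate = .pred_class)
accuracy(preds, truth = outcome, estimate = .pred_class)
```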
2. Regression Trees and Cross-Validation
Ready for some candy? Use a chocolate rating dataset to build regression trees and assess their performance with suitable error measures. You'll overcome the statistical uncertainty of a single train/test split by applying sweet techniques like cross-validation, then dive even deeper by mastering the bias-variance tradeoff.
- Continuous outcomes (50 xp)
- Train a regression tree (100 xp)
- Predict new values (100 xp)
- Inspect model output (50 xp)
- Performance metrics for regression trees (50 xp)
- In-sample performance (100 xp)
- Out-of-sample performance (100 xp)
- Bigger mistakes, bigger penalty (100 xp)
- Cross-validation (50 xp)
- Create the folds (100 xp)
- Fit the folds (100 xp)
- Evaluate the folds (100 xp)
- Bias-variance tradeoff (50 xp)
- Call things by their names (100 xp)
- Adjust model complexity (100 xp)
- In-sample and out-of-sample performance (100 xp)
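The fold-based evaluation this chapter covers might look like this sketch, assuming a training set `chocolate_train` with a numeric `final_grade` column (both illustrative names): create the folds, fit the model on each, and average the out-of-sample error.

```r
library(tidymodels)

# A regression tree: same specification, different mode
tree_spec <- decision_tree() %>%
  set_engine("rpart") %>%
  set_mode("regression")

# Create 10 cross-validation folds
folds <- vfold_cv(chocolate_train, v = 10)

# Fit the folds, scoring each with MAE and RMSE
# (RMSE penalizes bigger mistakes more heavily)
cv_results <- fit_resamples(
  tree_spec,
  final_grade ~ .,
  resamples = folds,
  metrics = metric_set(mae, rmse)
)

# Evaluate the folds: mean out-of-sample error across all 10
collect_metrics(cv_results)
</imports>
```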
3. Hyperparameters and Ensemble Models
Time to get serious with tuning your hyperparameters and interpreting receiver operating characteristic (ROC) curves. In this chapter, you'll leverage the wisdom of the crowd with ensemble models such as bagged trees and random forests, and build ensembles that forecast which credit card customers are most likely to churn.
- Tuning hyperparameters (50 xp)
- Generate a tuning grid (100 xp)
- Tune along the grid (100 xp)
- Pick the winner (100 xp)
- More model measures (50 xp)
- Calculate specificity (100 xp)
- Draw the ROC curve (100 xp)
- Area under the ROC curve (100 xp)
- Bagged trees (50 xp)
- Create bagged trees (100 xp)
- In-sample ROC and AUC (100 xp)
- Check for overfitting (100 xp)
- Random forest (50 xp)
- Bagged trees vs. random forest (50 xp)
- Variable importance (100 xp)
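The grid-tuning loop taught here (generate a grid, tune along it, pick the winner) can be sketched like this; `churn_train` and the `still_customer` outcome are assumed placeholder names for the credit card churn data:

```r
library(tidymodels)

# Mark two hyperparameters for tuning
tree_spec <- decision_tree(tree_depth = tune(), min_n = tune()) %>%
  set_engine("rpart") %>%
  set_mode("classification")

# Generate a regular tuning grid and tune along it with resampling
grid  <- grid_regular(tree_depth(), min_n(), levels = 3)
folds <- vfold_cv(churn_train, v = 5)

tune_results <- tune_grid(
  tree_spec,
  still_customer ~ .,
  resamples = folds,
  grid = grid,
  metrics = metric_set(roc_auc)
)

# Pick the winner and finalize the specification
best       <- select_best(tune_results, metric = "roc_auc")
final_spec <- finalize_model(tree_spec, best)

# A random forest ensemble for comparison; the ranger engine can
# record impurity-based variable importance
forest_spec <- rand_forest() %>%
  set_engine("ranger", importance = "impurity") %>%
  set_mode("classification")
```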
4. Boosted Trees
Ready for the high society of tree-based models? Apply gradient boosting to create powerful ensembles that often outperform any single model you have built so far. Learn how to fine-tune them and how to compare different models to pick a winner for production.
- Introduction to boosting (50 xp)
- Bagging vs. boosting (50 xp)
- Specify a boosted ensemble (100 xp)
- Gradient boosting (50 xp)
- Train a boosted ensemble (100 xp)
- Evaluate the ensemble (100 xp)
- Compare to a single classifier (100 xp)
- Optimize the boosted ensemble (50 xp)
- Tuning preparation (100 xp)
- The actual tuning (100 xp)
- Finalize the model (100 xp)
- Model comparison (50 xp)
- Compare AUC (100 xp)
- Plot ROC curves (100 xp)
- Wrap-up (50 xp)
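The chapter's arc (specify a boosted ensemble, tune it, finalize, and compare by ROC/AUC) might be sketched as below. `churn_train`, `churn_test`, `still_customer`, and the `.pred_yes` probability column are illustrative assumptions; the last depends on the outcome's factor levels.

```r
library(tidymodels)

# Specify a gradient-boosted ensemble with the xgboost engine
boost_spec <- boost_tree(
  trees = 500,
  learn_rate = tune(),
  tree_depth = tune()
) %>%
  set_engine("xgboost") %>%
  set_mode("classification")

# Tune across resamples, then finalize with the best combination
folds <- vfold_cv(churn_train, v = 5)
results <- tune_grid(
  boost_spec,
  still_customer ~ .,
  resamples = folds,
  grid = 10,
  metrics = metric_set(roc_auc)
)
final_spec <- finalize_model(boost_spec,
                             select_best(results, metric = "roc_auc"))

# Fit the finalized model and compare via ROC curve and AUC
final_fit <- fit(final_spec, still_customer ~ ., data = churn_train)
preds <- predict(final_fit, new_data = churn_test, type = "prob") %>%
  bind_cols(churn_test)

roc_curve(preds, truth = still_customer, .pred_yes) %>% autoplot()
roc_auc(preds, truth = still_customer, .pred_yes)
```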
Prerequisites
Modeling with tidymodels in R

Instructor
Sandro Raabe, Data Scientist
Join over 15 million learners and start Machine Learning with Tree-Based Models in R today!