Hyperparameter Tuning in R
Learn how to tune your model's hyperparameters to get the best predictive results.
4 hours · 14 videos · 47 exercises · 7,094 learners · Statement of Accomplishment
Course Description
For many machine learning problems, simply running a model out-of-the-box and getting a prediction is not enough; you want the best model with the most accurate prediction. One way to perfect your model is with hyperparameter tuning, which means optimizing the settings for that specific model. In this course, you will work with the caret, mlr and h2o packages to find the optimal combination of hyperparameters in an efficient manner using grid search, random search, adaptive resampling and automatic machine learning (AutoML). Furthermore, you will work with different datasets and tune different supervised learning models, such as random forests, gradient boosting machines, support vector machines, and even neural nets. Get ready to tune!
In the following tracks:
Machine Learning Scientist in R
Supervised Machine Learning in R
Chapter 1: Introduction to hyperparameters
Free

Why do we use the strange word "hyperparameter"? What makes it hyper? Here, you will understand what model parameters are and why they are different from hyperparameters in machine learning. You will then see why we would want to tune them and how the default settings of caret already include hyperparameter tuning.
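As a rough illustration of that last point, a minimal caret sketch might look like the following. This is not the course's own code: iris stands in for the course datasets, and the gbm grid values are arbitrary illustrative choices (method = "gbm" also requires the gbm package).

    library(caret)

    set.seed(42)
    ctrl <- trainControl(method = "cv", number = 5)

    # With no tuneGrid or tuneLength, caret evaluates a small default grid of
    # gbm hyperparameters (n.trees, interaction.depth, shrinkage, n.minobsinnode).
    gbm_default <- train(Species ~ ., data = iris,
                         method = "gbm",
                         trControl = ctrl,
                         verbose = FALSE)
    gbm_default$bestTune   # the hyperparameter combination caret selected

    # Tuning manually: pass an explicit grid of candidate values instead.
    manual_grid <- expand.grid(n.trees = c(100, 200),
                               interaction.depth = c(1, 3),
                               shrinkage = 0.1,
                               n.minobsinnode = 10)
    gbm_manual <- train(Species ~ ., data = iris,
                        method = "gbm",
                        trControl = ctrl,
                        tuneGrid = manual_grid,
                        verbose = FALSE)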
Exercises:
- Parameters vs hyperparameters (50 xp)
- Model parameters vs. hyperparameters (100 xp)
- Hyperparameters in linear models (50 xp)
- What are the coefficients? (100 xp)
- Recap of machine learning basics (50 xp)
- Machine learning with caret (100 xp)
- Resampling schemes (50 xp)
- Hyperparameter tuning in caret (50 xp)
- Hyperparameters in Stochastic Gradient Boosting (50 xp)
- Changing the number of hyperparameters to tune (100 xp)
- Tune hyperparameters manually (100 xp)
Chapter 2: Hyperparameter tuning with caret
In this chapter, you will learn how to tune hyperparameters with a Cartesian grid. Then, you will implement faster and more efficient approaches: random search and adaptive resampling, which explore the parameter grid in a way that concentrates on values in the neighborhood of the optimal settings.
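To give a flavor of the difference, here is a minimal sketch of random search and adaptive resampling with caret. It assumes the kernlab package for method = "svmRadial", uses iris as a stand-in dataset, and picks arbitrary adaptive-resampling settings.

    library(caret)
    set.seed(42)

    # Random search: trainControl(search = "random") samples tuneLength
    # random hyperparameter combinations instead of a full Cartesian grid.
    random_ctrl <- trainControl(method = "cv", number = 5, search = "random")
    svm_random <- train(Species ~ ., data = iris,
                        method = "svmRadial",
                        trControl = random_ctrl,
                        tuneLength = 10)

    # Adaptive resampling: unpromising candidates are dropped early, so
    # resources concentrate around the best-performing settings.
    adaptive_ctrl <- trainControl(method = "adaptive_cv",
                                  number = 5, repeats = 3,
                                  adaptive = list(min = 3, alpha = 0.05,
                                                  method = "gls", complete = TRUE),
                                  search = "random")
    svm_adaptive <- train(Species ~ ., data = iris,
                          method = "svmRadial",
                          trControl = adaptive_ctrl,
                          tuneLength = 10)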
Exercises:
- Hyperparameter tuning in caret (50 xp)
- Finding hyperparameters (50 xp)
- Cartesian grid search in caret (100 xp)
- Plot hyperparameter model output (100 xp)
- Grid vs. Random Search (50 xp)
- Grid search with range of hyperparameters (100 xp)
- Find train() option for random search (50 xp)
- Random search with caret (100 xp)
- Adaptive resampling (50 xp)
- Advantages of Adaptive Resampling (50 xp)
- Adaptive Resampling with caret (100 xp)
Chapter 3: Hyperparameter tuning with mlr
Here, you will use mlr, another machine learning package with very convenient hyperparameter tuning functions. You will define a Cartesian grid or perform random search, and apply more advanced tuning controls. You will also learn different ways to plot and evaluate models with different hyperparameters.
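A small, hedged sketch of mlr's tuning workflow is shown below; iris and an rpart decision tree are stand-ins for the course material, and the parameter ranges are illustrative only.

    library(mlr)

    # Task and learner (iris and rpart are stand-ins for the course data/models)
    task <- makeClassifTask(data = iris, target = "Species")
    lrn  <- makeLearner("classif.rpart")

    # Search space for two rpart hyperparameters
    par_set <- makeParamSet(
      makeIntegerParam("minsplit", lower = 2, upper = 30),
      makeNumericParam("cp", lower = 0.001, upper = 0.1)
    )

    # Random search with 20 iterations, evaluated by 3-fold cross-validation
    ctrl  <- makeTuneControlRandom(maxit = 20)
    rdesc <- makeResampleDesc("CV", iters = 3)

    set.seed(42)
    tune_res <- tuneParams(lrn, task = task, resampling = rdesc,
                           par.set = par_set, control = ctrl,
                           measures = list(acc))
    tune_res$x   # best hyperparameter combination found
    tune_res$y   # its cross-validated accuracy

    # Set the tuned values on the learner for final training
    lrn_tuned <- setHyperPars(lrn, par.vals = tune_res$x)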
Exercises:
- Machine learning with mlr (50 xp)
- Machine Learning with mlr (50 xp)
- Modeling with mlr (100 xp)
- Grid and random search with mlr (50 xp)
- Random search with mlr (100 xp)
- Perform hyperparameter tuning with mlr (100 xp)
- Evaluating hyperparameters with mlr (50 xp)
- Why to evaluate tuning? (50 xp)
- Evaluating hyperparameter tuning results (100 xp)
- Advanced tuning with mlr (50 xp)
- Define advanced tuning controls (50 xp)
- Define aggregated measures (100 xp)
- Setting hyperparameters (100 xp)
Chapter 4: Hyperparameter tuning with h2o
In this final chapter, you will use h2o, another machine learning package with very convenient hyperparameter tuning functions. You will use it to train different models and define a Cartesian grid. Then, you will implement a random search with stopping criteria. Finally, you will learn AutoML, an h2o interface that allows very fast and convenient model and hyperparameter tuning with just one function.
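These two ideas, a random grid search with stopping criteria and AutoML, can be sketched roughly as follows; iris again stands in for the course data, and the hyperparameter values, runtimes, and stopping settings are illustrative assumptions rather than the course's choices.

    library(h2o)
    h2o.init()

    # iris as an H2OFrame, split into training and validation sets
    iris_hf  <- as.h2o(iris)
    splits   <- h2o.splitFrame(iris_hf, ratios = 0.8, seed = 42)
    train_hf <- splits[[1]]
    valid_hf <- splits[[2]]

    # Random discrete grid search over GBM hyperparameters, with stopping criteria
    gbm_params <- list(max_depth  = c(3, 5, 7),
                       learn_rate = c(0.01, 0.05, 0.1),
                       ntrees     = c(50, 100))
    search_criteria <- list(strategy = "RandomDiscrete",
                            max_runtime_secs = 60,
                            stopping_metric = "misclassification",
                            stopping_rounds = 3,
                            stopping_tolerance = 0.001)
    gbm_grid <- h2o.grid("gbm", x = 1:4, y = "Species",
                         grid_id = "gbm_grid",
                         training_frame = train_hf,
                         validation_frame = valid_hf,
                         hyper_params = gbm_params,
                         search_criteria = search_criteria,
                         seed = 42)
    h2o.getGrid("gbm_grid", sort_by = "logloss", decreasing = FALSE)

    # AutoML: model and hyperparameter search with a single function call
    aml <- h2o.automl(x = 1:4, y = "Species",
                      training_frame = train_hf,
                      max_runtime_secs = 60,
                      seed = 42)
    aml@leaderboard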
Exercises:
- Machine learning with h2o (50 xp)
- Prepare data for modelling with h2o (100 xp)
- Modeling with h2o (100 xp)
- Grid and random search with h2o (50 xp)
- Grid search with h2o (100 xp)
- Random search with h2o (100 xp)
- Stopping criteria (100 xp)
- Automatic machine learning with H2O (50 xp)
- AutoML in h2o (100 xp)
- Scoring the leaderboard (50 xp)
- Extract h2o models and evaluate performance (100 xp)
- Wrap-up (50 xp)
Datasets
- Bc test data
- Bc train data
- Breast cancer data
- Breast cancer data orig
- Datasets descriptions
- Knowledge data
- Knowledge orig
- Knowledge test data
- Knowledge train data
- Seeds data
- Seeds dataset
- Seeds test data
- Seeds train data
- Voters data
- Voters orig
- Voters test data
- Voters train data
Prerequisites
Machine Learning with caret in R

Shirin Elsinghorst (formerly Glander)
Data Scientist @ codecentric
Join 15 million learners and start Hyperparameter Tuning in R today!