Intermediate Regression in R
Learn to perform linear and logistic regression with multiple explanatory variables.
4 hours · 14 videos · 50 exercises · 26,365 learners · Statement of Accomplishment
Course Description
Linear regression and logistic regression are the two most widely used statistical models and act like master keys, unlocking the secrets hidden in datasets. This course builds on the skills you gained in "Introduction to Regression in R", covering linear and logistic regression with multiple explanatory variables. Through hands-on exercises, you’ll explore the relationships between variables in real-world datasets, including Taiwan house prices, customer churn modeling, and more. By the end of this course, you’ll know how to include multiple explanatory variables in a model, understand how interactions between variables affect predictions, and understand how linear and logistic regression work.
In the following Tracks: Machine Learning Scientist in R, Statistician in R
1. Parallel Slopes
Free
Extend your linear regression skills to "parallel slopes" regression, with one numeric and one categorical explanatory variable. This is the first step towards conquering multiple linear regression.
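As a rough illustration of the pattern this chapter builds toward (using the built-in mtcars dataset as a stand-in for the course's datasets, with illustrative variable names):

```r
# A parallel slopes model: combine one numeric and one categorical
# explanatory variable with "+", so each category gets its own intercept
# but all categories share a single slope.
mdl_mpg_vs_both <- lm(mpg ~ wt + factor(cyl), data = mtcars)
coefficients(mdl_mpg_vs_both)

# Adding "+ 0" removes the global intercept, so the output shows one
# intercept per category, which is often easier to interpret.
lm(mpg ~ wt + factor(cyl) + 0, data = mtcars)
```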
- Parallel slopes linear regression (50 xp)
- Fitting a parallel slopes linear regression (100 xp)
- Interpreting parallel slopes coefficients (100 xp)
- Visualizing each explanatory variable (100 xp)
- Visualizing parallel slopes (100 xp)
- Predicting parallel slopes (50 xp)
- Predicting with a parallel slopes model (100 xp)
- Manually calculating predictions (100 xp)
- Assessing model performance (50 xp)
- Comparing coefficients of determination (100 xp)
- Comparing residual standard error (100 xp)
2. Interactions
Explore the effect of interactions between explanatory variables. Considering interactions allows for more realistic models that can have better predictive power. You'll also deal with Simpson's Paradox: a non-intuitive result that arises when you have multiple explanatory variables.
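A minimal sketch of the interaction syntax, again using mtcars rather than the course datasets:

```r
# "*" crosses the explanatory variables, so each category gets its own
# intercept and its own slope for the numeric variable.
mdl_interaction <- lm(mpg ~ wt * factor(cyl), data = mtcars)

# An equivalent formulation without a global intercept gives directly
# interpretable per-category intercepts and slopes.
lm(mpg ~ factor(cyl) + wt:factor(cyl) + 0, data = mtcars)

# Predictions need values for both explanatory variables.
explanatory_data <- data.frame(wt = c(2.5, 3.5), cyl = c(4, 8))
predict(mdl_interaction, explanatory_data)
```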
- Models for each category (50 xp)
- One model per category (100 xp)
- Predicting multiple models (100 xp)
- Visualizing multiple models (100 xp)
- Assessing model performance (100 xp)
- One model with an interaction (50 xp)
- Specifying an interaction (100 xp)
- Interactions with understandable coeffs (100 xp)
- Making predictions with interactions (50 xp)
- Predicting with interactions (100 xp)
- Manually calculating predictions with interactions (100 xp)
- Simpson's Paradox (50 xp)
- Modeling eBay auctions (100 xp)
- Modeling each auction type (100 xp)
3. Multiple Linear Regression
See how modeling, and linear regression in particular, makes it easy to work with more than two explanatory variables. Once you've mastered fitting linear regression models, you'll get to implement your own linear regression algorithm.
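In the same spirit, here is a hedged sketch of both ideas, with mtcars standing in for the course data: extra explanatory variables are simply added to the formula, and a simple regression can be re-derived by minimizing the sum of squared residuals with optim().

```r
# Two or more explanatory variables: keep adding terms with "+"
# (or cross them with "*" to include interactions).
lm(mpg ~ wt + disp + factor(cyl), data = mtcars)

# A "do it yourself" simple linear regression: find the intercept and
# slope that minimize the sum of squared residuals.
calc_sum_of_squares <- function(coeffs) {
  intercept <- coeffs[1]
  slope <- coeffs[2]
  resids <- mtcars$mpg - (intercept + slope * mtcars$wt)
  sum(resids ^ 2)
}

# Numerical optimization approximately reproduces lm()'s coefficients.
optim(par = c(intercept = 0, slope = 0), fn = calc_sum_of_squares)$par
coefficients(lm(mpg ~ wt, data = mtcars))
```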
- Two numeric explanatory variables (50 xp)
- 3D visualizations (100 xp)
- Modeling 2 numeric explanatory variables (100 xp)
- Including an interaction (100 xp)
- More than 2 explanatory variables (50 xp)
- Visualizing many variables (100 xp)
- Different levels of interaction (100 xp)
- Predicting again (100 xp)
- How linear regression works (50 xp)
- The sum of squares (50 xp)
- Linear regression algorithm (100 xp)
4. Multiple Logistic Regression
Extend your logistic regression skills to multiple explanatory variables. Understand the logistic distribution, which underpins this form of regression. Finally, implement your own logistic regression algorithm.
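A minimal sketch of the glm() workflow, using the binary vs column of mtcars as a stand-in for a churn-style response (illustrative only, not the course's exercises):

```r
# Logistic regression with two explanatory variables: glm() with
# family = binomial.
mdl_logit <- glm(vs ~ wt + disp, data = mtcars, family = binomial)

# type = "response" returns probabilities; rounding gives 0/1 classes.
predicted <- round(predict(mdl_logit, type = "response"))

# A confusion matrix of actual vs. predicted responses.
table(actual = mtcars$vs, predicted = predicted)

# The logistic distribution underpins the model: plogis() is its CDF
# (the inverse of the logit link), qlogis() its inverse CDF (the logit).
plogis(0)     # 0.5
qlogis(0.75)  # approx. 1.10
```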
- Multiple logistic regression (50 xp)
- Visualizing multiple explanatory variables (100 xp)
- Logistic regression with 2 explanatory variables (100 xp)
- Logistic regression prediction (100 xp)
- Confusion matrix (100 xp)
- The logistic distribution (50 xp)
- Cumulative distribution function (100 xp)
- Inverse cumulative distribution function (100 xp)
- binomial family argument (100 xp)
- Logistic distribution parameters (50 xp)
- How logistic regression works (50 xp)
- Likelihood & log-likelihood (50 xp)
- Logistic regression algorithm (100 xp)
- Congratulations (50 xp)
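For the "do it yourself" idea mentioned above, one possible sketch (assuming the same mtcars stand-in) is to maximize the log-likelihood numerically, mirroring the sum-of-squares approach used for linear regression:

```r
# "Do it yourself" logistic regression for one explanatory variable:
# choose the intercept and slope that maximize the log-likelihood
# (equivalently, minimize its negative).
calc_neg_log_likelihood <- function(coeffs) {
  intercept <- coeffs[1]
  slope <- coeffs[2]
  y_pred <- plogis(intercept + slope * mtcars$wt)
  # Log-likelihood of the observed 0/1 responses under the predicted probabilities.
  log_likelihood <- mtcars$vs * log(y_pred) + (1 - mtcars$vs) * log(1 - y_pred)
  -sum(log_likelihood)
}

# Numerical optimization approximately reproduces glm()'s coefficients.
optim(par = c(intercept = 0, slope = 0), fn = calc_neg_log_likelihood)$par
coefficients(glm(vs ~ wt, data = mtcars, family = binomial))
```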
Prerequisites: Introduction to Regression in R
Richie Cotton
Data Evangelist at DataCamp
Richie is a Data Evangelist at DataCamp. He has been using R since 2004, in the fields of proteomics, debt collection, and chemical health and safety. He has released almost 30 R packages on CRAN and Bioconductor – most famously the assertive suite of packages – as well as creating and contributing to many others. He has also written two books on R programming, Learning R and Testing R Code.
Join over 15 million learners and start Intermediate Regression in R today!