
Winning a Kaggle Competition in Python

Learn how to approach and win competitions on Kaggle.

4 hours · 16 videos · 52 exercises · 18,453 learners · Statement of Accomplishment

Course Description

Kaggle is the best-known platform for Data Science competitions. Taking part in these competitions lets you work with real-world datasets, explore a variety of machine learning problems, compete against other participants and, ultimately, gain invaluable hands-on experience. In this course, you will learn how to approach and structure any Data Science competition: you will select a correct local validation scheme, learn to avoid overfitting, and master advanced feature engineering together with model ensembling. All of these techniques are practiced on Kaggle competition datasets.

In the following Tracks

Machine Learning Scientist in Python

1. Kaggle competitions process

    Free

In this first chapter, you will get exposure to the Kaggle competition process. You will train a model and prepare a CSV file ready for submission. You will learn the difference between Public and Private test splits, and how to prevent overfitting.

Competitions overview (50 xp)
    Explore train data (100 xp)
    Explore test data (100 xp)
    Prepare your first submission (50 xp)
    Determine a problem type (50 xp)
    Train a simple model (100 xp)
    Prepare a submission (100 xp)
    Public vs Private leaderboard (50 xp)
    What model is overfitting? (50 xp)
    Train XGBoost models (100 xp)
    Explore overfitting XGBoost (100 xp)
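The chapter-1 workflow above (train a simple model, then write a submission CSV) can be sketched as follows. This is a minimal illustration with synthetic stand-in data; the file name `submission.csv` and the `id`/`target` column names are placeholders, not the actual competition files.

```python
# Sketch: fit a simple model and produce a Kaggle-style submission file.
# The DataFrames below stand in for the real competition train/test data.
import pandas as pd
from sklearn.ensemble import RandomForestRegressor

train = pd.DataFrame({"id": range(6),
                      "feat": [1, 2, 3, 4, 5, 6],
                      "target": [2, 4, 6, 8, 10, 12]})
test = pd.DataFrame({"id": range(6, 9), "feat": [7, 8, 9]})

model = RandomForestRegressor(n_estimators=10, random_state=0)
model.fit(train[["feat"]], train["target"])

# A submission is typically just the test ids plus the predictions.
submission = pd.DataFrame({"id": test["id"],
                           "target": model.predict(test[["feat"]])})
submission.to_csv("submission.csv", index=False)
```

In a real competition you would read the provided train/test CSVs with `pd.read_csv` and match the column names required by the sample submission file.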
2. Dive into the Competition

Now that you know the basics of Kaggle competitions, you will learn how to study the specific problem at hand. You will practice EDA, establish correct local validation strategies, and learn about data leakage.
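A local validation strategy of the kind this chapter covers can be as simple as K-fold cross-validation: score your model locally so you can iterate without burning leaderboard submissions. The sketch below uses synthetic data and an arbitrary model choice.

```python
# Sketch: 5-fold cross-validation as a local validation scheme.
# Synthetic regression data stands in for a competition dataset.
import numpy as np
from sklearn.model_selection import KFold
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
y = X @ np.array([1.0, -2.0, 0.5]) + rng.normal(scale=0.1, size=100)

scores = []
for tr_idx, val_idx in KFold(n_splits=5, shuffle=True,
                             random_state=0).split(X):
    model = LinearRegression().fit(X[tr_idx], y[tr_idx])
    scores.append(mean_squared_error(y[val_idx],
                                     model.predict(X[val_idx])))

# The mean fold score is your local estimate of leaderboard performance.
local_cv = np.mean(scores)
```

The key point is that the validation split should mimic the train/test split of the competition (e.g. time-based splits for time-series data), or the local score will not track the leaderboard.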

4. Modeling

    Time to bring everything together and build some models! In this last chapter, you will build a base model before tuning some hyperparameters and improving your results with ensembles. You will then get some final tips and tricks to help you compete more efficiently.


Datasets

Demand forecasting (train)
Demand forecasting (test)
House prices (train)
House prices (test)
Taxi rides (train)
Taxi rides (test)

Collaborators

Hillary Green-Lerman
Hadrien Lacroix

Yauhen Babakhin

Kaggle Grandmaster

Yauhen holds a Master's Degree in Applied Data Analysis and has over five years of working experience in Data Science, spanning the Banking, Gaming, and eCommerce domains. He is the first Kaggle Competitions Grandmaster in Belarus, with gold medals in both classic Machine Learning and Deep Learning competitions.

Join over 15 million learners and start Winning a Kaggle Competition in Python today!
