Monitoring Machine Learning Concepts
Learn about the challenges of monitoring machine learning models in production, including data and concept drift, and methods to address model degradation.
Start Course for Free · 2 hours · 11 videos · 33 exercises · 2,145 learners · Statement of Accomplishment
Course Description
Machine Learning Monitoring Concepts
Machine learning models influence more and more real-world decisions. These models need monitoring to prevent failures and to ensure they keep delivering business value to your company. This course introduces the fundamental concepts of building a robust monitoring system for your models in production.

Discover the Ideal Monitoring Workflow
The course starts with a blueprint of where to begin monitoring in production and how to structure the processes around it. We will cover the basic workflow, showing you how to detect issues, identify root causes, and resolve them, with real-world examples.

Explore the Challenges of Monitoring Models in Production
Deploying a model to production is just the beginning of the model lifecycle. Even a model that performs well during development can fail on continuously changing production data. In this course, you will explore the difficulties of monitoring a model's performance, especially when there is no ground truth.

Understand Covariate Shift and Concept Drift in Detail
The last part of this course focuses on two types of silent model failure. You will understand in detail the different kinds of covariate shift and concept drift, their influence on model performance, and how to detect and prevent them.
In the following Tracks:
- Associate AI Engineer for Data Scientists
- Machine Learning Engineer
- Machine Learning in Production in Python

Chapter 1: What is ML Monitoring
Free. The first chapter explains why businesses need to monitor their machine learning models in production. You will learn about the ideal monitoring workflow and the steps involved, as well as some of the challenges monitoring systems face in production.
- Why you need to monitor your model (50 xp)
- Why models fail? (50 xp)
- The benefits of monitoring systems (50 xp)
- The ideal monitoring workflow (50 xp)
- The importance of monitoring KPIs (50 xp)
- Ideal monitoring workflow (100 xp)
- Monitoring workflow in real-life scenario (100 xp)
- Challenges of monitoring ML models (50 xp)
- Delayed ground truth (100 xp)
- Covariate shift vs concept drift (100 xp)

Chapter 2: Theoretical Concepts of Monitoring
In Chapter 2, you'll discover the fundamental importance of performance monitoring in a reliable monitoring system. We'll explore common challenges faced in real-world production environments, such as the availability of ground truth. By the end of the chapter, you'll know how to handle situations where ground truth data is delayed or absent, using performance estimation algorithms.
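To give a flavor of what performance estimation without labels means, here is a minimal sketch of the core idea behind confidence-based estimators such as CBPE. It assumes the model outputs well-calibrated probabilities; the function name and data are illustrative, not NannyML's actual API:

```python
import numpy as np

def estimate_accuracy(proba):
    """Estimate classifier accuracy without ground truth labels.

    Assuming calibrated probabilities, the probability the model assigns
    to its own predicted class is its expected chance of being correct;
    averaging those confidences estimates accuracy on unlabeled data.
    """
    proba = np.asarray(proba)
    confidence = proba.max(axis=1)  # probability of the predicted class
    return float(confidence.mean())

# Hypothetical predicted probabilities for three unlabeled production samples
proba = [[0.9, 0.1], [0.2, 0.8], [0.55, 0.45]]
estimated = estimate_accuracy(proba)  # (0.9 + 0.8 + 0.55) / 3 = 0.75
```

The estimate is only as good as the calibration assumption; the course discusses when that assumption holds and what else CBPE has to account for.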
- Monitoring technical performance directly (50 xp)
- Why performance first? (50 xp)
- Different covariate shifts (50 xp)
- Availability of ground truth (50 xp)
- Is performance estimation required? (50 xp)
- Performance monitoring in production (100 xp)
- Performance estimation (50 xp)
- CBPE considerations (50 xp)
- Performance estimation in production (50 xp)
- Algorithms for performance estimation (100 xp)

Chapter 3: Covariate Shift and Concept Drift Detection
Now that you know the basics of covariate shift and concept drift in production, let's dive a little deeper. By the end of this chapter, you will know the different ways to detect and handle them in real-world scenarios.
- What is covariate shift? (50 xp)
- Swapped features (50 xp)
- Drift detection method (50 xp)
- How to detect covariate shift (50 xp)
- Subtle changes (50 xp)
- Different methods (100 xp)
- What is concept drift? (50 xp)
- Type of concept drift (50 xp)
- Concept drift and covariate shift interaction (50 xp)
- How to handle concept drift? (50 xp)
- Real-time adaptation (50 xp)
- Concept drift detection and resolution (100 xp)
- Wrap-up (50 xp)
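One common way to detect covariate shift on a single feature, covered by this chapter, is comparing the reference (training-time) distribution against production data with a two-sample Kolmogorov–Smirnov test. A NumPy-only sketch with synthetic data (the variable names are illustrative, and the threshold is the standard large-sample KS critical value at alpha = 0.05):

```python
import numpy as np

def ks_statistic(ref, prod):
    """Two-sample Kolmogorov-Smirnov statistic: the largest gap between
    the empirical CDFs of the reference and production samples."""
    all_values = np.sort(np.concatenate([ref, prod]))
    cdf_ref = np.searchsorted(np.sort(ref), all_values, side="right") / len(ref)
    cdf_prod = np.searchsorted(np.sort(prod), all_values, side="right") / len(prod)
    return float(np.max(np.abs(cdf_ref - cdf_prod)))

rng = np.random.default_rng(0)
reference = rng.normal(0.0, 1.0, 2000)   # feature distribution at training time
production = rng.normal(0.5, 1.0, 2000)  # same feature in production, mean shifted

stat = ks_statistic(reference, production)
# Large-sample critical value for equal sample sizes n at alpha = 0.05:
# c(alpha) * sqrt(2 / n), with c(0.05) ~= 1.358
threshold = 1.358 * np.sqrt(2 / 2000)
drift_detected = stat > threshold  # True: the 0.5 mean shift is easily detected
```

In practice, libraries such as SciPy (`scipy.stats.ks_2samp`) or NannyML provide tested implementations of this and other drift detection methods; the point of the sketch is only the mechanics of comparing two empirical distributions.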
Collaborators
Hakim Elakhrass
Co-founder and CEO of NannyML
Hakim is one of the co-founders of NannyML, one of the most popular open-source machine learning model monitoring libraries. He has almost a decade of data science experience and holds a Master's degree in Bioinformatics from KU Leuven.
Join over 15 million learners and start Monitoring Machine Learning Concepts today!