
Introduction to LLMs in Python

Learn the nuts and bolts of LLMs and the revolutionary transformer architecture they are based on!

4 hours · 17 videos · 57 exercises · 9,954 learners · Statement of Accomplishment

Course Description

Uncover What's Behind the Large Language Models Hype

Large Language Models (LLMs) have become pivotal tools driving some of the most stunning advancements and applications in today's AI landscape. This hands-on course will equip you with the practical knowledge and skills needed to understand, build, and harness the power of LLMs for solving complex language tasks such as translation, language generation, question-answering, and more.

Design Your Own LLM Architecture and Leverage Pre-Trained Models

Through interactive coding exercises, we will demystify transformers, the most popular deep-learning architecture for constructing LLMs and NLP systems. We'll also explore pre-trained language models and datasets from Hugging Face: a vast collection of resources for seamlessly integrating LLMs into your projects. By the end of this course, you will be able to build LLMs using various transformer architectures and configure, fine-tune, and evaluate pre-trained LLMs using specialized metrics. You will also gain insights into advanced concepts like Reinforcement Learning from Human Feedback (RLHF) and understand the key challenges and ethical considerations of real-world LLM applications.

In the following Tracks

Associate AI Engineer for Data Scientists

Developing Large Language Models
  1. The Large Language Models (LLMs) Landscape

    Free

    Large Language Models (LLMs) represent the current pinnacle of AI technology, driving remarkable advancements in Natural Language Processing and Understanding. This chapter serves as your gateway to comprehending LLMs: what they are, their remarkable capabilities, and the wide array of language tasks they excel at. You'll gain practical experience in loading and harnessing various LLMs for both language understanding and generation tasks. Along the way, you'll be introduced to the successful catalyst at the heart of most LLMs: the transformer architecture. Ready to start this journey into the world of LLMs?

    Introducing large language models (50 xp)
    Classifying a restaurant's customer review (100 xp)
    Pipeline re-arrangement puzzle (100 xp)
    Tasks LLMs can perform (50 xp)
    Using a pipeline for summarization (100 xp)
    Time for some question-answering! (100 xp)
    The transformer architecture (50 xp)
    Hello PyTorch transformer (100 xp)
    Hands-on translation pipeline (100 xp)
    Generating replies to customer reviews (100 xp)
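
    As a taste of what these exercises involve, here is a minimal sketch of the pipeline workflow the chapter builds on, assuming the Hugging Face transformers library and its default model checkpoints; the exact models and texts used in the exercises may differ:

        from transformers import pipeline

        # Sentiment classification, e.g. of a restaurant's customer review
        classifier = pipeline("sentiment-analysis")
        print(classifier("The bruschetta was fantastic and the staff were lovely!"))

        # Summarization of a longer passage
        summarizer = pipeline("summarization")

        # Extractive question answering over a supplied context
        qa = pipeline("question-answering")
        print(qa(question="What architecture do most LLMs use?",
                 context="Most modern large language models are built on the transformer architecture."))

        # Translation, e.g. English to French
        translator = pipeline("translation_en_to_fr")
        print(translator("Large language models are remarkably versatile."))
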
  2. Building a Transformer Architecture

    In this chapter, you'll uncover the secrets and practical intricacies of transformers, the most popular deep learning architecture used to create today's most successful Language Models. Step by step, and aided by the PyTorch library, you'll learn how to manually design and configure different types of transformer architectures. You'll develop a strong understanding of their core elements, including self-attention mechanisms, encoder and decoder layers, and specialized model heads designed for specific language tasks and use cases.

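    As an indication of what this chapter covers in practice, here is a minimal sketch of an encoder-only transformer body with a task-specific head, built from PyTorch's standard layers; the hyperparameters and the classification head are illustrative assumptions, not the course's exact configuration, and positional encodings are omitted for brevity:

        import torch
        import torch.nn as nn

        # Illustrative hyperparameters (assumed for this sketch)
        d_model, n_heads, n_layers, vocab_size = 512, 8, 6, 10000

        # Encoder stack built from PyTorch's standard transformer layers
        encoder_layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=n_heads, batch_first=True)
        encoder = nn.TransformerEncoder(encoder_layer, num_layers=n_layers)

        # Token embeddings in, contextual embeddings out (positional encodings omitted here)
        embedding = nn.Embedding(vocab_size, d_model)
        token_ids = torch.randint(0, vocab_size, (1, 16))   # one sequence of 16 token ids
        hidden_states = encoder(embedding(token_ids))       # shape: (1, 16, d_model)

        # A task-specific head, e.g. mapping the first token's embedding to two sentiment classes
        classification_head = nn.Linear(d_model, 2)
        logits = classification_head(hidden_states[:, 0, :])
        print(logits.shape)                                 # torch.Size([1, 2])
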
  3. Harnessing Pre-trained LLMs

    This chapter unveils the transformative potential of harnessing pre-trained Large Language Models (LLMs). Throughout the chapter, you'll discover effective tips and tricks for mastering intricate language use cases and gain practical insights into leveraging pre-trained LLMs and datasets from Hugging Face. Along the way, you will learn the ins and outs of several common language problems, from sentiment classification to summarization to question-answering, and explore how LLMs are adaptively trained to solve them.

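    For a sense of the workflow covered here, this is a minimal sketch of loading a pre-trained checkpoint and a dataset from the Hugging Face Hub and tokenizing it ahead of fine-tuning; the checkpoint and dataset names are illustrative assumptions, not necessarily those used in the exercises:

        from datasets import load_dataset
        from transformers import AutoTokenizer, AutoModelForSequenceClassification

        # A small pre-trained checkpoint, given a fresh two-class classification head
        checkpoint = "distilbert-base-uncased"
        tokenizer = AutoTokenizer.from_pretrained(checkpoint)
        model = AutoModelForSequenceClassification.from_pretrained(checkpoint, num_labels=2)

        # A sentiment dataset from the Hugging Face Hub, tokenized for the model
        dataset = load_dataset("imdb", split="train[:1000]")
        tokenized = dataset.map(lambda batch: tokenizer(batch["text"], truncation=True), batched=True)
        print(tokenized)
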
  4. Evaluating and Leveraging LLMs in the Real World

    Our exciting LLM learning journey is approaching its end! You'll delve into different metrics and methods to assess how well your model is performing, whether it's a pre-trained one, a fine-tuned version, or something you've built from the ground up! You'll learn about the crucial aspects and challenges of applying Language Models in real-world scenarios, including optimizing a model with feedback from humans (RLHF), tackling biased language outputs, and more.

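    As a hint of the evaluation side of this chapter, here is a minimal sketch using the Hugging Face evaluate library, with ROUGE for generated summaries and accuracy for classification outputs as representative metrics; the specific metrics used in the exercises may differ:

        import evaluate

        # ROUGE compares generated summaries against reference summaries
        rouge = evaluate.load("rouge")
        print(rouge.compute(predictions=["the cat sat on the mat"],
                            references=["a cat was sitting on the mat"]))

        # Accuracy for classification-style outputs is loaded the same way
        accuracy = evaluate.load("accuracy")
        print(accuracy.compute(predictions=[0, 1, 1], references=[0, 1, 0]))
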

Collaborators

James Chapman
Chris Harper
Jasmin Ludolf

Audio recorded by

Jasmin Ludolf

Prerequisites

Intermediate Deep Learning with PyTorch
Iván P.C.

Senior Data Science & AI Manager

