Introduction to LLMs in Python
Learn the nuts and bolts of LLMs and the revolutionary transformer architecture they are based on!
Start Course for Free · 4 hours · 17 videos · 57 exercises · 9,954 learners · Statement of Accomplishment
Course Description
Uncover What's Behind the Large Language Models Hype
Large Language Models (LLMs) have become pivotal tools driving some of the most stunning advancements and applications in today's AI landscape. This hands-on course will equip you with the practical knowledge and skills needed to understand, build, and harness the power of LLMs for solving complex language tasks such as translation, language generation, question-answering, and more.
Design Your Own LLM Architecture and Leverage Pre-Trained Models
Through interactive coding exercises, we will demystify transformers, the most popular deep-learning architecture for constructing LLMs and NLP systems. We'll also explore pre-trained language models and datasets from Hugging Face: a vast collection of resources for seamlessly integrating LLMs into your projects. By the end of this course, you will be able to build LLMs using various transformer architectures and configure, fine-tune, and evaluate pre-trained LLMs using specialized metrics. You will also gain insights into advanced concepts like Reinforcement Learning from Human Feedback (RLHF) and understand the key challenges and ethical considerations of deploying LLMs in real-world applications.
In the following tracks:
- Associate AI Engineer for Data Scientists
- Developing Large Language Models
1. The Large Language Models (LLMs) Landscape (Free)
Large Language Models (LLMs) represent the current pinnacle of AI technology, driving remarkable advancements in Natural Language Processing and Understanding. This chapter serves as your gateway to comprehending LLMs: what they are, their remarkable capabilities, and the wide array of language tasks they excel at. You'll gain practical experience in loading and harnessing various LLMs for both language understanding and generation tasks. Along the way, you'll be introduced to the successful catalyst at the heart of most LLMs: the transformer architecture. Ready to start this journey into the world of LLMs?
- Introducing large language models (50 xp)
- Classifying a restaurant's customer review (100 xp)
- Pipeline re-arrangement puzzle (100 xp)
- Tasks LLMs can perform (50 xp)
- Using a pipeline for summarization (100 xp)
- Time for some question-answering! (100 xp)
- The transformer architecture (50 xp)
- Hello PyTorch transformer (100 xp)
- Hands-on translation pipeline (100 xp)
- Generating replies to customer reviews (100 xp)
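To give a flavor of what these exercises involve, here is a minimal sketch of the kind of Hugging Face pipeline the chapter introduces. It assumes the transformers library is installed and lets the pipeline fall back to its default sentiment model; the review text is an illustrative example, not the course's own data.

```python
# A minimal pipeline sketch, assuming the transformers library is installed.
from transformers import pipeline

# With no model specified, the pipeline falls back to a default
# sentiment-analysis checkpoint and downloads it on first use.
classifier = pipeline("sentiment-analysis")

review = "The tasting menu was inventive, but the service was painfully slow."
print(classifier(review))
# e.g. [{'label': 'NEGATIVE', 'score': 0.98}]
```

The same `pipeline()` interface covers the summarization, translation, and question-answering exercises listed above; only the task string (and optionally the model name) changes.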
2. Building a Transformer Architecture
In this chapter, you'll uncover the secrets and practical intricacies of transformers, the most popular deep learning architecture used to create today's most successful Language Models. Step by step, and aided by the PyTorch library, you'll learn how to manually design and configure different types of transformer architectures. You'll develop a strong understanding of their core elements, including self-attention mechanisms, encoder and decoder layers, and specialized model heads designed for specific language tasks and use cases.
- Attention mechanisms and positional encoding (50 xp)
- Hands-on positional encoding (100 xp)
- Why do we need positional encoding? (50 xp)
- Multi-headed self attention (50 xp)
- Setting up a multi-headed attention class (100 xp)
- Implementing multi-headed self-attention (100 xp)
- Building an encoder transformer (50 xp)
- Post-attention feed-forward layer (100 xp)
- Time for an encoder layer (100 xp)
- Encoder transformer body and head (100 xp)
- Testing the encoder transformer (100 xp)
- Building a decoder transformer (50 xp)
- Building a decoder body and head (100 xp)
- Testing the decoder transformer (100 xp)
- Building an encoder-decoder transformer (50 xp)
- Incorporating cross-attention in a decoder (100 xp)
- Trying out an encoder-decoder transformer (100 xp)
- Transformer assembly bottom-up (100 xp)
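As a taste of these building blocks, below is a minimal PyTorch sketch: a sinusoidal positional encoding module (as in the original transformer paper) feeding into a multi-headed self-attention layer. It uses PyTorch's built-in `nn.MultiheadAttention` for brevity, whereas the chapter builds these components from scratch; all dimensions are illustrative.

```python
# A minimal sketch of positional encoding plus self-attention, assuming PyTorch.
import math
import torch
import torch.nn as nn

class PositionalEncoding(nn.Module):
    """Sinusoidal positional encoding as in the original transformer paper."""
    def __init__(self, d_model: int, max_len: int = 512):
        super().__init__()
        position = torch.arange(max_len).unsqueeze(1)
        div_term = torch.exp(torch.arange(0, d_model, 2) * (-math.log(10000.0) / d_model))
        pe = torch.zeros(max_len, d_model)
        pe[:, 0::2] = torch.sin(position * div_term)  # even dimensions
        pe[:, 1::2] = torch.cos(position * div_term)  # odd dimensions
        self.register_buffer("pe", pe)

    def forward(self, x):
        # x: (batch, seq_len, d_model); add the encoding for each position
        return x + self.pe[: x.size(1)]

d_model, n_heads = 64, 4
embeddings = torch.randn(2, 10, d_model)            # (batch, seq_len, d_model)
encoded = PositionalEncoding(d_model)(embeddings)

# Multi-headed self-attention: query, key, and value are the same sequence
attention = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
output, weights = attention(encoded, encoded, encoded)
print(output.shape)  # torch.Size([2, 10, 64])
```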
3. Harnessing Pre-trained LLMs
This chapter unveils the transformative potential of harnessing pre-trained Large Language Models (LLMs). Throughout the chapter, you'll discover effective tips and tricks for mastering intricate language use cases and gain practical insights into leveraging pre-trained LLMs and datasets from Hugging Face. Along the way, you will learn the ins and outs of several common language problems, ranging from sentiment classification to summarization to question-answering, and explore how LLMs are adaptively trained to solve them.
- LLMs for text classification and generation (50 xp)
- Pipelines vs auto classes (50 xp)
- Classifying two movie opinions (100 xp)
- Tidying up a text generation use case (100 xp)
- LLMs for text summarization and translation (50 xp)
- Summarizing a product opinion (100 xp)
- The Spanish phrasebook mission (100 xp)
- LLMs for question answering (50 xp)
- Load and inspect a QA dataset (100 xp)
- Extract and decode the answer (100 xp)
- LLM fine-tuning and transfer learning (50 xp)
- Fine-tuning preparations (100 xp)
- The inside-out LLM (100 xp)
- Matching LLM use cases and architectures (100 xp)
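For a sense of the auto classes used in this chapter, here is a minimal extractive question-answering sketch, assuming transformers and torch are installed; the checkpoint and the question/context pair are illustrative choices, not the course's own.

```python
# A minimal extractive QA sketch with Hugging Face auto classes.
import torch
from transformers import AutoTokenizer, AutoModelForQuestionAnswering

checkpoint = "distilbert-base-cased-distilled-squad"  # illustrative checkpoint
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForQuestionAnswering.from_pretrained(checkpoint)

question = "What architecture do most modern LLMs use?"
context = "Most modern large language models are built on the transformer architecture."

inputs = tokenizer(question, context, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Pick the most likely start and end token positions, then decode that span
start = outputs.start_logits.argmax()
end = outputs.end_logits.argmax() + 1
print(tokenizer.decode(inputs["input_ids"][0][start:end]))
# e.g. "the transformer architecture"
```

Unlike a ready-made pipeline, the auto-class route exposes the raw logits, which is exactly what the "Extract and decode the answer" style of exercise works with.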
4. Evaluating and Leveraging LLMs in the Real World
Our exciting LLM learning journey is approaching its end! You'll delve into different metrics and methods to assess how well your model is performing, whether it's a pre-trained one, a fine-tuned version, or something you've built from the ground up! You'll learn about the crucial aspects and challenges of applying Language Models in real-world scenarios, including optimizing a model with feedback from humans (RLHF), tackling biased language outputs, and more.
- Guidelines and standard metrics for evaluating LLMs (50 xp)
- Calculating accuracy (100 xp)
- Beyond accuracy: describing metrics (100 xp)
- Beyond accuracy: using metrics (100 xp)
- Specialized metrics for language tasks (50 xp)
- Perplexed about 2030 (100 xp)
- A feast of LLM metrics (100 xp)
- BLEU-proof translations (100 xp)
- Model fine-tuning using human feedback (50 xp)
- The role of a reward model (50 xp)
- Setting up an RLHF loop (100 xp)
- Challenges and ethical considerations (50 xp)
- Toxic employee reviews? (100 xp)
- Best "regard"! (100 xp)
- The finish line (50 xp)
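As an illustration of the metrics covered here, the sketch below uses Hugging Face's evaluate library to compute accuracy for a classifier and BLEU for generated translations. This is one plausible tooling choice, and all predictions and references are toy examples.

```python
# A minimal metrics sketch, assuming the `evaluate` library is installed.
import evaluate

# Accuracy: fraction of predicted labels matching the reference labels
accuracy = evaluate.load("accuracy")
print(accuracy.compute(predictions=[1, 0, 1, 1], references=[1, 0, 0, 1]))
# e.g. {'accuracy': 0.75}

# BLEU: compares generated text against one or more reference translations
bleu = evaluate.load("bleu")
predictions = ["the cat sat on the mat"]
references = [["the cat sat on the mat", "a cat was sitting on the mat"]]
print(bleu.compute(predictions=predictions, references=references))
# the result dict includes a 'bleu' score between 0 and 1
```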
Contributors
Audio recorded by
Prerequisites
Intermediate Deep Learning with PyTorch

Iván P.C.
Senior Data Science & AI Manager
Join 15 million learners and start Introduction to LLMs in Python today!