
Bayesian Modeling with RJAGS

In this course, you'll learn how to implement more advanced Bayesian models using RJAGS.

4 hours · 15 videos · 58 exercises · 7,337 learners · Statement of Accomplishment

Course Description

The Bayesian approach to statistics and machine learning is logical, flexible, and intuitive. In this course, you will engineer and analyze a family of foundational, generalizable Bayesian models. These range in scope from fundamental one-parameter models to intermediate multivariate & generalized linear regression models. The popularity of such Bayesian models has grown along with the availability of computing resources required for their implementation. You will utilize one of these resources: the rjags package in R. Combining the power of R with the JAGS (Just Another Gibbs Sampler) engine, rjags provides a framework for Bayesian modeling, inference, and prediction.
  1. Introduction to Bayesian Modeling (Free)

    Bayesian models combine prior insights with insights from observed data to form updated, posterior insights about a parameter. In this chapter, you will review these Bayesian concepts in the context of the foundational Beta-Binomial model for a proportion parameter. You will also learn how to use the rjags package to define, compile, and simulate this model in R.

    The prior model (50 xp)
    Simulating a Beta prior (100 xp)
    Comparing & contrasting Beta priors (100 xp)
    Which prior? (50 xp)
    Data & the likelihood (50 xp)
    Simulating the dependence of X on p (100 xp)
    Approximating the likelihood function (100 xp)
    Interpreting the likelihood function (50 xp)
    The posterior model (50 xp)
    Define, compile, and simulate (100 xp)
    Updating the posterior (100 xp)
    Influence of the prior & data on the posterior (50 xp)
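The define, compile, and simulate workflow for the Beta-Binomial model might look like the following sketch. This is illustrative, not the course's own code: the prior parameters (a = 1, b = 1) and data (X = 6 successes in n = 10 trials) are placeholder values, and running it requires a local JAGS installation.

```r
library(rjags)

# DEFINE the Beta-Binomial model as a JAGS model string
bb_string <- "model{
  X ~ dbin(p, n)     # likelihood: X successes in n trials
  p ~ dbeta(a, b)    # Beta prior for the proportion p
}"

# COMPILE the model, passing in prior parameters and observed data
bb_model <- jags.model(
  textConnection(bb_string),
  data = list(a = 1, b = 1, X = 6, n = 10)
)

# SIMULATE a Markov chain sample from the posterior of p
bb_sim <- coda.samples(bb_model, variable.names = "p", n.iter = 10000)

summary(bb_sim)  # posterior mean, sd, and quantiles for p
plot(bb_sim)     # trace plot and posterior density
```

With a Beta(a, b) prior and Binomial data, the simulated posterior should approximate the known conjugate result, a Beta(a + X, b + n − X) distribution, which makes a useful sanity check on the simulation.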
  2. Bayesian Models & Markov Chains

    The two-parameter Normal-Normal Bayesian model provides a simple foundation for Normal regression models. In this chapter, you will engineer the Normal-Normal model and define, compile, and simulate it using rjags. You will also explore the magic of the Markov chain mechanics behind rjags simulation.

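A Normal-Normal model of the kind described above might be specified as in this hedged sketch (placeholder priors and made-up data; note that JAGS parameterizes the Normal by precision, not variance):

```r
library(rjags)

# Normal likelihood with a Normal prior on the mean mu
nn_string <- "model{
  for (i in 1:length(Y)) {
    Y[i] ~ dnorm(mu, tau)    # tau is a precision, 1 / sd^2
  }
  mu ~ dnorm(50, 1 / 25^2)   # Normal prior for the mean
  tau <- pow(sigma, -2)
  sigma ~ dunif(0, 200)      # flat prior for the standard deviation
}"

nn_model <- jags.model(
  textConnection(nn_string),
  data = list(Y = c(45, 60, 52, 58)),   # placeholder observations
  n.chains = 4                          # run several Markov chains
)

nn_sim <- coda.samples(nn_model, c("mu", "sigma"), n.iter = 10000)
plot(nn_sim)  # trace plots make the Markov chain behavior visible
```

Running multiple chains from different starting points and inspecting their trace plots is the standard way to check that the Markov chains have settled into the posterior.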
  3. Bayesian Inference & Prediction

    In this chapter, you will extend the Normal-Normal model to a simple Bayesian regression model. Within this context, you will explore how to use rjags simulation output to conduct posterior inference. Specifically, you will construct posterior estimates of regression parameters using posterior means & credible intervals, you will test hypotheses using posterior probabilities, and you will construct posterior predictive distributions for new observations.

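The regression extension and the posterior summaries described above might look like this sketch. The names y_obs and x_obs are hypothetical stand-ins for your observed data vectors, and the priors are illustrative:

```r
library(rjags)

# Simple Bayesian regression: Normal likelihood, Normal priors on a and b
reg_string <- "model{
  for (i in 1:length(Y)) {
    Y[i] ~ dnorm(a + b * X[i], tau)   # mean is a linear function of X
  }
  a ~ dnorm(0, 1 / 100^2)             # vague priors for intercept & slope
  b ~ dnorm(0, 1 / 100^2)
  tau <- pow(sigma, -2)
  sigma ~ dunif(0, 50)
}"

reg_model <- jags.model(
  textConnection(reg_string),
  data = list(Y = y_obs, X = x_obs)   # y_obs, x_obs: hypothetical data
)
reg_sim <- coda.samples(reg_model, c("a", "b", "sigma"), n.iter = 10000)

# Posterior inference from the simulated chains
chains <- as.data.frame(as.matrix(reg_sim))
mean(chains$b)                        # posterior mean estimate of the slope
quantile(chains$b, c(0.025, 0.975))   # 95% credible interval for the slope
mean(chains$b > 0)                    # posterior probability that the slope is positive

# Posterior predictive draws for a new observation at X = x_new
x_new <- 10
y_pred <- rnorm(nrow(chains), chains$a + chains$b * x_new, chains$sigma)
```

Because each predictive draw uses a different (a, b, sigma) triple from the chains, y_pred reflects both parameter uncertainty and sampling variability.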

Datasets

Sleep study data

Collaborators

Chester Ismay
Nick Solomon
Eunkyung Park
Alicia Johnson

Associate Professor, Macalester College


Join over 15 million learners and start Bayesian Modeling with RJAGS today!
