Missing piece in a jigsaw.
Bayesian statistics
Apr 26, 2021 | 11 min read

A Bayesian Perspective on Missing Data Imputation

This lecture discusses some approaches to handling missing data, primarily the case where data are missing completely at random. We discuss a procedure, multiple imputation by chained equations (MICE), which uses Gibbs sampling to create multiple "copies" of filled-in datasets.
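To make the chained-equations idea concrete, here is a minimal single-variable sketch (illustrative only, not the lecture's implementation): regress the incomplete variable on a fully observed one using the complete cases, then draw each missing value as prediction plus noise.

```python
import random

def impute_regression(x, y, seed=0):
    """Toy version of one chained-equations step: regress y on x using
    the complete cases, then draw each missing y as prediction + noise.
    Full MICE cycles such steps over every variable with missingness and
    repeats the whole procedure to create several completed datasets."""
    rng = random.Random(seed)
    obs = [i for i, yi in enumerate(y) if yi is not None]
    miss = [i for i, yi in enumerate(y) if yi is None]
    xo, yo = [x[i] for i in obs], [y[i] for i in obs]
    n = len(obs)
    xbar, ybar = sum(xo) / n, sum(yo) / n
    # ordinary least squares fit on the observed pairs
    beta = sum((a - xbar) * (b - ybar) for a, b in zip(xo, yo)) / \
        sum((a - xbar) ** 2 for a in xo)
    alpha = ybar - beta * xbar
    resid_var = sum((b - alpha - beta * a) ** 2
                    for a, b in zip(xo, yo)) / (n - 2)
    filled = list(y)
    for i in miss:
        # draw, rather than plug in, so imputations reflect uncertainty
        filled[i] = alpha + beta * x[i] + rng.gauss(0.0, resid_var ** 0.5)
    return filled
```

Drawing the imputations (rather than plugging in the fitted values) is what lets the multiple completed datasets reflect imputation uncertainty.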

Sparrow with a twig.
Bayesian statistics
Apr 19, 2021 | 8 min read

Bayesian Generalized Linear Models

This lecture discusses a simple logistic regression model for predicting a binary variable. GLMs are necessary when the response variable cannot be modeled appropriately by a normal distribution, and use a link function to connect parameters of the response distribution to the covariates.
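As a taste of the link-function idea, logistic regression maps a linear predictor to a probability via the inverse logit. The coefficients below are made-up numbers for illustration:

```python
import math

def inv_logit(eta):
    """Inverse of the logit link: maps a linear predictor in
    (-inf, inf) to a success probability in (0, 1)."""
    return 1.0 / (1.0 + math.exp(-eta))

# hypothetical fitted coefficients for a single covariate
beta0, beta1 = -1.0, 0.5
prob = inv_logit(beta0 + beta1 * 4.0)  # P(y = 1 | x = 4)
```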

Lego heads.
Bayesian statistics
Apr 12, 2021 | 18 min read

Penalized Linear Regression and Model Selection

This lecture covers some Bayesian connections to penalized regression methods such as ridge regression and the LASSO. Further discussion of the posterior predictive distribution, as well as model selection criteria such as the deviance information criterion (DIC), is included.
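The Bayesian connection can be seen in one line of algebra: the ridge estimate is the posterior mode under a normal prior on the coefficient, with the penalty weight playing the role of a noise-to-prior variance ratio. A toy one-covariate sketch (data values are made up):

```python
def ridge_slope(x, y, lam):
    """Ridge estimate for a single centered covariate with no intercept:
    argmin_b sum (y - b*x)^2 + lam * b^2. This is also the posterior
    mode under a Normal(0, sigma^2 / lam) prior on the slope."""
    return sum(a * b for a, b in zip(x, y)) / (sum(a * a for a in x) + lam)

# as lam -> 0 we recover least squares; larger lam shrinks toward 0
x = [-2.0, -1.0, 0.0, 1.0, 2.0]
y = [-4.1, -1.9, 0.2, 2.1, 3.9]
b_ols = ridge_slope(x, y, 0.0)
b_ridge = ridge_slope(x, y, 5.0)
```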

Curvy road in Sedona, Arizona.
Bayesian statistics
Apr 5, 2021 | 18 min read

Bayesian Linear Regression

This lecture covers Bayesian linear regression, which relates covariates to a continuous response variable. The main difference from traditional approaches is the specification of prior distributions for the regression parameters. The Bayesian approach also provides a fairly intuitive way to add random effects (such as a random intercept or random slope), which results in what is traditionally known as a linear mixed model.
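For intuition, here is the conjugate posterior for a single slope with a centered covariate, no intercept, and known noise variance; the prior and the hyperparameter defaults are illustrative, not from the lecture:

```python
def slope_posterior(x, y, sigma_sq=1.0, tau_sq=10.0):
    """Posterior of a single regression slope (centered covariate, no
    intercept, known noise variance sigma_sq): with prior
    beta ~ Normal(0, tau_sq), the posterior is Normal(m, v), where the
    prior precision and the data precision simply add up."""
    prec = 1.0 / tau_sq + sum(a * a for a in x) / sigma_sq
    m = (sum(a * b for a, b in zip(x, y)) / sigma_sq) / prec
    return m, 1.0 / prec

x = [-2.0, -1.0, 0.0, 1.0, 2.0]
y = [2.0 * a for a in x]        # data generated with slope exactly 2
m, v = slope_posterior(x, y)    # posterior mean is shrunk slightly below 2
```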

Cherry tree.
Bayesian statistics
Mar 29, 2021 | 18 min read

Hierarchical Models

Hierarchical models accommodate data that are grouped or have multiple levels; the main feature is the addition of a between-group layer which relates groups to each other. This layer pulls group-level parameters toward each other, producing the important properties of partial pooling and shrinkage.
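The shrinkage effect can be written down directly for a normal hierarchical model: conditional on the variance components, each group mean is a precision-weighted average of the group's sample mean and the overall mean. A small sketch with made-up numbers:

```python
def shrunk_mean(ybar_j, n_j, mu, sigma_sq, tau_sq):
    """Conditional posterior mean of a group effect in a normal
    hierarchical model: a precision-weighted compromise between the
    group's own sample mean ybar_j and the overall mean mu. Groups
    with little data are pulled harder toward mu."""
    w = (n_j / sigma_sq) / (n_j / sigma_sq + 1.0 / tau_sq)
    return w * ybar_j + (1.0 - w) * mu

# a small group (n=2) shrinks more than a large one (n=50)
small = shrunk_mean(10.0, 2, mu=0.0, sigma_sq=1.0, tau_sq=1.0)
large = shrunk_mean(10.0, 50, mu=0.0, sigma_sq=1.0, tau_sq=1.0)
```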

San Francisco 2077.
Bayesian statistics
Mar 22, 2021 | 17 min read

Metropolis-Hastings Algorithms

This lecture discusses the Metropolis and Metropolis-Hastings algorithms, two more tools for sampling from the posterior distribution when we do not have it in closed form. These are used when we are unable to obtain full conditional distributions. MCMC for the win!
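The core of the Metropolis algorithm fits in a few lines. Here is a minimal random-walk sketch targeting a toy standard-normal "posterior" known only up to a constant (illustrative, not the lecture's code):

```python
import math
import random

def metropolis(log_target, x0, n_samples, step=1.0, seed=0):
    """Random-walk Metropolis: propose x' ~ Normal(x, step) and accept
    with probability min(1, p(x') / p(x)). Only the log target density
    up to a constant is needed, not the normalized posterior."""
    rng = random.Random(seed)
    x, log_p = x0, log_target(x0)
    samples = []
    for _ in range(n_samples):
        prop = x + rng.gauss(0.0, step)
        log_p_prop = log_target(prop)
        # the proposal is symmetric, so the Hastings correction cancels
        if math.log(rng.random()) < log_p_prop - log_p:
            x, log_p = prop, log_p_prop
        samples.append(x)
    return samples

# toy "posterior": a standard normal, known only up to its constant
draws = metropolis(lambda x: -0.5 * x * x, x0=0.0, n_samples=20000)
```

With an asymmetric proposal, the acceptance ratio picks up the extra Hastings correction term, which is exactly what distinguishes Metropolis-Hastings from plain Metropolis.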

Two ducks.
Bayesian statistics
Mar 15, 2021 | 17 min read

The Normal Model in a Two Parameter Setting

This lecture discusses Bayesian inference of the normal model, particularly the case where we are interested in joint posterior inference of the mean and variance simultaneously. We discuss approaches to prior specification, and introduce the Gibbs sampler as a way to generate posterior samples when the full conditional distributions of the parameters are available in closed form.
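A minimal Gibbs sampler for this model, under the common semi-conjugate prior (normal on the mean, inverse-gamma on the variance); the hyperparameter defaults below are illustrative:

```python
import math
import random

def gibbs_normal(y, mu0=0.0, tau0_sq=100.0, a=1.0, b=1.0,
                 n_iter=5000, seed=0):
    """Gibbs sampler for (mu, sigma^2) under a semi-conjugate prior:
    mu ~ Normal(mu0, tau0_sq) and sigma^2 ~ Inverse-Gamma(a, b).
    Alternates draws from the two full conditional distributions."""
    rng = random.Random(seed)
    n, ybar = len(y), sum(y) / len(y)
    mu, sigma_sq = ybar, 1.0
    mus, sigmas = [], []
    for _ in range(n_iter):
        # full conditional of mu: normal, a precision-weighted average
        prec = 1.0 / tau0_sq + n / sigma_sq
        mu = rng.gauss((mu0 / tau0_sq + n * ybar / sigma_sq) / prec,
                       math.sqrt(1.0 / prec))
        # full conditional of sigma^2: inverse-gamma
        ss = sum((yi - mu) ** 2 for yi in y)
        sigma_sq = 1.0 / rng.gammavariate(a + n / 2.0,
                                          1.0 / (b + ss / 2.0))
        mus.append(mu)
        sigmas.append(sigma_sq)
    return mus, sigmas
```

(`random.gammavariate` takes a scale parameter, so the reciprocal of a Gamma(shape, 1/rate) draw gives the inverse-gamma draw.)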

Dominoes (more importantly, numbers).
Bayesian statistics
Feb 22, 2021 | 10 min read

Bayesian Inference for the Normal Model

The normal distribution has two parameters, but we focus on the one-parameter setting in this lecture. We also introduce the posterior predictive check as a way to assess model fit, and briefly discuss the issue with improper prior distributions.
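As a sketch of the posterior predictive check in this one-parameter setting (known variance and a flat prior on the mean, both assumptions made here for illustration): draw the mean from its posterior, simulate replicated data, and record how often a replicated statistic exceeds the observed one.

```python
import random

def ppc_pvalue(y, sigma=1.0, n_rep=2000, stat=max, seed=0):
    """Posterior predictive check for a normal model with known sigma
    and a flat prior on the mean: repeatedly draw mu from its posterior
    Normal(ybar, sigma^2 / n), simulate a replicated dataset, and count
    how often the replicated statistic exceeds the observed one.
    p-values near 0 or 1 signal model misfit."""
    rng = random.Random(seed)
    n = len(y)
    ybar = sum(y) / n
    t_obs = stat(y)
    exceed = 0
    for _ in range(n_rep):
        mu = rng.gauss(ybar, sigma / n ** 0.5)          # posterior draw
        y_rep = [rng.gauss(mu, sigma) for _ in range(n)]
        if stat(y_rep) >= t_obs:
            exceed += 1
    return exceed / n_rep
```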

The lottery.
Bayesian statistics
Feb 15, 2021 | 8 min read

Monte Carlo Sampling

This lecture discusses Monte Carlo approximations of the posterior distribution and summaries computed from it. While this might not seem especially useful yet, it underlies some of the key computational methods for Bayesian inference that we will discuss further.
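A tiny example of the idea: in the conjugate beta-binomial model the posterior is known exactly, so we can check that Monte Carlo draws recover its summaries (the data values below are made up for illustration):

```python
import random

# made-up data: y = 7 heads in n = 10 flips with a Beta(1, 1) prior
# gives posterior theta | y ~ Beta(8, 4); we approximate its summaries
# by sampling instead of integration
rng = random.Random(0)
draws = sorted(rng.betavariate(8, 4) for _ in range(10000))

post_mean = sum(draws) / len(draws)   # exact posterior mean is 8 / 12
lo, hi = draws[249], draws[9749]      # central 95% credible interval
```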

Superman logo.
Data mining
Feb 12, 2021 | 4 min read

Logistic Regression

In linear regression, the function learned is used to estimate the value of the target $y$ using values of input $x$. While it could be used for …