We introduce consistency, a concept concerning the convergence of estimators. Starting from the convergence of non-random number sequences, we move to convergence in probability, and then to the consistency of estimators and its properties.
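The idea behind consistency can be seen in a short simulation. A minimal sketch, assuming the standard example of the sample mean of Uniform(0, 1) draws: as the sample size $n$ grows, the sample mean concentrates around the true mean $\mu = 0.5$, illustrating convergence in probability (the seed and sample sizes below are arbitrary choices for illustration).

```python
import numpy as np

rng = np.random.default_rng(0)
true_mean = 0.5  # mean of Uniform(0, 1)

# Consistency of the sample mean: P(|X̄_n - μ| > ε) → 0 as n → ∞.
# Record the absolute error of X̄_n for increasing sample sizes.
errors = {}
for n in (10, 1_000, 100_000):
    sample = rng.uniform(0.0, 1.0, size=n)
    errors[n] = abs(sample.mean() - true_mean)
```

With a large sample, the error is tiny (here the standard deviation of the sample mean at $n = 100{,}000$ is about $0.289/\sqrt{100{,}000} \approx 0.0009$), which is the practical face of consistency.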
Functions of Random Variables
Finding the distribution of a real-valued function of one or more random variables, using three approaches: the method of distribution functions, the method of transformations, and the method of moment-generating functions.
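The method of distribution functions can be checked numerically. A minimal sketch, assuming the textbook example $Y = -\ln U$ with $U \sim \text{Uniform}(0,1)$: the method gives $F_Y(y) = P(-\ln U \le y) = P(U \ge e^{-y}) = 1 - e^{-y}$, i.e. $Y \sim \text{Exp}(1)$, and a simulation should match this CDF (sample size and evaluation points are arbitrary).

```python
import math
import numpy as np

rng = np.random.default_rng(1)
u = rng.uniform(0.0, 1.0, size=200_000)
y = -np.log(u)  # the transformed variable Y = -ln(U)

# Compare the empirical CDF of Y with the theoretical CDF
# F_Y(y) = 1 - e^{-y} derived by the method of distribution functions.
gaps = {
    point: abs((y <= point).mean() - (1 - math.exp(-point)))
    for point in (0.5, 1.0, 2.0)
}
```

The empirical and theoretical CDFs agree to within sampling error, confirming the derived distribution.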
Multivariate Probability Distributions
Joint probability distributions of two or more random variables defined on the same sample space. Also covers independence, conditional expectation, and the law of total expectation.
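These ideas can be made concrete with a small discrete joint distribution. A minimal sketch with hypothetical probabilities (the pmf values below are made up for illustration): from a joint pmf we compute a marginal, the conditional expectation $E[Y \mid X = x]$, and verify the law of total expectation $E[Y] = \sum_x E[Y \mid X = x]\,P(X = x)$.

```python
# Hypothetical joint pmf of (X, Y) on {0, 1} x {0, 1}
pmf = {
    (0, 0): 0.10, (0, 1): 0.20,
    (1, 0): 0.30, (1, 1): 0.40,
}

# Marginal distribution of X: p_X(x) = sum over y of p(x, y)
pX = {x: sum(p for (xx, _), p in pmf.items() if xx == x) for x in (0, 1)}

# Conditional expectation E[Y | X = x] = sum_y y * p(x, y) / p_X(x)
cond_EY = {x: sum(y * pmf[(x, y)] for y in (0, 1)) / pX[x] for x in (0, 1)}

# Law of total expectation: E[Y] = sum_x E[Y | X = x] * p_X(x)
EY_total = sum(cond_EY[x] * pX[x] for x in (0, 1))

# Direct computation of E[Y] from the joint pmf, for comparison
EY_direct = sum(y * p for (_, y), p in pmf.items())
```

Here both routes give $E[Y] = 0.6$, and since $p(0,1) = 0.20 \ne p_X(0)\,p_Y(1) = 0.3 \times 0.6 = 0.18$, this particular $X$ and $Y$ are not independent.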
In this chapter we introduce linear models. We use the ordinary least squares (OLS) estimator to obtain unbiased estimates of the unknown parameters. $R^2$ is introduced as a measure of goodness of fit, and the different types of sums of squares in a linear model are briefly discussed.
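The pieces above can be sketched end to end. A minimal example on simulated data (the data-generating model $y = 2 + 3x + \varepsilon$, noise level, and sample size are assumptions for illustration): fit OLS via `numpy.linalg.lstsq`, then compute the total, error, and regression sums of squares and $R^2 = 1 - \text{SSE}/\text{SST}$.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 200
x = rng.uniform(-1.0, 1.0, size=n)
# Hypothetical data-generating model: y = 2 + 3x + Normal(0, 0.5) noise
y = 2.0 + 3.0 * x + rng.normal(0.0, 0.5, size=n)

# Design matrix with an intercept column; OLS solves min ||y - Xb||^2,
# giving the estimator (X'X)^{-1} X'y.
X = np.column_stack([np.ones(n), x])
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)

y_hat = X @ beta_hat
sst = np.sum((y - y.mean()) ** 2)  # total sum of squares
sse = np.sum((y - y_hat) ** 2)     # error (residual) sum of squares
ssr = sst - sse                    # regression sum of squares
r2 = 1.0 - sse / sst               # coefficient of determination
```

The decomposition SST = SSR + SSE holds by construction, and with this noise level the estimates land close to the true coefficients $(2, 3)$.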