0. Logistical Info
- Section date: 10/18
- Associated lecture: 10/12
- Associated pset: Pset 5, due 10/20
- Office hours on 10/18 from 7-9pm at Quincy Dining Hall
- Remember to fill out the attendance form
0.1 Summary + Practice Problem PDFs
Summary + Practice Problems PDF
Practice Problem Solutions PDF
1. Continuous Random Variables
A continuous random variable has an interval for its support.
- More precisely: a continuous random variable has an uncountable support, while discrete random variables have finite/countably infinite supports.
We heavily lean on the cumulative distribution function (CDF): for any random variable $X$, the CDF $F: \mathbb R \to [0, 1]$ is defined by $F(x) = P(X \le x)$.
We don’t use the probability mass function (PMF) anymore, because $P(X = x) = 0$ for every $x$. (This probability $0$ does not mean “impossible,” though.) We instead use the probability density function (PDF), defined as the derivative of the CDF: $f(x) = F'(x)$. Intuitively, $f(x)\,dx \approx P(x < X \le x + dx)$ for small $dx$, so the density tells us how probability accumulates, even though no single point carries positive probability.
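As a quick numerical sanity check of the PDF-as-derivative-of-CDF relationship, here is a sketch (the Exponential(1) distribution is just an illustrative choice, not from the section) comparing a finite-difference derivative of the CDF to the PDF:

```python
from scipy import stats

# Illustrative choice: X ~ Exponential(1)
X = stats.expon()
x, h = 1.3, 1e-6

# Centered finite-difference approximation of F'(x)
deriv = (X.cdf(x + h) - X.cdf(x - h)) / (2 * h)

# The approximation should closely match the PDF f(x) = F'(x)
print(deriv, X.pdf(x))
```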
1.1 Uses of CDFs and PDFs
For any random variable $X$ (continuous or discrete), you can use the CDF to calculate the following: \begin{align*} P(X > x) &= 1 - P(X \le x) = 1-F(x)\\ P(x_1 < X \le x_2) &= P(X \le x_2) - P(X \le x_1) = F(x_2) - F(x_1) \end{align*} For the CDFs of continuous random variables,
- You can assume the CDF is differentiable.
- $P(X \le x) = P(X < x)$, so you can swap out $\le$ and $<$ in calculations like the above.
For a continuous random variable, we can find the probabilities of intervals by integrating the PDF over the appropriate bounds (we integrate over a dummy variable $t$ so it doesn’t clash with the limit $x$): \begin{align*} P(X \le x) = P(X < x) &= \int_{-\infty}^x f(t)\, dt\\ P(X \ge x) = P(X > x) &= \int_{x}^{\infty} f(t)\, dt\\ P(x_1 < X < x_2) &= \int_{x_1}^{x_2} f(t)\, dt. \end{align*}
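The two routes to an interval probability (integrating the PDF, or differencing the CDF) should always agree. Here is a hedged sketch checking that for $P(x_1 < X < x_2)$, again using an Exponential(1) as an arbitrary example distribution:

```python
from scipy import stats
from scipy.integrate import quad

# Illustrative choice: X ~ Exponential(1)
X = stats.expon()
x1, x2 = 0.5, 2.0

# Route 1: integrate the PDF over (x1, x2)
p_integral, _ = quad(X.pdf, x1, x2)

# Route 2: difference of CDF values, F(x2) - F(x1)
p_cdf = X.cdf(x2) - X.cdf(x1)

print(p_integral, p_cdf)  # both equal e^{-0.5} - e^{-2}
```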
1.2 Continuous analogs of all of our tools
The general rules are:
- Integrals instead of sums
- PDFs instead of PMFs
So here’s a table with the tools we’ve talked about:
Tool | Discrete | Continuous |
---|---|---|
Expectation | $E(X) = \sum_{x} x P(X = x)$ | $E(X) = \int_{-\infty}^{\infty} x f_X(x) dx$ |
LOTUS | $E(g(X)) = \sum_x g(x) P(X = x)$ | $E(g(X)) = \int_{-\infty}^\infty g(x) f_X(x) dx$ |
Bayes’ rule | $P(X = x \vert Y = y) = \frac{P(Y=y \vert X = x) P(X = x)}{P(Y = y)}$ | $f_{X\vert Y=y}(x) = \frac{f_{Y\vert X=x}(y) f_X(x)}{f_Y(y)}$ |
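The continuous LOTUS row of the table can be checked numerically. This sketch (the choices $X \sim \mathrm{Unif}(0,1)$ and $g(x) = x^2$ are mine, for illustration) computes $E(g(X)) = \int g(x) f_X(x)\, dx$ by numerical integration and compares it against a simulation:

```python
import numpy as np
from scipy.integrate import quad

# Illustrative choices: X ~ Unif(0, 1), g(x) = x^2
f = lambda x: 1.0      # PDF of Unif(0, 1) on its support
g = lambda x: x ** 2

# LOTUS: E(g(X)) = integral of g(x) f(x) dx over the support
lotus, _ = quad(lambda x: g(x) * f(x), 0, 1)   # exact value: 1/3

# Simulation check: average g over a million draws of X
rng = np.random.default_rng(0)
sim = g(rng.uniform(0, 1, 10**6)).mean()
print(lotus, sim)
```

Note that LOTUS lets us skip finding the distribution of $g(X)$ entirely; we only ever integrate against the density of $X$ itself.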
2. Uniform
For any interval $(a, b)$, we can define a uniform distribution with that support, denoted $U \sim \mathrm{Unif}(a, b)$. A uniform distribution is the one with a constant PDF over its support. (There is no uniform distribution whose support is the full real line, since a constant density cannot integrate to $1$ over all of $\mathbb R$.) \begin{align*} f_U(x) &= \begin{cases} \frac{1}{b-a} & x \in (a, b)\\ 0 & x \notin (a, b) \end{cases}\\ F_U(x) = P(U \le x) &= \begin{cases} 0 & x \le a\\ \frac{x-a}{b-a} & x \in (a, b)\\ 1 & x \ge b \end{cases} \end{align*}
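These formulas are easy to verify with `scipy.stats.uniform`; note its parameterization is `loc` $= a$ and `scale` $= b - a$, not $(a, b)$ directly. A sketch with the example interval $(2, 5)$ (my choice, for illustration):

```python
from scipy import stats

# Illustrative interval: U ~ Unif(2, 5)
a, b = 2.0, 5.0
U = stats.uniform(loc=a, scale=b - a)  # loc = a, scale = b - a

x = 3.5
print(U.pdf(x))              # constant density 1/(b-a) = 1/3 on (a, b)
print(U.cdf(x))              # (x-a)/(b-a) = 1.5/3 = 0.5
print(U.cdf(1.0), U.cdf(6))  # 0.0 below a, 1.0 above b
```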