0. Logistical Info
- Section date: 11/1
- Associated lectures: 10/24, 10/26
- Associated pset: Pset 7, due 11/3
- Office hours on 11/1 from 7-9pm at Quincy Dining Hall
- Remember to fill out the attendance form
0.1 Summary + Practice Problem PDFs
- Summary + Practice Problems PDF
- Practice Problem Solutions PDF
1. Moment Generating Functions (MGFs)
For a random variable $X$, the moment generating function (MGF) of $X$ is $M_X(t) = E[e^{tX}]$, as a function of $t$ (provided this is finite on some open interval around $0$). The name comes from the fact that the derivatives of the MGF at $0$ give the moments of $X$: $E[X^n] = M_X^{(n)}(0)$.
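As a quick illustration (not part of the original notes), the sketch below uses sympy to recover the first few moments of an $\text{Expo}(\lambda)$ random variable by differentiating its MGF $M_X(t) = \lambda/(\lambda - t)$ at $t = 0$; the exponential distribution here is just a concrete choice.

```python
# Hypothetical illustration: moments of X ~ Expo(lam) from derivatives of its MGF.
import sympy as sp

t, lam = sp.symbols("t lam", positive=True)
M = lam / (lam - t)  # MGF of Expo(lam), valid for t < lam

for n in range(1, 4):
    # E[X^n] = n-th derivative of the MGF evaluated at t = 0
    print(n, sp.simplify(sp.diff(M, t, n).subs(t, 0)))  # 1/lam, 2/lam**2, 6/lam**3
```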
Useful MGF results:
- For independent random variables $X$ and $Y$ with MGFs $M_X(t)$ and $M_Y(t)$, the MGF of their sum is $M_{X+Y}(t) = M_X(t) M_Y(t)$.
- For a random variable $X$ and scalars $a, b$, we have $M_{aX+b}(t) = e^{bt} M_X(at)$, since $E[e^{t(aX+b)}] = e^{bt} E[e^{(at)X}]$.
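For instance, the first result can be checked symbolically: if $X \sim \text{Pois}(\lambda_1)$ and $Y \sim \text{Pois}(\lambda_2)$ are independent, the product of their MGFs is the MGF of $\text{Pois}(\lambda_1 + \lambda_2)$, so $X + Y \sim \text{Pois}(\lambda_1 + \lambda_2)$. The sketch below (my own check, using the known Poisson MGF $e^{\lambda(e^t - 1)}$) verifies this with sympy.

```python
# Check that the product of two Poisson MGFs is the MGF of the summed rate.
import sympy as sp

t, l1, l2 = sp.symbols("t l1 l2", positive=True)
pois_mgf = lambda lam: sp.exp(lam * (sp.exp(t) - 1))  # MGF of Pois(lam)

product = pois_mgf(l1) * pois_mgf(l2)          # MGF of X + Y (by independence)
print(product.equals(pois_mgf(l1 + l2)))       # True, so X + Y ~ Pois(l1 + l2)
```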
A distribution is uniquely determined by any of the following:
- PMF (common for discrete),
- PDF,
- CDF (common for continuous),
- MGF, or
- matching to a named distribution (common).
2. Poisson Processes
Consider a problem similar to Blissville/Blotchville, where
- For any interval of time of length $t$, the number of arrivals in that interval is distributed $\text{Pois}(\lambda t)$.
- For any non-overlapping (disjoint) intervals of time, the numbers of bus arrivals are independent.
This applies to any “arrival process” where the points of the process correspond to arrival times.
Results:
- Inter-arrival times: In a Poisson process with rate $\lambda$, the inter-arrival times (the time of the first arrival, $X_1$, and the times between consecutive arrivals, $X_2, X_3, \dots$) are i.i.d., each distributed $\text{Expo}(\lambda)$.
- Count-time duality: Fix a time $t$. Let $N_t$ be the number of arrivals in the time interval $[0, t]$, and let $T_n$ be the arrival time of the $n$-th arrival. Then $T_n \le t$ if and only if $N_t \ge n$: the $n$-th arrival happens by time $t$ exactly when there are at least $n$ arrivals in $[0, t]$. Here $N_t \sim \text{Pois}(\lambda t)$ and $T_n \sim \text{Gamma}(n, \lambda)$.
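A simulation sketch of these results (my own illustration; the rate, time, and arrival index below are arbitrary): build the process from i.i.d. $\text{Expo}(\lambda)$ inter-arrival times, then check that $E[N_t] \approx \lambda t$ and that the events $\{T_n \le t\}$ and $\{N_t \ge n\}$ agree.

```python
import numpy as np

rng = np.random.default_rng(0)
rate, t, n = 2.0, 3.0, 5           # rate lambda, fixed time t, fixed arrival index n
trials = 100_000

# Arrival times are cumulative sums of i.i.d. Expo(rate) inter-arrival times.
interarrivals = rng.exponential(scale=1 / rate, size=(trials, 50))
arrival_times = np.cumsum(interarrivals, axis=1)

N_t = (arrival_times <= t).sum(axis=1)   # number of arrivals in [0, t]
T_n = arrival_times[:, n - 1]            # time of the n-th arrival

print(N_t.mean(), rate * t)                    # E[N_t] = lambda * t
print((T_n <= t).mean(), (N_t >= n).mean())    # same event: T_n <= t  <=>  N_t >= n
```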
3. Marginal, Conditional, and Joint Distributions
Consider two random variables $X$ and $Y$.
| | Joint | Marginal | Conditional |
|---|---|---|---|
| Distribution | distribution of $(X, Y)$ together | distribution of $X$ alone | distribution of $Y$ given the value of $X$ |
| PMF | $P(X = x, Y = y)$ | $P(X = x)$ | $P(Y = y \mid X = x)$ |
| CDF | $P(X \le x, Y \le y)$ | $P(X \le x)$ | $P(Y \le y \mid X = x)$ |
These distributions are related by the following results (illustrated numerically in the sketch after this list):
- Marginalization:
If we know the joint distribution of random variables $X$ and $Y$, then we can find the marginal distribution of $X$ (and analogously, of $Y$) by LOTP:
$$P(X = x) = \sum_y P(X = x, Y = y) \quad \text{or} \quad f_X(x) = \int_{-\infty}^{\infty} f_{X,Y}(x, y)\,dy.$$
- Joint from marginal and conditional:
If we know the marginal distribution of $X$ and the conditional distributions of $Y$ given $X = x$ for any $x$, then we can find the joint distribution of $(X, Y)$ by factoring the probability:
$$P(X = x, Y = y) = P(Y = y \mid X = x)\, P(X = x).$$
- 2D LOTUS:
Let $X, Y$ be random variables with known joint distribution. For a function $g$, LOTUS extends to 2 dimensions (or analogously to any larger number of dimensions) to give
$$E[g(X, Y)] = \sum_x \sum_y g(x, y)\, P(X = x, Y = y) \quad \text{or} \quad E[g(X, Y)] = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} g(x, y)\, f_{X,Y}(x, y)\, dx\, dy.$$
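The following numerical sketch (a toy joint PMF I made up, not from the pset) illustrates all three results for discrete random variables: marginalizing via LOTP, recovering the joint from marginal times conditional, and computing $E[XY]$ via 2D LOTUS.

```python
import numpy as np

x_vals = np.array([0, 1, 2])
y_vals = np.array([0, 1])
# joint[i, j] = P(X = x_vals[i], Y = y_vals[j]); entries sum to 1.
joint = np.array([[0.10, 0.20],
                  [0.25, 0.15],
                  [0.20, 0.10]])

# Marginalization (LOTP): sum the joint PMF over the other variable.
p_x = joint.sum(axis=1)                   # P(X = x)
cond_y_given_x = joint / p_x[:, None]     # P(Y = y | X = x)

# Joint from marginal and conditional: P(X = x, Y = y) = P(Y = y | X = x) P(X = x).
print(np.allclose(cond_y_given_x * p_x[:, None], joint))   # True

# 2D LOTUS with g(x, y) = x * y: E[XY] = sum_x sum_y x y P(X = x, Y = y).
print((np.outer(x_vals, y_vals) * joint).sum())            # 0.55
```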
4. Covariance and Correlation
The correlation of random variables $X$ and $Y$ is $\text{Corr}(X, Y) = \frac{\text{Cov}(X, Y)}{\sqrt{\text{Var}(X)\,\text{Var}(Y)}}$, where $\text{Cov}(X, Y) = E[(X - E[X])(Y - E[Y])] = E[XY] - E[X]E[Y]$ is their covariance. We say that $X$ and $Y$ are
- positively correlated if $\text{Corr}(X, Y) > 0$,
- negatively correlated if $\text{Corr}(X, Y) < 0$,
- uncorrelated if $\text{Corr}(X, Y) = 0$.
Since correlation and covariance have the same sign, this also applies for positive/negative/zero covariance.
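As a quick sanity check (my own simulated example, not from the notes): if $Y = X + \text{noise}$, then $X$ and $Y$ are positively correlated, while an independent $Z$ is uncorrelated with $X$.

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(size=200_000)
y = x + rng.normal(size=200_000)   # positively correlated with x
z = rng.normal(size=200_000)       # independent of x, hence uncorrelated

print(np.cov(x, y)[0, 1], np.corrcoef(x, y)[0, 1])   # Cov ~ 1, Corr ~ 1/sqrt(2)
print(np.cov(x, z)[0, 1], np.corrcoef(x, z)[0, 1])   # both ~ 0
```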
Properties of covariance: see page 327 in Blitzstein & Hwang for the full list. Let $X, Y, Z$ be random variables and $a$ be a constant.
- If $X$ and $Y$ are independent, then $\text{Cov}(X, Y) = 0$ (so $X$ and $Y$ are uncorrelated).
- $\text{Cov}(X, X) = \text{Var}(X)$.
- $\text{Var}(X + Y) = \text{Var}(X) + \text{Var}(Y) + 2\,\text{Cov}(X, Y)$.
  - This can be especially useful for finding the variance of a sum of indicators.
- $\text{Cov}(aX, Y) = a\,\text{Cov}(X, Y)$.
- $\text{Cov}(X + Y, Z) = \text{Cov}(X, Z) + \text{Cov}(Y, Z)$.
The last two properties are referred to as bilinearity.
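To illustrate the variance-of-a-sum-of-indicators idea (an example of my own choosing, not from the pset): let $X$ be the number of hearts in a 5-card hand, so $X = I_1 + \dots + I_5$ with dependent indicators. The sketch below combines $\text{Var}(I_j)$ and the pairwise covariances using the properties above, and checks the answer by simulation.

```python
import numpy as np

p = 13 / 52                       # P(I_j = 1): the j-th card is a heart
p_both = (13 * 12) / (52 * 51)    # P(I_i = 1, I_j = 1) for i != j (no replacement)
k = 5

var_I = p * (1 - p)               # Var(I_j) for a single indicator
cov_IJ = p_both - p ** 2          # Cov(I_i, I_j), slightly negative here
var_sum = k * var_I + 2 * (k * (k - 1) / 2) * cov_IJ
print(var_sum)                    # Var(X) = sum of variances + 2 * sum of pairwise covariances

# Simulation check: deal 5 cards without replacement, count hearts.
rng = np.random.default_rng(2)
deck = np.array([1] * 13 + [0] * 39)    # 1 = heart
counts = np.array([rng.choice(deck, size=k, replace=False).sum() for _ in range(50_000)])
print(counts.var())                     # close to the formula above (about 0.864)
```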
Properties of correlation
Let $X, Y$ be random variables and $a, b, c, d$ be constants with $a, c > 0$.
- $-1 \le \text{Corr}(X, Y) \le 1$.
- $\text{Corr}(aX + b, cY + d) = \text{Corr}(X, Y)$ (correlation is unaffected by location and scale).
- If $X$ and $Y$ are independent, then $\text{Corr}(X, Y) = 0$ (so $X$ and $Y$ are uncorrelated).