Some probability theory and inequalities

Convergence in distribution

Let $X_1, X_2, \ldots$ and $X$ be random variables with cdfs $F_1, F_2, \ldots$ and $F$. If $F_n(x) \to F(x)$ as $n \to \infty$ at every point $x$ where $F$ is continuous, then the sequence $X_n$ is said to converge in distribution to $X$ ($X_n \to_D X$).
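
As a quick illustration (not part of the definition), the following Python sketch estimates $F_n(x)$ by simulation for the CLT case, where $X_n$ is the standardized mean of $n$ Exponential(1) draws and the limit is $X \sim N(0,1)$; the sample sizes and replication count are arbitrary choices.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

def F_n(x, n, reps=20_000):
    """Empirical cdf at x of the standardized mean of n Exp(1) draws."""
    samples = rng.exponential(1.0, size=(reps, n))
    z = np.sqrt(n) * (samples.mean(axis=1) - 1.0)  # Exp(1) has mean 1, sd 1
    return np.mean(z <= x)

x = 0.5
for n in (5, 50, 500):
    # F_n(x) should approach the standard normal cdf Phi(x)
    print(n, F_n(x, n), norm.cdf(x))
```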

Convergence in probability

Let $X_1, X_2, \ldots$ and $X$ be random variables on a common probability space. If $P(|X_n - X| > \epsilon) \to 0$ as $n \to \infty$ for all $\epsilon > 0$, the sequence $X_n$ is said to converge in probability to $X$ ($X_n \to_p X$).
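
A minimal sketch of this definition in action, assuming the weak law of large numbers setting with $X_i$ iid Uniform(0,1), so $\bar{X}_n \to_p 0.5$; the tolerance $\epsilon$ and sample sizes below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
eps, mu, reps = 0.05, 0.5, 2_000  # epsilon, true mean, Monte Carlo replications

for n in (10, 100, 1000, 5000):
    means = rng.uniform(0.0, 1.0, size=(reps, n)).mean(axis=1)
    prob = np.mean(np.abs(means - mu) > eps)  # estimates P(|X_n - mu| > eps)
    print(f"n={n:>5}  P(|mean - mu| > eps) ~ {prob:.4f}")  # shrinks towards 0
```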

Slutsky’s theorem

If $X_n \to_D X$, $A_n \to_p a$, and $B_n \to_p b$, then $A_n + B_n X_n \to_D a + bX$.
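
As an illustrative use (an assumption-laden sketch, not part of the statement above): the t-statistic $\sqrt{n}(\bar{X}_n - \mu)/S_n$ converges to $N(0,1)$, because $\sqrt{n}(\bar{X}_n - \mu) \to_D N(0, \sigma^2)$ while $1/S_n \to_p 1/\sigma$. Here the data are Exponential(1), so $\mu = \sigma = 1$.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(2)
n, reps, mu = 200, 20_000, 1.0  # Exp(1): mu = sigma = 1

x = rng.exponential(1.0, size=(reps, n))
t = np.sqrt(n) * (x.mean(axis=1) - mu) / x.std(axis=1, ddof=1)

# By Slutsky, t should behave like N(0,1); compare empirical cdf values.
for q in (-1.0, 0.0, 1.0):
    print(q, np.mean(t <= q), norm.cdf(q))
```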

Delta theorem

Suppose $\sqrt{n}(X_n - b) \to_D X$. If $g: \mathbb{R} \to \mathbb{R}$ is differentiable and $g'$ is continuous at $b$, then $\sqrt{n}(g(X_n) - g(b)) \to_D g'(b)X$.

The delta theorem ensures that a smooth reparameterisation of a maximum likelihood estimator retains the same asymptotic properties (e.g. asymptotic normality).
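
A hedged simulation check of the delta theorem, with the illustrative choice $g(x) = \log x$ applied to means of Exponential(1) draws, so $b = 1$, $g'(b) = 1$, and the limit is $N(0,1)$:

```python
import numpy as np

rng = np.random.default_rng(3)
n, reps = 500, 20_000

xbar = rng.exponential(1.0, size=(reps, n)).mean(axis=1)
z = np.sqrt(n) * (np.log(xbar) - np.log(1.0))  # g(x) = log x, b = 1

# Delta theorem: limit is g'(1) * N(0,1) = N(0,1); check first two moments.
print("mean ~ 0:", z.mean(), "  var ~ g'(1)^2 = 1:", z.var())
```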

Jensen’s inequality

Let $D$ be an interval in $\mathbb{R}$. If $\phi: D \to \mathbb{R}$ is convex, then for any random variable $X$ on $D$, $\phi(E[X]) \le E[\phi(X)]$.
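
A quick numeric sanity check (illustrative choices only): with convex $\phi(x) = x^2$ and $X \sim$ Uniform(0,1), $\phi(E[X]) = 0.25 \le E[X^2] = 1/3$.

```python
import numpy as np

rng = np.random.default_rng(4)
x = rng.uniform(0.0, 1.0, size=1_000_000)

phi = np.square  # convex on all of R
print(phi(x.mean()), np.mean(phi(x)))  # ~0.25 <= ~0.3333
```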

Cauchy-Schwarz inequality

For any two random variables $X$ and $Y$ such that $E[X^2] < \infty$ and $E[Y^2] < \infty$, $(E[XY])^2 \le E[X^2]E[Y^2]$.
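
And a similar sanity check for Cauchy-Schwarz, with the arbitrary choice $X \sim N(0,1)$ and $Y = X + Z$ for independent $Z \sim N(0,1)$ (so $E[XY] = 1$, $E[X^2] = 1$, $E[Y^2] = 2$):

```python
import numpy as np

rng = np.random.default_rng(5)
x = rng.normal(size=1_000_000)
y = x + rng.normal(size=1_000_000)  # correlated with x

lhs = np.mean(x * y) ** 2                # (E[XY])^2, expect ~1
rhs = np.mean(x ** 2) * np.mean(y ** 2)  # E[X^2]E[Y^2], expect ~2
print(lhs, "<=", rhs, lhs <= rhs)
```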