Cramer-Rao inequality

Motivation: We will eventually show that ML estimators converge in distribution at a $\sqrt{n}$ rate, subject to very general regularity conditions. I.e., if the data are iid, $\sqrt{n}(\hat{\theta}_n - \theta) \to_D N(0, V(\theta))$. So for large $n$, the MLE is approximately $\sim N(\theta, V(\theta)/n)$.

Cramer-Rao inequality for $\theta \in \mathcal{R}$

Let: $X$ have density (or pmf) $f(x; \theta)$, let $\delta(X)$ be an estimator of $\theta$, and define the Fisher information $I(\theta) = E_\theta\left[\left(\frac{\partial}{\partial \theta} \log f(X; \theta)\right)^2\right]$.

If: $\delta(X)$ is unbiased for $\theta$, and the usual regularity conditions hold (the support of $f$ does not depend on $\theta$, and differentiation under the integral sign with respect to $\theta$ is permitted).

Then: $Var_\theta(\delta) \ge \frac{1}{I(\theta)}$.

Proof: apply the Cauchy-Schwarz inequality to the covariance of $\delta(X)$ and the score $\frac{\partial}{\partial \theta} \log f(X; \theta)$, as sketched below.
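
A sketch of the standard argument, assuming differentiation under the integral sign is valid. Writing $\ell'(\theta; X) = \frac{\partial}{\partial \theta} \log f(X; \theta)$ for the score,

$$E_\theta[\ell'] = \int \frac{\partial f / \partial \theta}{f} \, f \, dx = \frac{\partial}{\partial \theta} \int f \, dx = 0, \qquad Cov_\theta(\delta, \ell') = \int \delta(x) \, \frac{\partial f}{\partial \theta} \, dx = \frac{\partial}{\partial \theta} E_\theta[\delta(X)] = 1,$$

so by Cauchy-Schwarz, $1 = Cov_\theta(\delta, \ell')^2 \le Var_\theta(\delta) \, Var_\theta(\ell') = Var_\theta(\delta) \, I(\theta)$, giving $Var_\theta(\delta) \ge 1/I(\theta)$.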

The CR inequality does not imply the existence of an unbiased estimator that achieves the lower bound, or indeed of any unbiased estimator at all.
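
A worked illustration (a standard example, not from the original notes): for $X_1, \dots, X_n$ iid Poisson$(\theta)$, the log-likelihood is $\sum_i (X_i \log\theta - \theta - \log X_i!)$, so the score is $\frac{\sum_i X_i}{\theta} - n$ and

$$I_n(\theta) = Var_\theta\!\left(\frac{\sum_i X_i}{\theta}\right) = \frac{n\theta}{\theta^2} = \frac{n}{\theta},$$

giving the lower bound $\theta/n$. Here $\bar{X}$ is unbiased with $Var_\theta(\bar{X}) = \theta/n$, so the bound is attained; in general it need not be.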

CR inequality for $g(\theta)$

If $\delta(X)$ is an unbiased estimator of $g(\theta)$, then $Var_\theta(\delta) \ge \frac{g'(\theta)^2}{I(\theta)}$. This can be proved through a minor modification of the above proof, or, when $g$ is invertible, by reparameterising the likelihood function in terms of $\zeta = g(\theta)$.
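
Continuing the Poisson illustration above (again a standard example, not from the original notes): to estimate $g(\theta) = e^{-\theta} = P_\theta(X_1 = 0)$ we have $g'(\theta) = -e^{-\theta}$, so the bound is $\frac{e^{-2\theta}}{n/\theta} = \frac{\theta e^{-2\theta}}{n}$. The unbiased estimator $\hat{g} = \left(\frac{n-1}{n}\right)^{\sum_i X_i}$ has $Var_\theta(\hat{g}) = e^{-2\theta}\left(e^{\theta/n} - 1\right) > \frac{\theta e^{-2\theta}}{n}$, and since $\hat{g}$ is in fact UMVU, no unbiased estimator attains the bound here.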

Alternative formulae
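
One standard alternative expression for the information, valid under the same regularity conditions (added here for completeness):

$$I(\theta) = E_\theta\left[\left(\frac{\partial}{\partial \theta} \log f(X; \theta)\right)^2\right] = -E_\theta\left[\frac{\partial^2}{\partial \theta^2} \log f(X; \theta)\right] = Var_\theta\!\left(\frac{\partial}{\partial \theta} \log f(X; \theta)\right).$$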

iid case

If the observations are independent, likelihoods multiply and log-likelihoods add; if they are also identically distributed, each observation contributes the same information, so $I_n(\theta) = n I_1(\theta)$ (see the sketch below).
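
A short verification, using the fact that the score of each observation has mean zero under the regularity conditions: for $X_1, \dots, X_n$ iid with density $f(x; \theta)$,

$$I_n(\theta) = Var_\theta\!\left(\frac{\partial}{\partial \theta} \sum_{i=1}^n \log f(X_i; \theta)\right) = \sum_{i=1}^n Var_\theta\!\left(\frac{\partial}{\partial \theta} \log f(X_i; \theta)\right) = n I_1(\theta),$$

where the second equality uses independence and the third uses identical distributions.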

Multiparameter case

Results are as for the single-parameter case, but in vector form: for $\theta \in \mathcal{R}^d$, the Fisher information is the $d \times d$ matrix with entries $I(\theta)_{jl} = E_\theta\left[\frac{\partial}{\partial \theta_j} \log f(X; \theta) \, \frac{\partial}{\partial \theta_l} \log f(X; \theta)\right]$, and if $\delta(X)$ is an unbiased estimator of $\theta$, then $Cov_\theta(\delta) - I(\theta)^{-1}$ is positive semi-definite.
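
As a standard illustration (not from the original notes): for a single observation $X \sim N(\mu, \sigma^2)$ with $\theta = (\mu, \sigma^2)$,

$$\log f(X; \theta) = -\tfrac{1}{2}\log(2\pi\sigma^2) - \frac{(X-\mu)^2}{2\sigma^2}, \qquad I(\theta) = \begin{pmatrix} 1/\sigma^2 & 0 \\ 0 & 1/(2\sigma^4) \end{pmatrix},$$

so, e.g., any unbiased estimator of $\mu$ based on $n$ iid observations has variance at least $(I_n(\theta)^{-1})_{11} = \sigma^2/n$, a bound attained by $\bar{X}$.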