Multiparameter models.

Classical approach: maximize a joint likelihood, proceed in steps. Bayesian approach: base inference on the marginal posterior distributions of interest, e.g. `p(theta_1 | y) = int p(theta_1, theta_2 | y) dtheta_2 = int p(theta_1 | theta_2, y) p(theta_2 | y) dtheta_2`

Integral not usually calculated directly: use a 2-step simulation instead — draw `theta_2` from `p(theta_2 | y)`, then draw `theta_1` from `p(theta_1 | theta_2, y)`.

Normal model

Sampling distribution: `y_i ~ N(mu, sigma^2)`

Non-informative prior `p(mu, sigma^2) prop sigma^(-2)`

Conjugate prior: `mu | sigma^2 ~ N(mu_0, sigma^2/kappa_0)`, `sigma^2 ~ Inv-chi^2(nu_0, sigma^2_0)`:

Both priors give posteriors of the same (normal–`Inv-chi^2`) form. Under the non-informative prior: `mu | sigma^2, y ~ N(bar y, sigma^2/n)` and `sigma^2 | y ~ Inv-chi^2(n-1, s^2)`.
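The 2-step simulation for the normal model with the non-informative prior can be sketched as follows (data and seed are made up; `Inv-chi^2(n-1, s^2)` is sampled as `(n-1) s^2 / X` with `X ~ chi^2(n-1)`):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data; any 1-D sample works.
y = rng.normal(5.0, 2.0, size=50)
n, ybar, s2 = len(y), y.mean(), y.var(ddof=1)

ndraws = 10_000

# Step 1: draw sigma^2 from its marginal posterior,
# scaled Inv-chi^2(n-1, s^2).
sigma2 = (n - 1) * s2 / rng.chisquare(n - 1, size=ndraws)

# Step 2: draw mu conditional on each sigma^2 draw:
# mu | sigma^2, y ~ N(ybar, sigma^2 / n).
mu = rng.normal(ybar, np.sqrt(sigma2 / n))

# (mu, sigma2) pairs are joint posterior draws; mu alone is a
# sample from the marginal posterior p(mu | y).
print(mu.mean(), sigma2.mean())
```

The marginal draws of `mu` center on `bar y`, and averaging over the `sigma^2` draws is exactly the integral above done by Monte Carlo.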

Semi-conjugate prior: `mu ~ N(mu_0, tau^2_0)`, `sigma^2 ~ Inv-chi^2(nu_0, sigma^2_0)`, independent. Doesn't end up nice: `mu | sigma^2, y` is still normal, but the marginal `p(sigma^2 | y)` has no standard form and must be handled numerically or by simulation.

Multinomial model

Generalisation of binomial model.

Sampling distribution: `p(y | theta) prop Pi^k_(j=1) theta_j^(y_j)`. Conjugate prior: Dirichlet, `p(theta | alpha) prop Pi^k_(j=1) theta_j^(alpha_j - 1)`. Posterior: Dirichlet with `alpha_j' = alpha_j + y_j`.
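A minimal sketch of the conjugate update (counts and prior are made up; `numpy` draws the Dirichlet directly):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical counts for k = 3 categories, n = 50 trials.
y = np.array([30, 12, 8])
alpha = np.ones(3)  # uniform Dirichlet(1, 1, 1) prior

# Conjugacy: posterior is Dirichlet(alpha + y).
theta = rng.dirichlet(alpha + y, size=10_000)

# Posterior mean of theta_j is (alpha_j + y_j) / sum_i(alpha_i + y_i).
print(theta.mean(axis=0))  # ~ [0.585, 0.245, 0.170]
```

Each row of `theta` is a draw of the full probability vector, so marginal summaries of any `theta_j` (or of contrasts like `theta_1 - theta_2`) come for free.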

Can also be represented as the product of `k` independent Poisson rvs `y_j ~ Poi(lambda_j)` conditioned on `sum_j y_j = n`; then `theta_j = lambda_j / sum_i lambda_i`.

Multivariate normal

Useful distributions and equivalences

Dirichlet–Gamma: draw `x_1, ..., x_k` independently with `x_j ~ Gamma(alpha_j, delta)` for any common rate `delta`, and set `theta_j = x_j / sum_i x_i`; then `theta ~ Dirichlet(alpha)`.

`Inv-chi^2`–`chi^2`: if `X ~ chi^2(nu)`, then `nu sigma^2_0 / X ~ Inv-chi^2(nu, sigma^2_0)`.