Additive error models
Exemplifies the idea of signal + noise: `Y = f(x, beta) + epsi`, with `epsi` distributed according to some absolutely continuous density, typically with `E(epsi) = 0`. The modelling task with these models focuses on:
- appropriate specification of `f`
- modelling of the variance of the additive errors
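A minimal sketch of fitting such a model by nonlinear least squares (the expectation function and all numbers here are my own hypothetical example, not from the notes):

```python
# Fit an additive-error model Y = f(x, beta) + epsi, E(epsi) = 0,
# using a hypothetical exponential-decay expectation function.
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(0)

def f(x, b0, b1):
    # assumed expectation function f(x, beta)
    return b0 * np.exp(-b1 * x)

x = np.linspace(0, 4, 50)
y = f(x, 2.0, 0.7) + rng.normal(0, 0.05, x.size)  # additive, mean-zero noise

beta_hat, cov = curve_fit(f, x, y, p0=[1.0, 1.0])
print(beta_hat)  # should be near the true values [2.0, 0.7]
```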
Constant variance models
“Backbone” of statistical modelling. Can use transformation if assumption violated (but often introduces other problems).
Linear and non-linear models
Define a model as non-linear if at least one of the derivatives of the expectation function with respect to `beta` depends on one or more of the parameters. Two notions of non-linearity:
- intrinsic curvature: cannot be reparameterised away; a reflection of the intrinsic curvature of the expectation surface
- parameter effects: equally spaced points in the domain of the expectation surface are mapped to unequally spaced points in the range; can be reduced by reparameterisation
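A small sketch of the parameter-effects idea, under my own choice of model (a one-parameter exponential evaluated at a single design point, so any curvature here is purely parameter effects, not intrinsic):

```python
# Parameter-effects curvature: equally spaced parameter values map to
# unequally spaced points on the expectation surface, and a
# reparameterisation can remove this.
import numpy as np

# model f(x, beta) = exp(beta * x), evaluated at x = 1
beta = np.linspace(0.0, 1.0, 5)      # equally spaced in beta
f_beta = np.exp(beta)                # gaps grow: unequal spacing in the range
print(np.diff(f_beta))

# reparameterise phi = exp(beta); in phi, f = phi at x = 1
phi = np.linspace(1.0, np.e, 5)      # equally spaced in phi
f_phi = phi                          # spacing preserved: curvature removed
print(np.diff(f_phi))
```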
Models with known variance parameters
Relax the assumption of constant variance. Use weighted regression with either known weights (i.e. `1/sigma_i^2`) or weights as a function of other parameters, most typically a power of the mean model. If we use functional weights, we don't get nice asymptotic theory.
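The known-weights case can be sketched as follows, using a linear mean model of my own choosing and solving the weighted normal equations directly:

```python
# Weighted least squares with known per-observation variances sigma_i^2,
# weights w_i = 1 / sigma_i^2, for the hypothetical mean model 1 + 2x.
import numpy as np

rng = np.random.default_rng(1)
x = np.linspace(1, 10, 40)
sigma = 0.1 * x                          # known, nonconstant error sd
y = 1.0 + 2.0 * x + rng.normal(0, sigma)

X = np.column_stack([np.ones_like(x), x])
W = np.diag(1.0 / sigma**2)              # known weights
# solve (X' W X) beta = X' W y
beta_hat = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)
print(beta_hat)
```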
Models with unknown variance parameters
`Y = g_1(x, beta) + sigma g_2(x, beta, z, theta) epsi`. Common models:
- `g_2 = mu(beta)^theta` (power of the mean)
- `g_2 = exp(theta mu(beta))`
- `g_2 = theta_0 + theta_1 x + theta_2 x^2`
- `g_2 = theta_0 + theta_1 z + theta_2 x^2`
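For the power-of-the-mean model, a crude way to see the variance parameter is to regress log absolute residuals on log fitted means; this is my own illustrative construction (simulated data, linear mean model), not a full pseudo-likelihood fit:

```python
# Rough estimate of theta in g_2 = mu^theta: after an unweighted fit of
# the mean model, the slope of log|residual| on log(fitted) tracks theta.
import numpy as np

rng = np.random.default_rng(2)
x = np.linspace(1, 10, 200)
mu = 1.0 + 2.0 * x
theta_true = 0.75
y = mu + 0.1 * mu**theta_true * rng.normal(size=x.size)

# stage 1: unweighted fit of the mean model
X = np.column_stack([np.ones_like(x), x])
beta_hat = np.linalg.lstsq(X, y, rcond=None)[0]
fitted = X @ beta_hat
resid = y - fitted

# stage 2: log|resid| ~ const + theta * log(fitted)
Z = np.column_stack([np.ones_like(x), np.log(fitted)])
theta_hat = np.linalg.lstsq(Z, np.log(np.abs(resid)), rcond=None)[0][1]
print(theta_hat)  # noisy, but in the neighbourhood of theta_true
```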
Transforming both sides
An attempt to allow separate adjustment for nonconstant variance, asymmetric error distributions, and incorrect specification of the expectation function. The idea is to transform both sides, `h(Y) = h(f(x, beta)) + epsi`, to help with the first two problems without affecting the third.
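A sketch with `h = log` and a Michaelis-Menten expectation function (both my own choices for illustration): multiplicative error becomes additive and roughly constant-variance after the transform, while the mean model is untouched.

```python
# Transform both sides: fit log(Y) = log(f(x, beta)) + epsi for a
# hypothetical Michaelis-Menten curve with multiplicative lognormal error.
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(3)

def f(x, vmax, km):
    return vmax * x / (km + x)

x = np.linspace(0.5, 10, 60)
y = f(x, 5.0, 2.0) * np.exp(rng.normal(0, 0.1, x.size))  # multiplicative error

def log_f(x, vmax, km):
    # transform the model side with the same h applied to the data side
    return np.log(f(x, vmax, km))

beta_hat, _ = curve_fit(log_f, x, np.log(y), p0=[4.0, 1.5],
                        bounds=(0, np.inf))  # keep parameters positive
print(beta_hat)  # near the true values [5.0, 2.0]
```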