Additive error models

Exemplifies the idea of signal + noise: `Y = f(x, beta) + epsi`, where `epsi` is distributed with some absolutely continuous density, typically with `E(epsi) = 0`. The modelling tasks with these models focus on:
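A minimal sketch of the signal-plus-noise form, assuming an illustrative exponential mean function and Gaussian noise (both choices are hypothetical, not from the notes):

```python
import numpy as np

# Additive error model Y = f(x, beta) + epsi.
# f and the noise scale below are illustrative assumptions.
rng = np.random.default_rng(0)

def f(x, beta):
    return beta[0] * np.exp(beta[1] * x)

beta = (2.0, 0.5)
x = np.linspace(0.0, 2.0, 500)
epsi = rng.normal(loc=0.0, scale=0.3, size=x.size)  # E(epsi) = 0
y = f(x, beta) + epsi

# With E(epsi) = 0, the residuals from the true signal average out near zero.
print(abs(np.mean(y - f(x, beta))) < 0.1)
```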

Constant variance models

The “backbone” of statistical modelling. If the constant-variance assumption is violated, a transformation of the response can be used (but this often introduces other problems).
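A sketch of the transformation idea, assuming multiplicative (log-normal) noise so that a log transform stabilizes the variance; the data-generating choices are illustrative:

```python
import numpy as np

# When the noise scale grows with the mean, a log transform can restore
# roughly constant variance (at the cost of modelling log Y instead of Y).
rng = np.random.default_rng(1)
x = np.repeat(np.arange(1, 6), 2000)
mean = np.exp(0.5 * x)
y = mean * rng.lognormal(mean=0.0, sigma=0.2, size=x.size)  # multiplicative noise

# Per-group standard deviations: strongly increasing on the raw scale,
# roughly constant after the log transform.
raw_sd = [y[x == k].std() for k in range(1, 6)]
log_sd = [np.log(y[x == k]).std() for k in range(1, 6)]
print(raw_sd[-1] / raw_sd[0] > 5)       # spread grows with the mean
print(max(log_sd) / min(log_sd) < 1.2)  # stabilized after transform
```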

Linear and non-linear models

Define a model as non-linear if at least one of the derivatives with respect to `beta` depends on one or more of the parameters. Two notions of non-linearity:
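The definition can be checked numerically: evaluate a parameter derivative at two different values of `beta` and see whether it changes. The two model forms below are illustrative examples, not taken from the notes:

```python
import numpy as np

# A model is non-linear in beta if some derivative df/dbeta_j itself
# depends on the parameters.
def deriv(f, beta, j, x, h=1e-6):
    """Central finite-difference derivative of f w.r.t. beta[j] at x."""
    b_hi, b_lo = list(beta), list(beta)
    b_hi[j] += h
    b_lo[j] -= h
    return (f(x, b_hi) - f(x, b_lo)) / (2 * h)

x = 1.5
linear = lambda x, b: b[0] + b[1] * x           # df/db1 = x, free of beta
nonlin = lambda x, b: b[0] * np.exp(b[1] * x)   # df/db1 = b0*x*exp(b1*x)

# Evaluate df/db1 at two different parameter values:
d_lin = deriv(linear, [1.0, 2.0], 1, x), deriv(linear, [5.0, -3.0], 1, x)
d_non = deriv(nonlin, [1.0, 2.0], 1, x), deriv(nonlin, [5.0, -3.0], 1, x)

print(np.isclose(*d_lin))   # True: derivative unchanged -> linear in beta
print(np.isclose(*d_non))   # False: derivative depends on beta -> non-linear
```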

Models with known variance parameters

Relax the assumption of constant variance. Use weighted regression, either with known weights (e.g. `1/sigma`) or with weights as a function of other parameters, most typically a power of the mean model. If we use functional weights, we don’t get the nice asymptotic theory.
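A weighted-regression sketch for the known-weights case. The variance structure and true parameter values are illustrative; weighting each residual by `1/sigma_i` corresponds to weights `1/sigma_i^2` on the squared residuals:

```python
import numpy as np

# Weighted least squares with a known, non-constant noise scale sigma_i.
rng = np.random.default_rng(2)
x = np.linspace(1.0, 10.0, 200)
sigma = 0.5 * x                      # known noise scale (illustrative)
y = 1.0 + 2.0 * x + rng.normal(scale=sigma)

# Solve the weighted normal equations: minimize sum_i w_i * (y_i - X b)^2.
X = np.column_stack([np.ones_like(x), x])
w = 1.0 / sigma**2                   # weights proportional to 1/variance
W = np.diag(w)
beta_hat = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)
print(beta_hat)                      # close to the true (1.0, 2.0)
```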

Models with unknown variance parameters

`Y = g_1(x, beta) + sigma g_2(x, beta, z, theta) epsi`
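A sketch of this form with the power-of-the-mean choice `g_2 = g_1^theta` (a common special case; all numeric values below are illustrative assumptions):

```python
import numpy as np

# Y = g1(x, beta) + sigma * g2(x, beta, z, theta) * epsi,
# here with g2 = g1**theta (power of the mean; values illustrative).
rng = np.random.default_rng(3)

def g1(x, beta):
    return beta[0] * np.exp(beta[1] * x)

beta, sigma, theta = (2.0, 0.5), 0.1, 1.0
x = np.repeat([0.0, 4.0], 5000)
epsi = rng.normal(size=x.size)
y = g1(x, beta) + sigma * g1(x, beta) ** theta * epsi

# The residual spread scales with the mean raised to theta:
sd_lo = y[x == 0.0].std()   # mean g1 = 2,       sd ~ 0.1 * 2
sd_hi = y[x == 4.0].std()   # mean g1 = 2*e^2,   sd ~ 0.1 * 2*e^2
print(sd_hi / sd_lo > 5)
```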

Common models:

Transforming both sides

An attempt to allow separate adjustment for non-constant variance, non-symmetric error distributions, and incorrect specification of the expectation function. The idea is to transform both sides to address the first two problems without affecting the third.
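A sketch of the idea with a log transform, assuming multiplicative skewed noise (an illustrative case): transforming the response alone would change the mean function, but transforming both sides keeps the original parameters while fixing the error structure.

```python
import numpy as np

# Transform both sides: if Y = f(x, beta) * exp(epsi) (multiplicative,
# right-skewed noise), taking logs of BOTH the response and the model gives
# log Y = log f(x, beta) + epsi -- additive, symmetric errors with constant
# spread, while f keeps its original parameterization. Choices illustrative.
rng = np.random.default_rng(4)

def f(x, beta):
    return beta[0] * np.exp(beta[1] * x)

beta = (2.0, 0.5)
x = np.linspace(0.0, 3.0, 20000)
y = f(x, beta) * np.exp(rng.normal(scale=0.3, size=x.size))

resid = np.log(y) - np.log(f(x, beta))   # errors on the transformed scale
print(abs(resid.mean()) < 0.01)          # centred at zero
print(abs(resid.std() - 0.3) < 0.01)     # constant spread recovered
```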