Probability spaces
Kolmogorov’s probability model
- finite sample spaces
- countably infinite sample spaces
- uncountable sample spaces: random variables, random vectors, random trajectories
Random variables and random vectors
Change of variable formula: Let `X` be an rv on `(Omega, F, P)`, and `h: RR -> RR` be measurable. Let `Y = h(X)`. Then:
- `int_Omega |Y| dP = int_RR |h(x)| P_X(dx) = int_RR |y| P_Y(dy)`
- if `int_Omega |Y| dP < oo` then `int_Omega Y dP = int_RR h(x) P_X(dx) = int_RR y P_Y(dy)`
- if `h >= 0`, the relationships still hold even if `EY = oo`
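As an illustrative sanity check (a Monte Carlo/quadrature sketch, not part of the theorem), the first two integrals can be compared numerically for the assumed example `h(x) = x^2` with `X ~ N(0, 1)`, where `EY = E X^2 = 1`:

```python
import numpy as np

rng = np.random.default_rng(0)

# X ~ N(0,1), h(x) = x^2, so Y = h(X) has EY = E X^2 = 1.
x = rng.standard_normal(200_000)
y = x ** 2

# int_Omega Y dP: average of Y over sampled outcomes omega
lhs = y.mean()

# int_RR h(x) P_X(dx): integrate h against the N(0,1) density numerically
grid = np.linspace(-8.0, 8.0, 20_001)
dx = grid[1] - grid[0]
density = np.exp(-grid ** 2 / 2) / np.sqrt(2 * np.pi)
rhs = (grid ** 2 * density).sum() * dx

assert abs(lhs - 1.0) < 0.02   # Monte Carlo error
assert abs(rhs - 1.0) < 1e-4   # quadrature error
```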
Moments
For any positive integer `n`, the nth moment `mu_n` of an rv `X` is defined by `mu_n = EX^n`. The moment generating function is defined as `M_X(t) = E(e^(tX)) AA t in RR`. Since `e^(tX) > 0`, `E(e^(tX))` is well defined (possibly `+oo`). If `X` is a non-negative rv and `t >= 0`, then by monotone convergence `M_X(t) = sum_(n=0)^oo (t^n mu_n)/(n!)` (both sides possibly infinite).
If `X` is an rv with `M_X(t) < oo AA |t| < epsi` for some `epsi > 0`, then:
- `E|X|^n < oo AA n >= 1`
- `M_X(t) = sum (t^n mu_n)/(n!) AA |t| < epsi`
- `M_X(t)` is infinitely differentiable on `(-epsi, +epsi)` and the `r`th derivative is `E(e^(tX) X^r)`.
If the mgf is finite `AA |t| < epsi`, `epsi > 0`, then `M_X(t)` has a power series expansion in `t` around 0, and `mu_n/(n!)` is simply the coefficient of `t^n`. Note, the MGF uniquely determines the moments, but the moments do not uniquely determine the distribution without extra conditions, e.g. Carleman's condition `sum mu_(2n)^(-1/(2n)) = oo`, or the distribution having bounded support.
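A small numeric sketch of the series expansion, using the assumed example `X ~ "Exponential"(1)` (for which `mu_n = n!` and `M_X(t) = 1/(1-t)` for `t < 1`, so every coefficient `mu_n/(n!)` equals 1):

```python
import math
import random

random.seed(1)

# X ~ Exponential(1): mu_n = E X^n = n!, and M_X(t) = 1/(1-t) for t < 1.
samples = [random.expovariate(1.0) for _ in range(400_000)]

# Monte Carlo moments should match mu_n = n!
for n in range(1, 4):
    mu_n = sum(x ** n for x in samples) / len(samples)
    assert abs(mu_n - math.factorial(n)) / math.factorial(n) < 0.05

# M_X(t) = sum t^n mu_n / n! = sum t^n = 1/(1-t) for |t| < 1
t = 0.3
mgf_mc = sum(math.exp(t * x) for x in samples) / len(samples)
assert abs(mgf_mc - 1 / (1 - t)) < 0.02
```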
Product moments
Let `vecX = (X_1, X_2, ..., X_k)` be a random vector. The product moment of order `r = (r_1, r_2, ..., r_k)` is defined by `mu_r = mu_((r_1, r_2, ..., r_k)) = E(X_1^(r_1) X_2^(r_2) ... X_k^(r_k))`. Similar properties to the above apply.
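A hedged numeric illustration, assuming independent standard normal coordinates (for independent `X_1, X_2` the product moment factorizes, so `mu_((2,2)) = E(X_1^2)E(X_2^2) = 1`):

```python
import random

random.seed(2)
N = 300_000

# Independent X_1, X_2 ~ N(0,1); for independent coordinates the product
# moment factorizes: mu_(2,2) = E(X_1^2 X_2^2) = E(X_1^2) E(X_2^2) = 1.
x1 = [random.gauss(0.0, 1.0) for _ in range(N)]
x2 = [random.gauss(0.0, 1.0) for _ in range(N)]

mu_22 = sum(a * a * b * b for a, b in zip(x1, x2)) / N
assert abs(mu_22 - 1.0) < 0.05
```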
Kolmogorov’s consistency theorem
A stochastic process with index set `A` is a family `{X_alpha : alpha in A}` of random variables defined on a ps. It may also be viewed as a random real-valued function on the set `A` via the identification `omega -> f(omega, *)`, where `f(omega, alpha) = X_alpha(omega)` for `alpha in A`.
The family `{mu_((alpha_1, alpha_2, ..., alpha_k))(B) = P((X_(alpha_1), X_(alpha_2), ..., X_(alpha_k)) in B) : alpha_i in A, B in B(RR^k)}` of probability distributions is called the family of finite dimensional distributions (fdds) associated with `{X_alpha: alpha in A}`. It satisfies the following consistency conditions:
- `mu_((alpha_1, alpha_2, ..., alpha_k))(B_1 xx B_2 xx ... xx B_(k-1) xx RR) = mu_((alpha_1, alpha_2, ..., alpha_(k-1)))(B_1 xx B_2 xx ... xx B_(k-1))`
- for any permutation `(i_1, i_2, ..., i_k)` of `(1, 2, ..., k)`, `mu_((alpha_(i_1), ..., alpha_(i_k)))(B_(i_1) xx ... xx B_(i_k)) = mu_((alpha_1, ..., alpha_k))(B_1 xx ... xx B_k)`
(If `A` is countable, and indexed with `NN`, these conditions are equivalent to: `mu_n` is a pm on `(RR^n, B(RR^n))`, and `mu_(n+1)(B xx RR) = mu_n(B), AA n in NN`)
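A minimal numeric sketch of the two consistency conditions for a discrete bivariate distribution (the pmf values below are made up for illustration):

```python
import numpy as np

# A made-up joint pmf mu_(1,2) of (X_1, X_2) on {0,1} x {0,1,2}
mu_12 = np.array([[0.1, 0.2, 0.1],
                  [0.3, 0.2, 0.1]])
assert abs(mu_12.sum() - 1.0) < 1e-12

# First condition: mu_(1)(B) = mu_(1,2)(B xx RR), i.e. sum out the last coordinate
mu_1 = mu_12.sum(axis=1)
assert np.allclose(mu_1, [0.4, 0.6])

# Second condition: permuting the indices permutes the arguments, so the
# pmf of (X_2, X_1) is the transpose, with the same marginals
mu_21 = mu_12.T
assert np.allclose(mu_21.sum(axis=1), mu_12.sum(axis=0))
```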
Given a family of probability distributions `Q_A`, does there exist a real-valued stochastic process `{X_alpha: alpha in A}` st its fdds coincide with `Q_A`? Kolmogorov’s consistency theorem: Let `A != O/` and `Q_A = {nu_((alpha_1, ..., alpha_k)) : alpha_i in A}` be st each `nu_((alpha_1, ..., alpha_k))` is a probability distribution on `(RR^k, B(RR^k))` and the consistency conditions hold. Then there exists a probability space and a stochastic process on it st `Q_A` is the family of fdds associated with the stochastic process.
In other words, if `Q_A` satisfies the appropriate conditions, there exists `f: A xx Omega -> RR` st `AA omega`, `f(*, omega)` is a function on `A`, and for each `(alpha_1, ..., alpha_k) in A^k` the vector `(f(alpha_1, omega), ..., f(alpha_k, omega))` is a random vector with probability distribution `nu_((alpha_1, ..., alpha_k))`.
Sketch proof
`A = {a}`
`Q_A = {nu_a}`, a single probability distribution. Take `(Omega = RR, F = B(RR), P = nu_a)`, `X(omega) = omega`. Then `X` is an rv on `(Omega, F, P)` with `P X^(-1) = nu_a`.
`A = {a_1, a_2, ..., a_k}, k < oo`
`Q_A = {nu_((a_(alpha_1), ..., a_(alpha_k))): alpha_1, ..., alpha_k in {1, 2, ..., k}}`. Take `(Omega = RR^k, F = B(RR^k), P = nu_((a_1, ..., a_k)))`, `X(omega) = omega`.
`A = NN`
`Omega = RR^NN = {omega: omega = (x_1, x_2, ...)}`. `F = sigma(: C :)`, where `C` is the semi-algebra of fd events: an event `A` is a fd event if `EE n_0 < oo` and `B in B(RR^(n_0))` st `A = {omega: omega = (x_1, x_2, ...), (x_1, ..., x_(n_0)) in B}`, aka a finite dimensional cylinder set. Use the given fd distributions to define `P` on `C`. Then apply the extension theorem, checking the conditions on `(C, P)`.
Take `(Omega = RR^NN, F = sigma(: C :), P)`, `X_n(omega) = x_n` if `omega = (x_1, ...)`. Then `{X_n; n = 1, 2, ...}` is a stochastic process on `(Omega, F, P)` and has fdds `Q_NN`.
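The `A = NN` construction can be sketched computationally under an assumed iid `N(0, 1)` product law; the helper names are hypothetical. The key point mirrored here is that a fd event only looks at finitely many coordinates, so only a finite prefix of `omega` ever needs to be generated:

```python
import random

random.seed(3)

# Sketch of the coordinate construction for A = NN under an iid N(0,1)
# product measure: omega = (x_1, x_2, ...), sampled only up to a prefix.
def sample_omega_prefix(n0):
    """First n0 coordinates of omega under the iid N(0,1) product law."""
    return [random.gauss(0.0, 1.0) for _ in range(n0)]

def X(n, omega):
    """Coordinate map X_n(omega) = x_n (1-indexed)."""
    return omega[n - 1]

# fd event A = {omega : x_2 <= 0} has base B = (-oo, 0] in coordinate 2,
# so its probability under this product law should be 1/2.
N = 200_000
hits = sum(1 for _ in range(N) if X(2, sample_omega_prefix(2)) <= 0)
assert abs(hits / N - 0.5) < 0.01
```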
`A != O/`
Let `Q_A` be a family of fdds satisfying the consistency conditions. We want to construct a ps `(Omega, F, P)` and a family of random variables `{X_alpha: alpha in A}` with `Q_A` as its fdds.
- A subset `D sub RR^A` is a finite dimensional cylinder set if there exists a finite subset `A_1 sub A`, `A_1 = {alpha_1, ..., alpha_k}`, and a Borel set `B in B(RR^k)` st `D = {f: f in RR^A and (f(alpha_1), ..., f(alpha_k)) in B}`. `B` is called a base for `D`. The collection `C` of all fd cylinder sets is an algebra, and if `A` is finite, then `C` is a `sigma`-algebra.
- `R^A = sigma(: C :)` is called the product `sigma`-algebra on `RR^A`.
- A projection map is `pi_((alpha_1, ..., alpha_k)): RR^A -> RR^k`, `pi_((alpha_1, ..., alpha_k))(f) = (f(alpha_1), ..., f(alpha_k))`; for `alpha in A`, `pi_alpha(f) = f(alpha)` is called a coordinate map. Projection and coordinate maps are measurable.
Let `Omega = RR^A` and `F = R^A`. Define a set function `P(D) = mu_((alpha_1, ..., alpha_k))(B)` for `D in C` with representation `D = {omega: omega in RR^A, (omega(alpha_1), ..., omega(alpha_k)) in B}`. Now show that `P(D)` is independent of the representation of `D` and is countably additive on `C`. Then, by the extension procedure, there exists a unique extension of `P` to `F` st `(Omega, F, P)` is a ps. Defining `X_alpha(omega) = pi_alpha(omega) = omega(alpha)` for `alpha in A` yields a stochastic process with fdds `Q_A`.
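A hedged finite illustration of why `P(D)` must not depend on the representation of `D` (assuming an iid `N(0, 1)` product law over two illustrative indices): the cylinder set `D = {omega: omega(alpha_1) <= 0}` can be written with base `(-oo, 0]` over `(alpha_1)` or with base `(-oo, 0] xx RR` over `(alpha_1, alpha_2)`, and both representations describe the same event:

```python
import numpy as np

rng = np.random.default_rng(4)

# Two illustrative coordinates under an iid N(0,1) product law; the columns
# stand in for omega(alpha_1) and omega(alpha_2).
w = rng.standard_normal((200_000, 2))

# Representation 1: base (-oo, 0] over the single index (alpha_1)
p_rep1 = np.mean(w[:, 0] <= 0)

# Representation 2: base (-oo, 0] xx RR over (alpha_1, alpha_2) -- the extra
# coordinate is unconstrained, so the event is identical
p_rep2 = np.mean((w[:, 0] <= 0) & (np.abs(w[:, 1]) < np.inf))

assert p_rep1 == p_rep2          # same event, exactly the same probability
assert abs(p_rep1 - 0.5) < 0.01  # and it matches the marginal value 1/2
```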