Multivariate distributions#
Joint distributions and the independence of general random variables are defined. Jointly continuous distributions are then introduced, and the results for multivariate discrete random variables are given continuous analogues.
Joint distributions#
We are often interested in the values taken by two random variables \(X\) and \(Y,\) defined on the same probability space \((\Omega, \mathcal{F}, \mathbb{P}).\) The joint distribution function describes the probability that \(X\) and \(Y\) simultaneously take values at most some given levels.
(Joint distribution function)
Given random variables \(X, Y\) on \((\Omega, \mathcal{F}, \mathbb{P})\), their joint distribution function is the mapping \(F_{X, Y} : \mathbb{R}^2 \to [0, 1]\) given by
\[
F_{X, Y}(x, y) = \mathbb{P}(X \leq x, Y \leq y), \qquad x, y \in \mathbb{R}.
\]
This definition can be extended to joint distributions of any number of variables, by adding further variables and the corresponding inequalities. The joint distribution function satisfies:

- \(F_{X, Y}(x, y) \to 0\) as \(x, y \to -\infty\), and \(F_{X, Y}(x, y) \to 1\) as \(x, y \to \infty\);
- \(F_{X, Y}\) is non-decreasing in each of its arguments.
The joint distribution function \(F_{X, Y}\) is related to the marginal distribution functions \(F_X\) and \(F_Y\) by:
\[
F_X(x) = \lim_{y \to \infty} F_{X, Y}(x, y), \qquad F_Y(y) = \lim_{x \to \infty} F_{X, Y}(x, y).
\]
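For a quick illustration (with an illustrative choice of joint distribution), suppose the pair \((X, Y)\) is uniformly distributed on the unit square \([0, 1]^2\). Then for \(x, y \in [0, 1]\),
\[
F_{X, Y}(x, y) = \mathbb{P}(X \leq x, Y \leq y) = xy,
\]
and letting \(y \to \infty\) recovers the marginal \(F_X(x) = x\) for \(x \in [0, 1]\).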
We are often interested in how two random variables are related. In the special case where they are unrelated, we call them independent.
(Independence of variables)
We say that two random variables \(X\) and \(Y\) are independent if for all \(x, y \in \mathbb{R}\), the events \(\{X \leq x\}\) and \(\{Y \leq y\}\) are independent.
Note that previously we had defined independence between events, as well as between discrete random variables. This definition extends independence to all random variables (discrete, continuous or other).
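In particular, since the events \(\{X \leq x\}\) and \(\{Y \leq y\}\) are independent precisely when \(\mathbb{P}(X \leq x, Y \leq y) = \mathbb{P}(X \leq x)\,\mathbb{P}(Y \leq y)\), the definition can be restated in terms of joint and marginal distribution functions:
\[
F_{X, Y}(x, y) = F_X(x) \, F_Y(y) \qquad \text{for all } x, y \in \mathbb{R}.
\]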
Independence and sums#
The manipulation of a joint distribution may simplify considerably if the variables are independent. As with discrete random variables, two jointly continuous random variables are independent if and only if their joint density function factorises.
(Independence \(\iff\) pdf factorises)
Two jointly continuous random variables \(X\) and \(Y\) are independent if and only if their joint density function may be expressed in the form
\[
f_{X, Y}(x, y) = f_X(x) \, f_Y(y), \qquad x, y \in \mathbb{R}.
\]
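For example, with the illustrative joint density \(f_{X, Y}(x, y) = \lambda \mu \, e^{-\lambda x - \mu y}\) for \(x, y > 0\) (and zero otherwise), the density factorises as \(\lambda e^{-\lambda x} \cdot \mu e^{-\mu y}\), so \(X\) and \(Y\) are independent exponential random variables with rates \(\lambda\) and \(\mu\).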
Again, much like with discrete random variables, the sum of two independent continuous random variables has pdf equal to the convolution of the pdfs of the summands.
(Convolution formula)
If the random variables \(X\) and \(Y\) are independent and continuous, with pdfs \(f_X\) and \(f_Y\), then the density function of their sum \(Z = X + Y\) is
\[
f_Z(z) = \int_{-\infty}^{\infty} f_X(x) \, f_Y(z - x) \, \mathrm{d}x, \qquad z \in \mathbb{R}.
\]
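As a sketch of the formula in use (with an illustrative choice of distributions), let \(X\) and \(Y\) be independent, each exponential with rate \(\lambda\), so \(f_X(x) = f_Y(x) = \lambda e^{-\lambda x}\) for \(x > 0\). For \(z > 0\) the integrand is non-zero only when \(0 < x < z\), so
\[
f_Z(z) = \int_0^z \lambda e^{-\lambda x} \, \lambda e^{-\lambda (z - x)} \, \mathrm{d}x = \lambda^2 z e^{-\lambda z},
\]
which is the Gamma\((2, \lambda)\) density.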
Changes of variables#
Given random variables \(X, Y\), we are often interested in the distribution of a transformation \(T(X, Y)\). If the random variables are jointly continuous and the mapping \(T\) is a bijection, then the pdf of \(T(X, Y)\) is given by the result below.
(Jacobian formula)
Let \(X\) and \(Y\) be jointly continuous with pdf \(f_{X, Y}\), and suppose that \(B(x, y) = (u(x, y), v(x, y))\) is a bijection from \(D = \{(x, y) : f_{X, Y}(x, y) > 0\}\) to \(S \subseteq \mathbb{R}^2\). Then the pair \((U, V) = (u(X, Y), v(X, Y))\) is jointly continuous with joint pdf
\[
f_{U, V}(u, v) =
\begin{cases}
f_{X, Y}\big(x(u, v), y(u, v)\big) \, \lvert J(u, v) \rvert & \text{if } (u, v) \in S, \\
0 & \text{otherwise,}
\end{cases}
\]
where \((x(u, v), y(u, v))\) denotes the inverse \(B^{-1}(u, v)\), and \(J(u, v)\) is the Jacobian of \(B^{-1}\),
\[
J(u, v) = \det
\begin{pmatrix}
\dfrac{\partial x}{\partial u} & \dfrac{\partial x}{\partial v} \\[1ex]
\dfrac{\partial y}{\partial u} & \dfrac{\partial y}{\partial v}
\end{pmatrix}
= \frac{\partial x}{\partial u} \frac{\partial y}{\partial v} - \frac{\partial x}{\partial v} \frac{\partial y}{\partial u}.
\]
This result extends the single-variable analogue for \(Y = g(X)\), where \(g\) is an invertible, differentiable mapping:
\[
f_Y(y) = f_X\big(g^{-1}(y)\big) \left\lvert \frac{\mathrm{d}}{\mathrm{d}y} g^{-1}(y) \right\rvert.
\]
It can be extended to more than two random variables by enlarging the Jacobian matrix with the partial derivatives of the additional variables.
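As a sketch of the formula in action (with an illustrative choice of transformation), take \(U = X + Y\) and \(V = X - Y\). The inverse mapping is \(x = (u + v)/2\), \(y = (u - v)/2\), so
\[
J(u, v) = \det
\begin{pmatrix}
1/2 & 1/2 \\
1/2 & -1/2
\end{pmatrix}
= -\tfrac{1}{2},
\]
and hence
\[
f_{U, V}(u, v) = \tfrac{1}{2} \, f_{X, Y}\!\left(\tfrac{u + v}{2}, \tfrac{u - v}{2}\right)
\]
wherever \(\big(\tfrac{u + v}{2}, \tfrac{u - v}{2}\big)\) lies in \(D\).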
Conditional density functions#
We are often interested in the distribution of one variable \(Y\), conditioned on the event \(\{X = x\}\) for another variable \(X\). Since \(\mathbb{P}(X = x) = 0\) when \(X\) is continuous, we cannot condition on this event in the elementary sense; instead, the conditional density is defined analogously to its discrete counterpart, with densities in place of probability mass functions.
(Conditional density function)
The conditional density function of \(Y\) given \(X = x\) is written \(f_{Y | X}(\cdot \mid x)\) and defined by
\[
f_{Y | X}(y \mid x) = \frac{f_{X, Y}(x, y)}{f_X(x)}
\]
for \(y \in \mathbb{R}\) and \(f_X(x) > 0\).
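To see the definition in use (with an illustrative joint density), suppose \(f_{X, Y}(x, y) = e^{-y}\) for \(0 < x < y\) and zero otherwise. Then \(f_X(x) = \int_x^\infty e^{-y} \, \mathrm{d}y = e^{-x}\) for \(x > 0\), and so
\[
f_{Y | X}(y \mid x) = \frac{e^{-y}}{e^{-x}} = e^{-(y - x)}, \qquad y > x;
\]
that is, conditional on \(X = x\), the variable \(Y - x\) is exponential with rate \(1\).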
Expectations#
The law of the subconscious statistician for discrete random variables has the following counterpart for continuous random variables.
(Law of the subconscious statistician)
For any jointly continuous random variables \(X, Y\) with joint pdf \(f_{X, Y}\) and well-behaved \(g\), we have
\[
\mathbb{E}[g(X, Y)] = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} g(x, y) \, f_{X, Y}(x, y) \, \mathrm{d}x \, \mathrm{d}y,
\]
whenever this integral converges absolutely.
The above result is useful because we need not worry about evaluating the distribution of \(Z = g(X, Y)\), and can instead evaluate the integral directly.
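For instance, with the illustrative density \(f_{X, Y}(x, y) = e^{-y}\) for \(0 < x < y\) used above and \(g(x, y) = x\),
\[
\mathbb{E}[X] = \int_0^\infty \int_0^y x \, e^{-y} \, \mathrm{d}x \, \mathrm{d}y = \int_0^\infty \tfrac{1}{2} y^2 e^{-y} \, \mathrm{d}y = 1,
\]
with no need to first compute the marginal density of \(X\).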
We also have the following result relating the independence of random variables to the factorisation of expectations of products of functions of the variables. This is again analogous to the corresponding result for discrete distributions.
(Independence \(\iff\) expectations of products of functions factorise)
Jointly continuous random variables \(X\) and \(Y\) are independent if and only if
\[
\mathbb{E}[g(X) h(Y)] = \mathbb{E}[g(X)] \, \mathbb{E}[h(Y)]
\]
for all functions \(g, h : \mathbb{R} \to \mathbb{R}\) for which these expectations exist.
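Taking \(g\) and \(h\) to be the identity (where the expectations exist) gives the familiar special case
\[
\mathbb{E}[XY] = \mathbb{E}[X] \, \mathbb{E}[Y],
\]
so independent random variables are uncorrelated; the converse is false in general.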
The conditional expectation of continuous random variables is defined analogously to that for discrete distributions, as shown below.
(Continuous conditional expectation)
If \(X, Y\) are jointly continuous random variables with joint density function \(f_{X, Y}\), the conditional expectation of \(Y\) given \(X = x\) is defined as
\[
\mathbb{E}[Y \mid X = x] = \int_{-\infty}^{\infty} y \, f_{Y | X}(y \mid x) \, \mathrm{d}y,
\]
valid for any \(x\) for which \(f_X(x) > 0\).
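Continuing the illustrative density \(f_{X, Y}(x, y) = e^{-y}\) for \(0 < x < y\), for which \(f_{Y | X}(y \mid x) = e^{-(y - x)}\) for \(y > x\), the definition gives
\[
\mathbb{E}[Y \mid X = x] = \int_x^\infty y \, e^{-(y - x)} \, \mathrm{d}y = x + 1.
\]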
We also have the following result about conditional expectations, an analogue of the corresponding result for discrete random variables.
(Law of iterated expectations)
If \(X, Y\) are jointly continuous random variables, then
\[
\mathbb{E}[Y] = \int \mathbb{E}[Y \mid X = x] \, f_X(x) \, \mathrm{d}x,
\]
where the integral is over all \(x\) for which \(f_X(x) > 0\).
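With the same illustrative density as above, \(\mathbb{E}[Y \mid X = x] = x + 1\) and \(f_X(x) = e^{-x}\), so the law of iterated expectations gives
\[
\mathbb{E}[Y] = \int_0^\infty (x + 1) e^{-x} \, \mathrm{d}x = 2,
\]
in agreement with the direct calculation using the marginal density \(f_Y(y) = y e^{-y}\) for \(y > 0\).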