Distribution and density functions#
Random variables are functions from \(\Omega\) to \(\mathbb{R}\) that, unlike discrete random variables, need not take values in a countable set. Continuous random variables are the special case whose distribution function can be written as the integral of another function, the probability density function. Some standard continuous random variables are presented, along with some basic results.
Distribution functions#
Extending the definition of a discrete random variable allows us to deal with probability spaces with uncountably many outcomes.
(Random variable)
A random variable \(X\) is a function \(X : \Omega \to \mathbb{R}\) such that for every \(x \in \mathbb{R}\) we have \(\{\omega \in \Omega : X(\omega) \leq x\} \in \mathcal{F}\).
Whereas discrete random variables are described by a pmf, general random variables are described by distribution functions.
(Distribution function)
If \(X\) is a random variable on \((\Omega, \mathcal{F}, \mathbb{P})\), then the distribution function of \(X\) is the function \(F_X : \mathbb{R} \to [0, 1]\) defined as

\[
F_X(x) = \mathbb{P}(X \leq x).
\]
The following useful properties are true of distribution functions:
If \(x \leq y\), then \(F_X(x) \leq F_X(y)\), since \(\{X \leq x\} \subseteq \{X \leq y\}.\)
\(F_X(-\infty) = 0\) and \(F_X(\infty) = 1\), by the continuity of \(\mathbb{P}\) and considering \(\mathbb{P}(\emptyset) = 0, \mathbb{P}(\Omega) = 1.\)
\(F_X\) is continuous from the right, by the continuity of \(\mathbb{P}.\)
\(\mathbb{P}(a < X \leq b) = \mathbb{P}(X \leq b) - \mathbb{P}(X \leq a) = F_X(b) - F_X(a)\), since \(\{X \leq b\}\) is the disjoint union of \(\{X \leq a\}\) and \(\{a < X \leq b\}.\)
Continuous random variables#
Just as discrete random variables are a special case of random variables, continuous random variables are another special case of interest.
(Continuous random variable)
A random variable \(X\) is continuous if and only if its distribution function \(F_X\) can be written in the form

\[
F_X(x) = \int_{-\infty}^{x} f_X(u) \, du
\]
for some non-negative function \(f_X\), called the probability density function (pdf) of \(X\).
Pdfs are a rough counterpart [1] of pmfs for continuous random variables. Given a continuous random variable \(X\) and its distribution function \(F_X\), we can write its pdf as

\[
f_X(x) = \frac{d F_X}{d x}(x),
\]

at every point \(x\) where \(F_X\) is differentiable. Further, the pdf \(f_X\) satisfies the following properties.
(Probability density function properties)
If \(X\) is a continuous random variable with density function \(f_X\), then

\[
\int_{-\infty}^{\infty} f_X(x) \, dx = 1, \qquad \mathbb{P}(X = x) = 0 \text{ for every } x \in \mathbb{R}, \qquad \mathbb{P}(a \leq X \leq b) = \int_a^b f_X(x) \, dx.
\]
Uniform#
\(X\) is uniformly distributed in the interval \([a, b]\) if its density function is

\[
f_X(x) = \begin{cases} \dfrac{1}{b - a} & \text{if } a \leq x \leq b, \\ 0 & \text{otherwise.} \end{cases}
\]
If \(X\) is uniformly distributed on \([0, 1]\) and \(F\) is a distribution function, then the random variable \(Y(\omega) = F^{-1}(X(\omega))\), where

\[
F^{-1}(x) = \inf \{ y \in \mathbb{R} : F(y) \geq x \},
\]

has distribution function \(F\). This holds because

\[
\mathbb{P}(Y \leq y) = \mathbb{P}(F^{-1}(X) \leq y) = \mathbb{P}(X \leq F(y)) = F(y),
\]

where the middle equality uses that \(F^{-1}(x) \leq y\) if and only if \(x \leq F(y)\), which holds because \(F\) is non-decreasing and continuous from the right, and the last equality uses that \(X\) is uniform on \([0, 1]\). This property can be used to generate random samples from any distribution, given its distribution function \(F\).
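As a minimal sketch of this sampling recipe (assuming NumPy is available; the helper name and the choice of the exponential distribution are illustrative, not part of the text above):

```python
import numpy as np

def inverse_transform_samples(inv_cdf, n, rng=None):
    """Draw n samples from the distribution whose (generalised) inverse
    distribution function is inv_cdf, by transforming Uniform[0, 1] draws."""
    rng = np.random.default_rng() if rng is None else rng
    x = rng.uniform(0.0, 1.0, size=n)  # X ~ Uniform[0, 1]
    return inv_cdf(x)                  # Y = F^{-1}(X) has distribution function F

# Example: exponential with parameter lam, F(y) = 1 - exp(-lam * y) for y >= 0,
# so F^{-1}(x) = -log(1 - x) / lam.
lam = 2.0
samples = inverse_transform_samples(lambda x: -np.log(1.0 - x) / lam, n=100_000)
print(samples.mean())  # should be close to 1 / lam = 0.5
```

The same helper applies to any distribution whose inverse distribution function can be evaluated, either in closed form or numerically.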
Exponential#
\(X\) is exponentially distributed with parameter \(\lambda > 0\) if its density function is

\[
f_X(x) = \begin{cases} \lambda e^{-\lambda x} & \text{if } x \geq 0, \\ 0 & \text{otherwise.} \end{cases}
\]
Normal#
\(X\) is normally distributed with parameters \(\mu\) and \(\sigma^2\) if its density function is

\[
f_X(x) = \frac{1}{\sqrt{2 \pi \sigma^2}} \exp\left( -\frac{(x - \mu)^2}{2 \sigma^2} \right).
\]
Cauchy#
\(X\) is Cauchy distributed if its density function is

\[
f_X(x) = \frac{1}{\pi (1 + x^2)}.
\]
Gamma#
\(X\) is gamma distributed with parameters \(w > 0\) and \(\lambda > 0\) if its density function is

\[
f_X(x) = \begin{cases} \dfrac{\lambda^w}{\Gamma(w)} x^{w - 1} e^{-\lambda x} & \text{if } x \geq 0, \\ 0 & \text{otherwise,} \end{cases}
\]

where \(\Gamma(w)\) is the gamma function defined as

\[
\Gamma(w) = \int_0^{\infty} u^{w - 1} e^{-u} \, du.
\]
Note that the exponential distribution is a special case of the gamma distribution, obtained when \(w = 1\). More generally, if \(w\) is a positive integer and \(X_1, X_2, ..., X_w\) are independent and exponentially distributed with parameter \(\lambda\), their sum \(X = X_1 + X_2 + ... + X_w\) is gamma distributed with parameters \(w\) and \(\lambda.\) This can be proved most easily using moment generating functions, but can also be shown using continuous convolutions. Alternatively, \(f_X\) can be obtained up to a normalisation constant by considering that it is equal to the product of \(f_{X_1}, f_{X_2}, ..., f_{X_w}\) and a volume term which scales as \(x^{w-1}.\)
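As a sketch of the moment generating function argument (using the independence of the \(X_i\) and the fact that the moment generating function determines the distribution):

\[
M_{X_i}(t) = \mathbb{E}\left[ e^{t X_i} \right] = \int_0^{\infty} e^{t x} \lambda e^{-\lambda x} \, dx = \frac{\lambda}{\lambda - t}, \quad t < \lambda,
\]

so that

\[
M_X(t) = \prod_{i=1}^{w} M_{X_i}(t) = \left( \frac{\lambda}{\lambda - t} \right)^{w},
\]

which is the moment generating function of the gamma distribution with parameters \(w\) and \(\lambda.\)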
Beta#
\(X\) is beta distributed with parameters \(a, b > 0\) if it has density function

\[
f_X(x) = \begin{cases} \dfrac{1}{B(a, b)} x^{a - 1} (1 - x)^{b - 1} & \text{if } 0 \leq x \leq 1, \\ 0 & \text{otherwise,} \end{cases}
\]

where \(B(s, t)\) is the beta function defined as

\[
B(s, t) = \int_0^1 u^{s - 1} (1 - u)^{t - 1} \, du = \frac{\Gamma(s) \Gamma(t)}{\Gamma(s + t)}.
\]
The last equality above can be shown by computing the convolution of two gamma distributions with parameters \((s, \lambda)\) and \((t, \lambda)\) to obtain a third gamma distribution with parameters \((s + t, \lambda)\), and then comparing the normalisation constant of the convolution expression with that of the standard form of the gamma distribution.
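As a sketch of this computation, if \(X\) and \(Y\) are independent and gamma distributed with parameters \((s, \lambda)\) and \((t, \lambda)\), then for \(z > 0\)

\[
f_{X + Y}(z) = \int_0^z f_X(u) f_Y(z - u) \, du = \frac{\lambda^{s + t} e^{-\lambda z}}{\Gamma(s) \Gamma(t)} \int_0^z u^{s - 1} (z - u)^{t - 1} \, du = \frac{\lambda^{s + t} z^{s + t - 1} e^{-\lambda z}}{\Gamma(s) \Gamma(t)} B(s, t),
\]

using the substitution \(u = z v\) in the last step. Since this density must integrate to one, and \(\int_0^{\infty} \lambda^{s + t} z^{s + t - 1} e^{-\lambda z} \, dz = \Gamma(s + t)\), it follows that \(B(s, t) \Gamma(s + t) / (\Gamma(s) \Gamma(t)) = 1\), which is the stated equality.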
Chi-squared#
\(X\) is chi-squared distributed with \(n\) degrees of freedom if it has density function

\[
f_X(x) = \begin{cases} \dfrac{(1/2)^{n/2}}{\Gamma(n/2)} x^{n/2 - 1} e^{-x/2} & \text{if } x \geq 0, \\ 0 & \text{otherwise.} \end{cases}
\]
The chi-squared distribution is a special case of the gamma distribution, with \(w = n / 2\) and \(\lambda = 1/2\), which is of interest in its own right. For example, if \(Z\) is normally distributed with parameters \((\mu, \sigma^2)\), then \(X = (Z - \mu)^2 / \sigma^2\) is chi-squared distributed with \(n = 1\) degree of freedom.
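To check the last claim, note that \((Z - \mu)/\sigma\) is normally distributed with parameters \((0, 1)\), so it suffices to treat \(X = W^2\) with \(W\) standard normal. For \(x > 0\),

\[
F_X(x) = \mathbb{P}(W^2 \leq x) = \mathbb{P}(-\sqrt{x} \leq W \leq \sqrt{x}) = 2 \Phi(\sqrt{x}) - 1,
\]

where \(\Phi\) is the standard normal distribution function, and differentiating gives

\[
f_X(x) = \frac{1}{\sqrt{x}} \phi(\sqrt{x}) = \frac{1}{\sqrt{2 \pi}} x^{-1/2} e^{-x/2},
\]

which matches the chi-squared density with \(n = 1\), using \(\Gamma(1/2) = \sqrt{\pi}.\)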
Functions of random variables#
We are often interested in functions of random variables, and the distributions of the values these take. Strictly speaking, if \(X\) is a random variable on \((\Omega, \mathcal{F}, \mathbb{P})\) and \(g : \mathbb{R} \to \mathbb{R}\) is an arbitrary function, then \(Y(\omega) = g[X(\omega)]\) is not a random variable in general, because \(\{\omega \in \Omega : Y(\omega) \leq y\}\) may not be in \(\mathcal{F}.\) Modulo this detail, the following result gives us a rule for determining \(f_Y.\)
(Pdf of a function of a random variable)
Let \(X\) be a continuous random variable on \((\Omega, \mathcal{F}, \mathbb{P})\) and \(g : \mathbb{R} \to \mathbb{R}\) be a strictly increasing differentiable function. Then \(Y(\omega) = g[X(\omega)]\) has pdf

\[
f_Y(y) = f_X\left( g^{-1}(y) \right) \frac{d}{dy} g^{-1}(y).
\]
A similar relation holds for differentiable and strictly decreasing \(g\), with a minus sign added in front of the right-hand side of the above result. Cases where \(g\) is neither strictly increasing nor strictly decreasing are best treated on their own.
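As a quick worked example of the increasing case, take \(g(x) = a x + b\) with \(a > 0\), so that \(g^{-1}(y) = (y - b)/a\) and

\[
f_Y(y) = f_X\left( \frac{y - b}{a} \right) \frac{1}{a}.
\]

In particular, if \(X\) is normally distributed with parameters \((0, 1)\), then \(Y = a X + b\) is normally distributed with parameters \((b, a^2).\)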
Expectations#
The expectation of a continuous random variable is the natural analogue of the expectation of a discrete random variable.
(Expectation of continuous random variable)
If \(X\) is a continuous random variable with pdf \(f_X\), the expectation of \(X\) is defined by

\[
\mathbb{E}[X] = \int_{-\infty}^{\infty} x f_X(x) \, dx,
\]
whenever the integral converges absolutely.
Finally, as with discrete random variables, an analogous law of the unconscious statistician states that we can compute expectations of functions of random variables in the straightforward way.
If \(X\) is a continuous random variable with density function \(f_X\), and \(g: \mathbb{R} \to \mathbb{R}\), then

\[
\mathbb{E}[g(X)] = \int_{-\infty}^{\infty} g(x) f_X(x) \, dx,
\]
whenever the integral converges absolutely.
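For instance, using the exponential density from above, this rule gives the second moment directly:

\[
\mathbb{E}[X^2] = \int_0^{\infty} x^2 \lambda e^{-\lambda x} \, dx = \frac{2}{\lambda^2},
\]

without first having to find the density of \(X^2.\) Combined with \(\mathbb{E}[X] = 1/\lambda\), this gives \(\mathbb{E}[X^2] - \mathbb{E}[X]^2 = 1/\lambda^2\) for the variance.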