Discrete Distributions

Random Variables and Distributions

Recall that, for any random "experiment", the set of all possible outcomes is denoted by $S$.

A random variable is a function $X: S \to \mathbb{R}$, i.e. it is a rule that associates a (real) number to every outcome of the experiment.

$S$ is the domain of the random variable $X$; $X(S) \subseteq \mathbb{R}$ is its range.

A probability distribution function (p.d.f.) is a function $f : \mathbb{R} \to \mathbb{R}$ which specifies the probabilities of the values in $X(S)$.

When $S$ is discrete, we say that $X$ is a discrete r.v. and the p.d.f. is called a probability mass function (p.m.f.).

The p.m.f. of $X$ is $f(x) = P(\{s \in S : X(s) = x\}) := P(X = x)$.

The cumulative distribution function (c.d.f.) of $X$ is $F(x) = P(X \le x)$.
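As a minimal illustrative sketch (not from the notes), the p.m.f. and c.d.f. of a fair six-sided die, with $X(s) = s$ for each outcome $s \in S = \{1, \dots, 6\}$:

```python
def pmf(x):
    """P(X = x) for a fair six-sided die: 1/6 on each face, 0 elsewhere."""
    return 1 / 6 if x in {1, 2, 3, 4, 5, 6} else 0.0

def cdf(x):
    """F(x) = P(X <= x), obtained by summing the p.m.f. over values up to x."""
    return sum(pmf(k) for k in range(1, 7) if k <= x)

# The p.m.f. sums to 1 over the range of X, and F(3) = P(X <= 3) = 1/2.
total = sum(pmf(k) for k in range(1, 7))
```

Note how the c.d.f. of a discrete r.v. is just a running sum of the p.m.f.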

Expectation of a Discrete Random Variable

The expectation of a discrete random variable $X$ is defined as:

$E[X] = \sum_x x \cdot P(X=x) = \sum_x x f(x)$.

This can be thought of as the sum of each value $x$ multiplied by its probability.
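Computing this sum directly for the fair-die example (a hedged sketch; the names are illustrative, not from the notes):

```python
# p.m.f. of a fair six-sided die as a dict {value: probability}.
pmf = {x: 1 / 6 for x in range(1, 7)}

# E[X] = sum over x of x * f(x); for the die this is (1+2+...+6)/6 = 3.5.
expectation = sum(x * p for x, p in pmf.items())
```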

Mean and Variance

The expectation can be interpreted as the average or the mean of $X$, denoted $\mu$.

$E[(X-E[X])^2]$ can be thought of as the expected squared distance from the mean; it is called the variance, denoted $\mathrm{Var}[X]$.

$\mathrm{Var}[X] = E[(X-\mu_X)^2] = E[X^2]-\mu^2_X = E[X^2] - (E[X])^2 = \sigma^2_X$

Standard Deviation

$\mathrm{SD}[X] = \sqrt{\mathrm{Var}[X]} = \sigma_X$
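The shortcut formula $\mathrm{Var}[X] = E[X^2] - (E[X])^2$ can be checked numerically, again on the fair die (an illustrative sketch; for a fair die the variance is $35/12$):

```python
import math

# p.m.f. of a fair six-sided die.
pmf = {x: 1 / 6 for x in range(1, 7)}

mean = sum(x * p for x, p in pmf.items())            # E[X]
second_moment = sum(x**2 * p for x, p in pmf.items())  # E[X^2]

variance = second_moment - mean**2   # Var[X] = E[X^2] - (E[X])^2
sd = math.sqrt(variance)             # SD[X] = sqrt(Var[X])
```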

The mean gives some idea as to where the bulk of the distribution is, whereas the variance and standard deviation provide information about the spread; distributions with higher variance/SD are more spread about the average.

Binomial Distribution

Binomial coefficient: $\binom{n}{k} = \frac{n!}{(n-k)!\,k!}$.
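A quick sketch of this formula in code; Python's standard library also provides `math.comb`, which computes the same quantity exactly:

```python
import math

def binom(n, k):
    """Binomial coefficient n! / ((n-k)! k!), computed from factorials."""
    return math.factorial(n) // (math.factorial(n - k) * math.factorial(k))

print(binom(5, 2))      # 10
print(math.comb(5, 2))  # 10, the built-in equivalent
```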

A Bernoulli trial is a random experiment with two possible outcomes, "success" and "failure". Let $p$ denote the probability of success.

A binomial experiment consists of $n$ repeated independent Bernoulli trials, each with the same probability of success, $p$.

If $X$ counts the number of successes in the $n$ trials, the binomial distribution is:

$P(X=x) = \binom{n}{x}p^x(1-p)^{n-x}$

Expectation: $E[X] = np$; Variance: $\mathrm{Var}[X] = np(1-p)$
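A hedged sketch of the binomial p.m.f., with a numerical check that the mean and variance of the resulting distribution agree with $np$ and $np(1-p)$ (the parameter values are illustrative):

```python
import math

def binom_pmf(x, n, p):
    """P(X = x) = C(n, x) p^x (1-p)^(n-x) for x successes in n trials."""
    return math.comb(n, x) * p**x * (1 - p) ** (n - x)

n, p = 10, 0.3
mean = sum(x * binom_pmf(x, n, p) for x in range(n + 1))
var = sum(x**2 * binom_pmf(x, n, p) for x in range(n + 1)) - mean**2
# mean should come out near n*p = 3.0, var near n*p*(1-p) = 2.1
```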

Geometric Distribution

If $X$ is the number of Bernoulli trials needed to obtain the first success, then $P(X=x) = (1-p)^{x-1} \cdot p$

$E[X] = \frac{1}{p}$ and $\mathrm{Var}[X] = \frac{1-p}{p^2}$
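The geometric p.m.f. and a truncated-sum check of $E[X] = 1/p$ (a sketch with an illustrative $p$; the tail beyond the truncation point is negligible here):

```python
def geom_pmf(x, p):
    """P(X = x) = (1-p)^(x-1) * p: first success on trial x."""
    return (1 - p) ** (x - 1) * p

p = 0.5
# Truncate the infinite sum E[X] = sum x * P(X = x); terms decay geometrically.
approx_mean = sum(x * geom_pmf(x, p) for x in range(1, 200))
# approx_mean should be very close to 1/p = 2.0
```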

Negative Binomial Distribution

If $X$ is the trial on which the $k$-th success occurs, then

$P(X=x) = \binom{x-1}{k-1}p^{k-1}(1-p)^{x-k}\cdot p = \binom{x-1}{k-1}p^k(1-p)^{x-k}$

$E[X] = \frac{k}{p}$ and $\mathrm{Var}[X] = \frac{k(1-p)}{p^2}$
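The same kind of sketch for the negative binomial: its p.m.f. and a truncated-sum check of $E[X] = k/p$ (illustrative parameters):

```python
import math

def nbinom_pmf(x, k, p):
    """P(X = x) = C(x-1, k-1) p^k (1-p)^(x-k): k-th success on trial x."""
    return math.comb(x - 1, k - 1) * p**k * (1 - p) ** (x - k)

k, p = 3, 0.5
# X ranges over x = k, k+1, ...; truncate the sum where terms are negligible.
approx_mean = sum(x * nbinom_pmf(x, k, p) for x in range(k, 200))
# approx_mean should be very close to k/p = 6.0
```

Setting $k = 1$ recovers the geometric distribution as a special case.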

Poisson Distribution

$P(X=x) = \frac{\lambda^x e^{-\lambda}}{x!}$

$E[X] = \lambda$ and $\mathrm{Var}[X] = \lambda$

where $\lambda$ is the rate of a Poisson process.
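Finally, a sketch of the Poisson p.m.f. with a truncated-sum check that the mean equals $\lambda$ (the value of $\lambda$ is illustrative):

```python
import math

def poisson_pmf(x, lam):
    """P(X = x) = lambda^x e^(-lambda) / x! for x = 0, 1, 2, ..."""
    return lam**x * math.exp(-lam) / math.factorial(x)

lam = 4.0
# X ranges over all non-negative integers; truncate where terms are negligible.
approx_mean = sum(x * poisson_pmf(x, lam) for x in range(100))
# approx_mean should be very close to lambda = 4.0
```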