
Probability-generating function

In probability theory, the probability-generating function of a discrete random variable is a power series representation (the generating function) of the probability mass function of the random variable. Probability-generating functions are often employed for their succinct description of the sequence of probabilities Pr(X = i), and to make available the well-developed theory of power series with non-negative coefficients.

Definition

If X is a discrete random variable taking values on some subset of the non-negative integers, {0, 1, 2, ...}, then the probability-generating function of X is defined as:

<math>G(z) = \textrm{E}(z^X) = \sum_{i=0}^{\infty}f(i)z^i,</math>
where f is the probability mass function of X. The notation GX is sometimes used to distinguish between the probability-generating functions of several random variables.
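
For example, if X is a Bernoulli random variable with Pr(X = 1) = p and Pr(X = 0) = 1 - p, then

<math>G(z) = \textrm{E}(z^X) = (1-p) + pz.</math>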

Properties

Power series

Probability-generating functions obey all the rules of power series with non-negative coefficients. In particular, since the probabilities are non-negative and sum to one, the series converges absolutely for |z| ≤ 1, and so the radius of convergence of any probability-generating function is at least 1. Moreover, by Abel's theorem for power series with non-negative coefficients, G(1-) = 1. (Note that G(1-) = limz↑1G(z).)
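
The radius of convergence can be strictly greater than 1. For example, if X is geometrically distributed, with Pr(X = i) = p(1-p)^i for i = 0, 1, 2, ..., then

<math>G(z) = \sum_{i=0}^{\infty}p(1-p)^i z^i = \frac{p}{1-(1-p)z},</math>

which has radius of convergence 1/(1-p), strictly greater than 1 for p > 0.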

Probabilities and expectations

The following properties allow the derivation of various basic quantities related to X:

  1. The probability mass function of X is recovered by taking derivatives of G:

                  <math>\quad f(k) = \textrm{Pr}(X = k) = \frac{G^{(k)}(0)}{k!}.</math>
     

  2. It follows from Property 1 that if we have two random variables X and Y, and GX = GY, then fX = fY. That is, if X and Y have identical probability-generating functions, then they are identically distributed.
     
  3. The expectation of X is given by

                  <math> \textrm{E}(X) = G'(1-).</math>

    More generally, the kth factorial moment, E(X(X - 1) ... (X - k + 1)), of X is given by

                  <math>\textrm{E}\left(X(X-1)\ldots(X-k+1)\right) = G^{(k)}(1-), \quad k \geq 1.</math>
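
    In particular, taking k = 2 gives E(X(X - 1)) = G''(1-), from which the variance of X can be recovered:

                  <math>\textrm{Var}(X) = G''(1-) + G'(1-) - \left(G'(1-)\right)^2.</math>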
     

Sums of independent random variables

Probability-generating functions are particularly useful for dealing with sums of independent random variables. If X1, X2, ..., Xn is a sequence of independent (and not necessarily identically distributed) random variables, and

<math>S_n = \sum_{i=1}^n X_i,</math>
then the probability-generating function of Sn is given by
<math>G_{S_n}(z) = G_{X_1}(z)G_{X_2}(z)\ldots G_{X_n}(z).</math>
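
For example, if each Xi is a Bernoulli random variable with parameter p, as in the example above, then Sn has a binomial distribution and

<math>G_{S_n}(z) = \left((1-p) + pz\right)^n.</math>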
Further, suppose that N is a discrete random variable taking values on the non-negative integers, independent of the Xi, with probability-generating function GN. If X1, X2, ... are independent and identically distributed with common probability-generating function GX, then
<math>G_{S_N}(z) = G_N(G_X(z)).</math>
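
For example, if N has a Poisson distribution with mean λ (see the Examples section below), then

<math>G_{S_N}(z) = e^{\lambda(G_X(z)-1)},</math>

the probability-generating function of a compound Poisson distribution.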

Examples
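
If X is the constant random variable with Pr(X = c) = 1 for some fixed non-negative integer c, then G(z) = z^c.

If X has a Poisson distribution with mean λ, then

<math>G(z) = \sum_{i=0}^{\infty}\frac{e^{-\lambda}\lambda^i}{i!}z^i = e^{\lambda(z-1)},</math>

and, by Property 3, E(X) = G'(1-) = λ.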

Related concepts

The probability-generating function is occasionally called the z-transform of the probability mass function. It is an example of a generating function of a sequence (see formal power series).

Other generating functions of random variables include the moment-generating function and the characteristic function.
