3. PROBABILITY DISTRIBUTIONS I: BASIC CONCEPTS

3.2.3. Moment Generating and Characteristic Functions

The usefulness of moments partly stems from the fact that knowledge of them determines the form of the density function. Formally, if the moments $\mu'_n$ of a random variable $x$ exist and the series

\[
\sum_{n=0}^{\infty} \frac{\mu'_n r^n}{n!} \qquad (3.10)
\]

converges absolutely for some $r > 0$, then the set of moments $\mu'_n$ uniquely determines the density function. There are exceptions to this statement, but fortunately it is true for all the distributions commonly met in physical science. In practice, knowledge of the first few moments essentially determines the general characteristics of the distribution, and so it is worthwhile to construct a function that gives a representation of all the moments. Such a function is called a moment generating function (mgf) and is defined by

\[
M_x(t) = E[e^{xt}]. \qquad (3.11)
\]

For a discrete random variable $x$, this is

\[
M_x(t) = \sum_x e^{xt} f(x), \qquad (3.12a)
\]

and for a continuous variable,

\[
M_x(t) = \int_{-\infty}^{+\infty} e^{xt} f(x)\,dx. \qquad (3.12b)
\]

The moments may be generated from (3.11) by first expanding the exponential,

\[
M_x(t) = E\left[1 + xt + \tfrac{1}{2}(xt)^2 + \cdots\right] = \sum_{n=0}^{\infty} \frac{\mu'_n t^n}{n!},
\]

then differentiating $n$ times and setting $t = 0$, that is,

\[
\mu'_n = \left.\frac{d^n M_x(t)}{dt^n}\right|_{t=0}. \qquad (3.13)
\]

For example, setting $n = 0$ and $n = 1$ gives $\mu'_0 = 1$ and $\mu'_1 = \mu$. Also, since the mgf about any point $\lambda$ is $M_x(t) = E[\exp\{(x - \lambda)t\}]$, then if $\lambda = \mu$,

\[
M_\mu(t) = e^{-\mu t} M_x(t). \qquad (3.14)
\]

An important use of the mgf is to compare two density functions $f(x)$ and $g(x)$. If two random variables possess mgfs that are equal for some interval symmetric about the origin, then $f(x)$ and $g(x)$ are identical density functions. It is also straightforward to show that the mgf of a sum of independent random variables is equal to the product of their individual mgfs.

It is sometimes convenient to consider, instead of the mgf, its logarithm.
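As a quick numerical illustration of relation (3.13), the sketch below recovers the first three moments of a unit exponential density, $f(x) = e^{-x}$ for $x \ge 0$, from its mgf $M_x(t) = 1/(1-t)$ (a standard result quoted here for illustration; it is not derived in the text). The derivatives at $t = 0$ are approximated by central finite differences, and the function names and step size are illustrative choices, not part of the text.

```python
from math import comb

# mgf of a unit exponential density f(x) = e^(-x), x >= 0 (standard result,
# quoted for illustration): M(t) = 1/(1 - t), valid for t < 1.
def mgf(t):
    return 1.0 / (1.0 - t)

# n-th derivative of f at 0 via a central finite-difference stencil,
# approximating mu'_n = d^n M / dt^n at t = 0, i.e. eq. (3.13).
def moment(f, n, h=1e-3):
    return sum((-1) ** k * comb(n, k) * f((n / 2 - k) * h)
               for k in range(n + 1)) / h ** n

print(moment(mgf, 1))  # close to 1! = 1
print(moment(mgf, 2))  # close to 2! = 2
print(moment(mgf, 3))  # close to 3! = 6
```

For the exponential density the exact moments are $\mu'_n = n!$, so the printed values should sit close to 1, 2, and 6; shrinking $h$ much further trades truncation error for round-off error.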
The Taylor expansion³ of $\ln M_x(t)$ is

\[
\ln M_x(t) = \kappa_1 t + \kappa_2 \frac{t^2}{2!} + \cdots,
\]

where $\kappa_n$ is the cumulant of order $n$, and

\[
\kappa_n = \left.\frac{d^n \ln M_x(t)}{dt^n}\right|_{t=0}.
\]

Cumulants are simply related to the central moments of the distribution, the first few relations being

\[
\kappa_i = \mu_i \quad (i = 1, 2, 3), \qquad \kappa_4 = \mu_4 - 3\mu_2^2.
\]

³Some essential mathematics is reviewed briefly in Appendix A.

For some distributions the integral defining the mgf may not exist, and in these circumstances the Fourier transform of the density function, defined as

\[
\phi_x(t) = \int_{-\infty}^{+\infty} e^{itx} f(x)\,dx = M_x(it), \qquad (3.15)
\]

may be used. In statistics, $\phi_x(t)$ is called the characteristic function (cf). The density function is then obtainable by the Fourier transform theorem (known in this context as the inversion theorem):

\[
f(x) = \frac{1}{2\pi} \int_{-\infty}^{+\infty} e^{-itx} \phi_x(t)\,dt. \qquad (3.16)
\]

The cf obeys theorems analogous to those obeyed by the mgf, that is: (a) if two random variables possess cfs that are equal for some interval symmetric about the origin, then they have identical density functions; and (b) the cf of a sum of independent random variables is equal to the product of their individual cfs. The converse of (b) is, however, untrue.

EXAMPLE 3.5
Find the moment generating function of the density function used in Example 3.2 and calculate the three moments $\mu'_1$, $\mu'_2$, and $\mu'_3$.

Using definition (3.12b),

\[
M_x(t) = \int e^{xt} f(x)\,dx = \frac{1}{2}\int_0^{\infty} e^{xt} x^2 e^{-x}\,dx = \frac{1}{2}\int_0^{\infty} e^{-x(1-t)} x^2\,dx,
\]

which, on integrating by parts, gives

\[
M_x(t) = \frac{1}{(1-t)^3}, \qquad t < 1.
\]

Then, using (3.13), the first three moments of the distribution are found to be $\mu'_1 = 3$, $\mu'_2 = 12$, $\mu'_3 = 60$.

EXAMPLE 3.6
(a) Find the characteristic function of the density function

\[
f(x) = \begin{cases} 2x/a^2 & 0 < x < a \\ 0 & \text{otherwise} \end{cases},
\]

and (b) the density function corresponding to a characteristic function $e^{-|t|}$.

(a) From (3.15),

\[
\phi_x(t) = \frac{2}{a^2}\int_0^a e^{itx} x\,dx = \frac{2}{a^2(it)^2}\left[e^{ita}(ita - 1) + 1\right].
\]

(b) From the inversion theorem,

\[
f(x) = \frac{1}{2\pi}\int_{-\infty}^{+\infty} e^{-itx} e^{-|t|}\,dt.
\]
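The inversion theorem (3.16) can also be checked numerically for the characteristic function $e^{-|t|}$ of Example 3.6(b). Since this cf is real and even, the integrand reduces to $\cos(tx)\,e^{-|t|}$, and a simple trapezoidal rule over a truncated range suffices. The comparison value $1/[\pi(1+x^2)]$ (the Cauchy density) is the standard result for this cf, quoted here only as a check; the truncation limit and step count below are illustrative choices.

```python
from math import exp, cos, pi

def cf(t):
    # characteristic function from Example 3.6(b)
    return exp(-abs(t))

def invert(x, T=40.0, n=200_000):
    # Eq. (3.16): f(x) = (1/2pi) * integral over all t of e^(-itx) cf(t) dt.
    # cf is real and even, so this reduces to
    # (1/pi) * integral from 0 to T of cos(t x) cf(t) dt (tail beyond T is
    # negligible), approximated here by the trapezoidal rule.
    h = T / n
    s = 0.5 * (cf(0.0) + cos(T * x) * cf(T))
    for k in range(1, n):
        t = k * h
        s += cos(t * x) * cf(t)
    return s * h / pi

for x in (0.0, 1.0, 2.0):
    exact = 1.0 / (pi * (1.0 + x * x))   # standard Cauchy density, for comparison
    print(round(invert(x), 4), round(exact, 4))
```

The two printed columns should agree to the displayed precision, illustrating that the inversion integral recovers the density point by point.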