Examples of tests, Brno 2007

1 Tests on parameters of normal distribution

Let $X_1,\dots,X_n$ be a sample from $N(\xi,\sigma^2)$ with the density
$$(2\pi\sigma^2)^{-n/2}\exp\Big(-\frac{n\xi^2}{2\sigma^2}\Big)\exp\Big(-\frac{1}{2\sigma^2}\sum x_i^2+\frac{\xi}{\sigma^2}\sum x_i\Big),\qquad \theta=\frac{\xi}{\sigma^2},\quad \vartheta=-\frac{1}{2\sigma^2},\quad U(x)=\sum x_i^2,\quad T(x)=\bar x.$$

1.1 Tests on $\sigma^2$

Denote $\vartheta_0=-\frac{1}{2\sigma_0^2}$. Then the statistic $V=\sum(X_i-\bar X)^2=U-nT^2$ is ancillary for $\theta$ and is increasing in $U$. Then $H_1:\sigma\le\sigma_0$ is equivalent to $\vartheta\le\vartheta_0$, and the UMP unbiased test rejects $H_1$ if $\sum(X_i-\bar X)^2>C_0$, where
$$\int_{C_0/\sigma_0^2}^{\infty}\chi^2_{n-1}(y)\,dy=\alpha.$$
Consider the hypothesis $H_2:\sigma=\sigma_0$. Because $V$ is linear in $U$, the UMP unbiased test rejects $H_2$ provided $\sum(X_i-\bar X)^2/\sigma_0^2\le C_1$ or $\ge C_2$, where the constants are determined so that
$$\int_{C_1}^{C_2}\chi^2_{n-1}(y)\,dy=\frac{1}{n-1}\int_{C_1}^{C_2}y\,\chi^2_{n-1}(y)\,dy=1-\alpha.$$
On the other hand, $\frac{1}{n-1}\,y\,\chi^2_{n-1}(y)=\chi^2_{n+1}(y)$, so the second condition can also be written as $\int_{C_1}^{C_2}\chi^2_{n+1}(y)\,dy=1-\alpha$.

1.2 Tests on $\xi$

Consider the hypothesis $H_3:\xi\le 0$. The UMP unbiased test rejects $H_3$ when $t(x)>C_0$, where
$$t(x)=\frac{\sqrt{n}\,\bar x}{\sqrt{\frac{1}{n-1}\sum(x_i-\bar x)^2}},\qquad \int_{C_0}^{\infty}t_{n-1}(y)\,dy=\alpha.$$
Consider the hypothesis $H_4:\xi=0$ and the statistic
$$W(x)=\frac{\bar x}{\sqrt{\sum x_i^2}}.$$
Then $W$ is also independent of $T=\sum X_i^2$ under $\xi=0$ and it is linear in $U=\bar X$. Under $\xi=0$, the distribution of $W$ is symmetric around 0. Thus, the UMP unbiased test of $H_4$ would reject it for $|W|>C$, where $P_{\xi=0}(W>C)=\alpha/2$. Because
$$t(x)=\frac{\sqrt{(n-1)n}\,W(x)}{\sqrt{1-nW^2(x)}},$$
$|t(x)|$ is an increasing function of $|W(x)|$, and the rejection region of the UMP unbiased test is equivalent to
$$|t(x)|>C,\qquad \int_{C}^{\infty}t_{n-1}(y)\,dy=\frac{\alpha}{2}.$$
The tests of the more general hypotheses with $\xi_0\ne 0$ would have the criterion
$$t(x)=\frac{\sqrt{n}(\bar x-\xi_0)}{\sqrt{\frac{1}{n-1}\sum(x_i-\bar x)^2}}.$$

2 Comparing the variances of two normal distributions

Let $X=(X_1,\dots,X_m)$ and $Y=(Y_1,\dots,Y_n)$ be samples from the normal distributions $N(\xi,\sigma^2)$ and $N(\eta,\tau^2)$. Then the joint density of $(X,Y)$ is
$$C(\xi,\eta,\sigma,\tau)\exp\Big(-\frac{1}{2\sigma^2}\sum x_i^2-\frac{1}{2\tau^2}\sum y_j^2+\frac{m\xi}{\sigma^2}\bar x+\frac{n\eta}{\tau^2}\bar y\Big).$$
Consider the reparametrization
$$\theta=\frac{1}{2\Delta_0\sigma^2}-\frac{1}{2\tau^2},\qquad \vartheta_1=-\frac{1}{2\sigma^2},\qquad \vartheta_2=\frac{m\xi}{\sigma^2},\qquad \vartheta_3=\frac{n\eta}{\tau^2},$$
and the sufficient statistics
$$U=\sum Y_j^2,\qquad T_1=\sum X_i^2+\frac{1}{\Delta_0}\sum Y_j^2,\qquad T_2=\bar X,\qquad T_3=\bar Y.$$
We want to test the hypothesis $H_5:\tau^2\le\Delta_0\sigma^2$, which is equivalent to $\theta\le 0$. The UMP unbiased test rejects $H_5$ when
$$W=\frac{\sum(Y_j-\bar Y)^2/\Delta_0}{\sum(X_i-\bar X)^2+\sum(Y_j-\bar Y)^2/\Delta_0}>C,\qquad \int_{C}^{1}B_{\frac{n-1}{2},\frac{m-1}{2}}(y)\,dy=\alpha.$$
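As a numerical sketch of the tests on $\sigma^2$ above (the sample, $\sigma_0$, and $\alpha$ below are made-up illustrations; scipy is assumed available, and the two-sided constants are taken equal-tailed, which only approximates the exact UMP unbiased constants):

```python
import numpy as np
from scipy import stats

# Simulated sample (illustrative data, not from the text)
rng = np.random.default_rng(0)
n, sigma0, alpha = 20, 2.0, 0.05
x = rng.normal(loc=1.0, scale=2.0, size=n)

# V = sum (X_i - Xbar)^2;  V / sigma^2 has the chi^2_{n-1} distribution
V = np.sum((x - x.mean()) ** 2)

# One-sided test of H1: sigma <= sigma0 -- reject for large V
C0 = sigma0 ** 2 * stats.chi2.ppf(1 - alpha, df=n - 1)
reject_H1 = V > C0
p_value = stats.chi2.sf(V / sigma0 ** 2, df=n - 1)

# Two-sided test of H2: sigma = sigma0 -- equal-tailed approximation
# (the exact UMP unbiased C1, C2 solve the two integral conditions)
C1 = stats.chi2.ppf(alpha / 2, df=n - 1)
C2 = stats.chi2.ppf(1 - alpha / 2, df=n - 1)
reject_H2 = (V / sigma0 ** 2 <= C1) or (V / sigma0 ** 2 >= C2)
```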
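The monotone relation between $W$ and the $t$ statistic from Section 1.2 can be checked numerically; the data are simulated, and `scipy.stats.ttest_1samp` is used only for comparison:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n = 15
x = rng.normal(loc=0.0, scale=1.0, size=n)

# Classical t statistic for H4: xi = 0
t = np.sqrt(n) * x.mean() / np.sqrt(np.sum((x - x.mean()) ** 2) / (n - 1))

# W = Xbar / sqrt(sum X_i^2) and its monotone transform to t
W = x.mean() / np.sqrt(np.sum(x ** 2))
t_from_W = np.sqrt((n - 1) * n) * W / np.sqrt(1 - n * W ** 2)

# Two-sided p-value: P(|t_{n-1}| > |t|)
p_two_sided = 2 * stats.t.sf(abs(t), df=n - 1)
t_scipy, p_scipy = stats.ttest_1samp(x, popmean=0.0)
```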
If we want to test the hypothesis $H_6:\tau^2=\Delta_0\sigma^2$, the UMP unbiased test rejects when $W\le C_1$ or $W\ge C_2$, where $C_1$, $C_2$ are determined through the relations
$$\int_{C_1}^{C_2}B_{\frac{n-1}{2},\frac{m-1}{2}}(w)\,dw=\int_{C_1}^{C_2}B_{\frac{n+1}{2},\frac{m-1}{2}}(w)\,dw=1-\alpha.$$
The relation between the beta and the F-distribution is such that $W=\frac{V}{1+V}$, where $\frac{m-1}{n-1}V$ has the distribution $F_{n-1,m-1}$.

3 Comparing the means of two normal distributions

If we want to compare $\xi$ and $\eta$ and the variances $\sigma^2$ and $\tau^2$ are unknown and unequal, then it is the Behrens-Fisher problem, which cannot be treated in a similar way. Thus suppose that $\sigma=\tau$. Then the joint density of $(X,Y)$ is
$$C(\xi,\eta,\sigma)\exp\Big(-\frac{1}{2\sigma^2}\Big(\sum x_i^2+\sum y_j^2\Big)+\frac{\xi}{\sigma^2}\sum x_i+\frac{\eta}{\sigma^2}\sum y_j\Big).$$
Consider the hypothesis $H_7:\eta-\xi\le 0$. Then it is better to reparametrize
$$\theta^*=\frac{\eta-\xi}{\big(\frac{1}{m}+\frac{1}{n}\big)\sigma^2},\qquad \vartheta_1^*=\frac{m\xi+n\eta}{(m+n)\sigma^2},\qquad \vartheta_2^*=-\frac{1}{2\sigma^2},$$
with the sufficient statistics
$$U^*=\bar Y-\bar X,\qquad T_1^*=m\bar X+n\bar Y,\qquad T_2^*=\sum X_i^2+\sum Y_j^2.$$
When $\eta=\xi$, then we have the statistic
$$V=\frac{U^*}{\sqrt{T_2^*-\frac{1}{m+n}T_1^{*2}-\frac{mn}{m+n}U^{*2}}}=\frac{\bar Y-\bar X}{\sqrt{\sum(X_i-\bar X)^2+\sum(Y_j-\bar Y)^2}}$$
that does not depend on $\xi=\eta$ and on $\sigma$. The UMP unbiased test of $H_7$ rejects when $t(X,Y)>C_0$, where
$$t(X,Y)=\frac{(\bar Y-\bar X)\big/\sqrt{\frac{1}{m}+\frac{1}{n}}}{\sqrt{\big[\sum(X_i-\bar X)^2+\sum(Y_j-\bar Y)^2\big]/(m+n-2)}}$$
and $\int_{C_0}^{\infty}t_{m+n-2}(y)\,dy=\alpha$.

For the hypothesis $H_8:\eta-\xi=0$, we should consider the statistic (linear in $U^*$ with coefficients dependent only on $T^*$, with distribution symmetric around 0)
$$W=\frac{\bar Y-\bar X}{\sqrt{\sum X_i^2+\sum Y_j^2-\frac{1}{m+n}\big(\sum X_i+\sum Y_j\big)^2}}$$
that is related to $V$ through
$$V=\frac{W}{\sqrt{1-\frac{mn}{m+n}W^2}}.$$
Hence, we reject $H_8$ provided $|t(X,Y)|>C$, where $\int_{C}^{\infty}t_{m+n-2}(y)\,dy=\frac{\alpha}{2}$.

3.1 Test of independence in bivariate normal distribution

Let $(X_1,Y_1),\dots,(X_n,Y_n)$ be a sample from the bivariate normal distribution with density
$$\Big(2\pi\sigma\tau\sqrt{1-\rho^2}\Big)^{-n}\exp\Big(-\frac{1}{2(1-\rho^2)}\Big[\frac{1}{\sigma^2}\sum(x_i-\xi)^2-\frac{2\rho}{\sigma\tau}\sum(x_i-\xi)(y_i-\eta)+\frac{1}{\tau^2}\sum(y_i-\eta)^2\Big]\Big).$$
Consider the hypothesis $H_9:\rho\le 0$ and the sample correlation coefficient
$$R=\frac{\sum(X_i-\bar X)(Y_i-\bar Y)}{\sqrt{\sum(X_i-\bar X)^2\sum(Y_i-\bar Y)^2}}.$$
$R$ is unchanged by the transformations $X_i\mapsto\frac{X_i-\xi}{\sigma}$ and $Y_i\mapsto\frac{Y_i-\eta}{\tau}$, and hence under $\rho=\theta=0$ its distribution is independent of $\vartheta_1,\dots,\vartheta_4$. It is nondecreasing in $U$. Thus the UMP unbiased test of $H_9$ rejects when $R>C_0$ or, equivalently, when
$$\frac{R}{\sqrt{1-R^2}}\sqrt{n-2}>K_0,$$
where $K_0$ is determined so that $\int_{K_0}^{\infty}t_{n-2}(y)\,dy=\alpha$.
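The beta form of $W$ and the equivalent $F$ statistic give the same one-sided p-value, which can be checked with simulated samples (all values below are illustrative; scipy is assumed):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
m, n, Delta0 = 12, 10, 1.0
x = rng.normal(loc=0.0, scale=1.0, size=m)   # sample from N(xi, sigma^2)
y = rng.normal(loc=0.0, scale=1.0, size=n)   # sample from N(eta, tau^2)

Sx = np.sum((x - x.mean()) ** 2)
Sy = np.sum((y - y.mean()) ** 2)

# W has the beta distribution B_{(n-1)/2,(m-1)/2} when tau^2 = Delta0 sigma^2
W = (Sy / Delta0) / (Sx + Sy / Delta0)
p_beta = stats.beta.sf(W, (n - 1) / 2, (m - 1) / 2)

# Equivalent variance-ratio statistic with the F_{n-1,m-1} distribution
F = (Sy / Delta0 / (n - 1)) / (Sx / (m - 1))
p_F = stats.f.sf(F, n - 1, m - 1)
```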
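A sketch of the pooled two-sample $t$ test of $H_7$ and $H_8$, compared against `scipy.stats.ttest_ind` (the samples are simulated for illustration):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
m, n = 8, 11
x = rng.normal(loc=0.0, scale=1.0, size=m)
y = rng.normal(loc=0.5, scale=1.0, size=n)

# Pooled variance estimate and the t statistic with m+n-2 degrees of freedom
s2 = (np.sum((x - x.mean()) ** 2) + np.sum((y - y.mean()) ** 2)) / (m + n - 2)
t = (y.mean() - x.mean()) / np.sqrt(s2 * (1 / m + 1 / n))

p_one_sided = stats.t.sf(t, df=m + n - 2)           # H7: eta - xi <= 0
p_two_sided = 2 * stats.t.sf(abs(t), df=m + n - 2)  # H8: eta - xi = 0

t_scipy, p_scipy = stats.ttest_ind(y, x, equal_var=True)
```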
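The transformation of $R$ to a $t_{n-2}$-distributed statistic can be verified against `scipy.stats.pearsonr` (simulated independent samples, so $\rho=0$ actually holds here):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
n = 25
x = rng.normal(size=n)
y = rng.normal(size=n)       # independent of x

# Sample correlation coefficient R
R = np.corrcoef(x, y)[0, 1]

# Monotone transform of R with the t_{n-2} distribution under rho = 0
t = R / np.sqrt(1 - R ** 2) * np.sqrt(n - 2)

p_one_sided = stats.t.sf(t, df=n - 2)            # H9:  rho <= 0
p_two_sided = 2 * stats.t.sf(abs(t), df=n - 2)   # H10: rho = 0

R_scipy, p_scipy = stats.pearsonr(x, y)
```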
The statistic $R$ is linear in $U$, hence the UMP unbiased test of $H_{10}:\rho=0$ rejects when
$$\frac{|R|}{\sqrt{1-R^2}}\sqrt{n-2}>K_1,\qquad\text{where } \int_{K_1}^{\infty}t_{n-2}(y)\,dy=\frac{\alpha}{2}.$$

3.2 Regression

We have observations $(Y_1,x_1),\dots,(Y_n,x_n)$ and consider the simple regression $E[Y|x]=\alpha+\beta x$. Start with the transformation
$$v_i=\frac{x_i-\bar x}{\sqrt{\sum(x_j-\bar x)^2}}$$
and denote $\gamma+\delta v_i=\alpha+\beta x_i$; thus $\sum v_i=0$, $\sum v_i^2=1$. Then
$$\alpha=\gamma-\frac{\delta\,\bar x}{\sqrt{\sum(x_j-\bar x)^2}},\qquad \beta=\frac{\delta}{\sqrt{\sum(x_j-\bar x)^2}}.$$
The joint density of $Y_1,\dots,Y_n$ is
$$(2\pi\sigma^2)^{-n/2}\exp\Big(-\frac{1}{2\sigma^2}\sum(y_i-\gamma-\delta v_i)^2\Big).$$
After the reparametrization, the sufficient statistics are $\hat\gamma=\bar Y$, $\hat\delta=\sum v_iY_i$ and $\sum Y_i^2$. The UMP unbiased test of the hypothesis $H_{11}:\delta=0$ (that is, $\beta=0$) rejects when $|t(Y)|>C$, where
$$t(Y)=\frac{\hat\delta}{\sqrt{\sum(Y_i-\hat\gamma-\hat\delta v_i)^2/(n-2)}},\qquad \int_{C}^{\infty}t_{n-2}(y)\,dy=\frac{\alpha}{2}.$$

4 Comparison of two Poisson distributions

Let $X$ and $Y$ be two independent variables with Poisson distributions $P(\lambda)$ and $P(\mu)$. The joint distribution of $X,Y$ is
$$P(X=x,Y=y)=\frac{e^{-(\lambda+\mu)}}{x!\,y!}\exp\Big(y\log\frac{\mu}{\lambda}+(x+y)\log\lambda\Big).$$
Reparametrization: $\theta=\log\frac{\mu}{\lambda}$, $\vartheta=\log\lambda$; sufficient statistics $U=Y$, $T=X+Y$. Consider the hypotheses $H_{12}:\mu\le\lambda\sim\theta\le 0$ and $H_{13}:\mu=\lambda\sim\theta=0$. The test is conditional for given $T=t$. The conditional distribution of $Y$ given $X+Y=t$ is
$$P(Y=y\mid X+Y=t)=\binom{t}{y}\Big(\frac{\mu}{\lambda+\mu}\Big)^{y}\Big(\frac{\lambda}{\lambda+\mu}\Big)^{t-y},$$
the binomial $B\big(t,\,p=\frac{\mu}{\lambda+\mu}\big)$. The hypothesis $H_{12}$ is equivalent to $p\le\frac12$ and is rejected when $Y>C(t)$, and rejected with probability $\gamma(t)$ when $Y=C(t)$, where
$$\sum_{i=C(t)+1}^{t}\binom{t}{i}+\gamma(t)\binom{t}{C(t)}=2^{t}\alpha.$$
Similarly, the hypothesis $H_{13}$ is equivalent to $p=\frac12$ and is rejected when $Y<C_1(t)$ or $Y>C_2(t)$, and rejected with probability $\gamma_i(t)$ when $Y=C_i(t)$; due to the symmetry it holds that $C_1(t)-\frac t2=\frac t2-C_2(t)$ and $\gamma_1(t)=\gamma_2(t)$.

5 Comparison of two binomial distributions

Let $X,Y$ be two independent variables with binomial distributions $B(m,p_1)$ and $B(n,p_2)$, respectively. Their joint distribution is
$$P(X=x,Y=y)=\binom{m}{x}\binom{n}{y}p_1^{x}q_1^{m-x}p_2^{y}q_2^{n-y}=\binom{m}{x}\binom{n}{y}q_1^{m}q_2^{n}\exp\Big(y\Big(\log\frac{p_2}{q_2}-\log\frac{p_1}{q_1}\Big)+(x+y)\log\frac{p_1}{q_1}\Big).$$
The test of the hypothesis $H_{14}:p_2\le p_1$, equivalent to $\theta=\log\frac{p_2}{q_2}-\log\frac{p_1}{q_1}\le 0$, is again conditional for given $T=X+Y=t$: it rejects when $Y>C(t)$ and rejects with probability $\gamma(t)$ when $Y=C(t)$, where
$$P\big(Y>C(t)\mid X+Y=t\big)+\gamma(t)\,P\big(Y=C(t)\mid X+Y=t\big)=\alpha.$$

6 Test for independence in a 2 x 2 table

Consider the population of $n$ individuals; for each individual we check whether it has properties $A$ and $B$.
The results we write in the table

$$\begin{array}{c|cc|c} & A & \bar A & \text{sums}\\ \hline B & X & X' & M\\ \bar B & Y & Y' & N\\ \hline \text{sums} & Q & Q' & n \end{array}$$

where $X$ is the number of individuals which have simultaneously $A$ and $B$, etc. Denote $p_{AB}$ the probability that an individual has both $A$ and $B$, and similarly we denote the probabilities $p_{\bar AB}$, $p_{A\bar B}$, $p_{\bar A\bar B}$. Moreover, let $p_A$ and $p_B$ be the probabilities that an individual has property $A$, $B$, respectively. Then
$$p_A=p_{AB}+p_{A\bar B},\qquad p_B=p_{AB}+p_{\bar AB}.$$
The joint distribution of the variables $X,Y,X',Y'$ is multinomial and is given by
$$P(X=x,X'=x',Y=y,Y'=y')=\frac{n!}{x!\,x'!\,y!\,y'!}\,p_{AB}^{x}\,p_{\bar AB}^{x'}\,p_{A\bar B}^{y}\,p_{\bar A\bar B}^{y'}\qquad(6.1)$$
$$=\frac{n!}{x!\,x'!\,y!\,y'!}\,\big(p_{\bar A\bar B}\big)^{n}\exp\Big(x\log\frac{p_{AB}}{p_{\bar A\bar B}}+x'\log\frac{p_{\bar AB}}{p_{\bar A\bar B}}+y\log\frac{p_{A\bar B}}{p_{\bar A\bar B}}\Big).$$
We wish to test the hypothesis that the properties $A$, $B$ appear independently in the population. More precisely, we wish to test the hypothesis of independence $H_1:p_{AB}=p_A\cdot p_B$ against $K_1:p_{AB}\ne p_A\cdot p_B$, or the hypothesis of positive dependence $H_2:p_{AB}\ge p_A\cdot p_B$ against $K_2:p_{AB}<p_A\cdot p_B$. In the reparametrization with
$$\theta_1=\log\frac{p_{AB}\,p_{\bar A\bar B}}{p_{\bar AB}\,p_{A\bar B}}$$
and the statistics $T_1=X$, $T_2=X+X'$, $T_3=X+Y$, the independence is equivalent to $\theta_1=0$ and the positive dependence to $\theta_1>0$.

The tests are conditional, based on the conditional distribution of $T_1$ given $T_2,T_3$ under $\theta_1=0$. First we get
$$P_{\theta_1=0}\big(X=x,Y=y\mid X+X'=m\big)=\binom{m}{x}\binom{n-m}{y}p_A^{x+y}(1-p_A)^{n-x-y},$$
and this under the condition $X+Y=q$ gives
$$P_{\theta_1=0}\big(X=x\mid X+X'=m,\ X+Y=q\big)=\frac{\binom{m}{x}\binom{n-m}{q-x}}{\binom{n}{q}}.$$
The conditional test of $H_2$ is called the Fisher-Irwin test; it has the form
$$\Phi(x,m,q)=\begin{cases}1 & \text{if } x<C_1(m,q),\\ \gamma_1(m,q) & \text{if } x=C_1(m,q),\\ 0 & \text{if } C_1(m,q)<x,\end{cases}$$
where $C_1,\gamma_1$ ($C_1$ integer) are determined so that
$$\sum_{i<C_1}\binom{m}{i}\binom{n-m}{q-i}+\gamma_1\binom{m}{C_1}\binom{n-m}{q-C_1}=\alpha\binom{n}{q}.$$
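A sketch of the randomized Fisher-Irwin test of Section 6: the table counts and the level below are made up, and scipy's hypergeometric distribution is used to find the critical value $C_1$ and the randomization probability $\gamma_1$ with exact size $\alpha$:

```python
from scipy import stats

# Illustrative 2x2 table: X = #(A and B), Xp = #(A' and B),
# Y = #(A and B'), Yp = #(A' and B')
X, Xp, Y, Yp = 2, 8, 9, 6
n = X + Xp + Y + Yp
m = X + Xp        # margin of B  (X + X' in the text)
q = X + Y         # margin of A  (X + Y in the text)
alpha = 0.05

# Conditional null distribution of X given both margins:
# P(X = x) = C(m, x) C(n - m, q - x) / C(n, q)  -- hypergeometric
dist = stats.hypergeom(n, m, q)

# Randomized test rejecting for small X: find integer C and gamma in [0, 1)
# with P(X < C) + gamma * P(X = C) = alpha exactly
C = 0
while dist.cdf(C) <= alpha:
    C += 1
gamma = (alpha - dist.cdf(C - 1)) / dist.pmf(C)
size = dist.cdf(C - 1) + gamma * dist.pmf(C)
```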
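Returning to the regression test of Section 3.2, the slope $t$ statistic can be compared with `scipy.stats.linregress` (data simulated for illustration):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
n = 20
x = np.linspace(0.0, 1.0, n)
y = 1.0 + 2.0 * x + rng.normal(scale=0.3, size=n)

# Least-squares estimates of the regression E[Y|x] = alpha + beta x
Sxx = np.sum((x - x.mean()) ** 2)
beta_hat = np.sum((x - x.mean()) * (y - y.mean())) / Sxx
alpha_hat = y.mean() - beta_hat * x.mean()

resid = y - alpha_hat - beta_hat * x
s2 = np.sum(resid ** 2) / (n - 2)

# t statistic for the hypothesis beta = 0, with the t_{n-2} distribution
t = beta_hat / np.sqrt(s2 / Sxx)
p_two_sided = 2 * stats.t.sf(abs(t), df=n - 2)

res = stats.linregress(x, y)
```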
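For the Poisson comparison of Section 4, the conditional test given $X+Y=t$ is a binomial test; a sketch with made-up counts, using `scipy.stats.binomtest`:

```python
from scipy import stats

# Observed counts (illustrative), X ~ P(lambda), Y ~ P(mu)
x, y = 7, 16
t = x + y

# Conditionally on X + Y = t, Y ~ B(t, p) with p = mu / (lambda + mu);
# H12: mu <= lambda becomes p <= 1/2 and is rejected for large Y
res = stats.binomtest(y, n=t, p=0.5, alternative='greater')
p_value = res.pvalue
```

This is the non-randomized version, so its size is at most (not exactly) $\alpha$.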
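For the two-binomial comparison of Section 5, the conditional distribution of $Y$ given $X+Y=t$ under $p_1=p_2$ is hypergeometric, so the (non-randomized) conditional test is the one-sided Fisher exact test; a sketch with illustrative counts:

```python
from scipy import stats

# X ~ B(m, p1), Y ~ B(n, p2); observed counts are made up
m, n = 12, 15
x, y = 3, 9
t = x + y

# One-sided Fisher exact test of p2 <= p1, rejecting for large Y
table = [[y, n - y], [x, m - x]]
odds_ratio, p_value = stats.fisher_exact(table, alternative='greater')

# The same conditional tail probability computed directly:
# P(Y >= y | X + Y = t) under the hypergeometric null distribution
p_hyper = stats.hypergeom.sf(y - 1, m + n, t, n)
```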