In the (approximate) Poisson process with mean $\lambda $ , we have seen that the waiting time until the first change has an exponential distribution. Now let W denote the waiting time until the $\alpha $ th change occurs, and let us find the distribution of W . The distribution function of W , when $w\ge 0$ , is given by
$$\begin{array}{l}F\left(w\right)=P\left(W\le w\right)=1-P\left(W>w\right)=1-P\left(\text{fewer than }\alpha \text{ changes occur in }\left[0,w\right]\right)\\ =1-{\displaystyle \sum _{k=0}^{\alpha -1}\frac{{\left(\lambda w\right)}^{k}{e}^{-\lambda w}}{k!}},\end{array}$$
since the number of changes in the interval $\left[0,w\right]$ has a Poisson distribution with mean $\lambda w$ . Because W is a continuous-type random variable, $F\text{'}\left(w\right)$ is equal to the p.d.f. of W whenever this derivative exists. We have, provided $w>0$ , that
$$\begin{array}{l}F\text{'}\left(w\right)=\lambda {e}^{-\lambda w}-{e}^{-\lambda w}{\displaystyle \sum _{k=1}^{\alpha -1}\left[\frac{k{\left(\lambda w\right)}^{k-1}\lambda}{k!}-\frac{{\left(\lambda w\right)}^{k}\lambda}{k!}\right]}=\lambda {e}^{-\lambda w}-{e}^{-\lambda w}\left[\lambda -\frac{\lambda {\left(\lambda w\right)}^{\alpha -1}}{\left(\alpha -1\right)!}\right]\\ =\frac{\lambda {\left(\lambda w\right)}^{\alpha -1}}{\left(\alpha -1\right)!}{e}^{-\lambda w}.\end{array}$$
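The agreement between the Poisson-sum form of $P\left(W>w\right)$ and the derived p.d.f. can be checked numerically. The sketch below (Python, with illustrative values $\lambda =1/2$ , $\alpha =3$ , $w=4$ chosen here for demonstration) compares the tail sum against a trapezoid-rule integral of $F\text{'}\left(w\right)$ :

```python
import math

def poisson_tail(lam, w, alpha):
    """P(W > w) = P(fewer than alpha changes in [0, w]) via the Poisson sum."""
    return sum((lam * w) ** k * math.exp(-lam * w) / math.factorial(k)
               for k in range(alpha))

def gamma_pdf(w, lam, alpha):
    """The derived p.d.f. F'(w) = lam * (lam*w)**(alpha-1) * e**(-lam*w) / (alpha-1)!."""
    return lam * (lam * w) ** (alpha - 1) * math.exp(-lam * w) / math.factorial(alpha - 1)

def tail_by_integration(w, lam, alpha, upper=200.0, n=100_000):
    """Integrate the p.d.f. from w to a large upper limit (trapezoid rule)."""
    h = (upper - w) / n
    total = 0.5 * (gamma_pdf(w, lam, alpha) + gamma_pdf(upper, lam, alpha))
    total += sum(gamma_pdf(w + i * h, lam, alpha) for i in range(1, n))
    return total * h

lam, alpha, w = 0.5, 3, 4.0
print(poisson_tail(lam, w, alpha))        # exact tail: 5 * e**(-2) ≈ 0.6767
print(tail_by_integration(w, lam, alpha)) # same value, by integrating F'(w)
```

Both computations should agree to several decimal places, confirming the telescoping in the derivative above.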
The gamma function is defined by $$\Gamma \left(t\right)={\displaystyle \underset{0}{\overset{\infty}{\int}}{y}^{t-1}{e}^{-y}dy},\quad 0<t.$$ This integral is positive for $0<t$ , because the integrand is positive. Values of it are often given in a table of integrals. If $t>1$ , integration of the gamma function of t by parts yields
$$\Gamma \left(t\right)={\left[-{y}^{t-1}{e}^{-y}\right]}_{0}^{\infty}+{\displaystyle \underset{0}{\overset{\infty}{\int}}\left(t-1\right){y}^{t-2}{e}^{-y}dy}=\left(t-1\right){\displaystyle \underset{0}{\overset{\infty}{\int}}{y}^{t-2}{e}^{-y}dy=}\left(t-1\right)\Gamma \left(t-1\right).$$
For example, $\Gamma \left(6\right)=5\Gamma \left(5\right)$ and $\Gamma \left(3\right)=2\Gamma \left(2\right)=\left(2\right)\left(1\right)\Gamma \left(1\right)$ . Whenever $t=n$ , a positive integer, we have, by repeated application of $\Gamma \left(t\right)=\left(t-1\right)\Gamma \left(t-1\right)$ , that $\Gamma \left(n\right)=\left(n-1\right)\Gamma \left(n-1\right)=\left(n-1\right)\left(n-2\right)\cdots \left(2\right)\left(1\right)\Gamma \left(1\right).$
However, $$\Gamma \left(1\right)={\displaystyle \underset{0}{\overset{\infty}{\int}}{e}^{-y}dy=1}.$$
Thus, when n is a positive integer, we have that $\Gamma \left(n\right)=\left(n-1\right)!$ ; for this reason, the gamma function is called the generalized factorial .
Incidentally, $\Gamma \left(1\right)$ corresponds to 0!, and we have noted that $\Gamma \left(1\right)=1$ , which is consistent with earlier discussions.
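Both the recursion $\Gamma \left(t\right)=\left(t-1\right)\Gamma \left(t-1\right)$ and the identity $\Gamma \left(n\right)=\left(n-1\right)!$ are easy to verify numerically with Python's standard-library `math.gamma`; the value $t=4.7$ below is just an arbitrary non-integer test point:

```python
import math

# For positive integers n, Gamma(n) equals (n-1)!.
for n in range(1, 8):
    assert math.isclose(math.gamma(n), math.factorial(n - 1))

# The recursion Gamma(t) = (t-1) * Gamma(t-1) holds for non-integer t as well.
t = 4.7
assert math.isclose(math.gamma(t), (t - 1) * math.gamma(t - 1))

print(math.gamma(6))  # 120.0, i.e. 5!
```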
The random variable X has a gamma distribution if its p.d.f. is defined by $$f\left(x\right)=\frac{{x}^{\alpha -1}{e}^{-x/\theta}}{\Gamma \left(\alpha \right){\theta}^{\alpha}},\quad 0\le x<\infty .$$
Hence W , the waiting time until the $\alpha $ th change in a Poisson process, has a gamma distribution with parameters $\alpha $ and $\theta =1/\lambda $ .
The function $f\left(x\right)$ actually has the properties of a p.d.f., because $f\left(x\right)\ge 0$ and
$$\underset{-\infty}{\overset{\infty}{\int}}f\left(x\right)dx={\displaystyle \underset{0}{\overset{\infty}{\int}}\frac{{x}^{\alpha -1}{e}^{-x/\theta}}{\Gamma \left(\alpha \right){\theta}^{\alpha}}dx},$$ which, by the change of variables $y=x/\theta $ , equals
$$\underset{0}{\overset{\infty}{\int}}\frac{{\left(\theta y\right)}^{\alpha -1}{e}^{-y}}{\Gamma \left(\alpha \right){\theta}^{\alpha}}\theta dy=\frac{1}{\Gamma \left(\alpha \right)}{\displaystyle \underset{0}{\overset{\infty}{\int}}{y}^{\alpha -1}{e}^{-y}dy}=\frac{\Gamma \left(\alpha \right)}{\Gamma \left(\alpha \right)}=1.$$
The mean and variance are: $\mu =\alpha \theta $ and ${\sigma}^{2}=\alpha {\theta}^{2}$ .
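These moment formulas can be checked by simulation. The sketch below uses Python's standard-library `random.gammavariate` (whose second argument is the scale parameter $\theta $ ) with the illustrative choices $\alpha =3$ , $\theta =2$ :

```python
import random

random.seed(1)  # fixed seed so the run is reproducible
alpha, theta = 3.0, 2.0
n = 200_000

# random.gammavariate(shape, scale): scale plays the role of theta here.
draws = [random.gammavariate(alpha, theta) for _ in range(n)]

mean = sum(draws) / n
var = sum((x - mean) ** 2 for x in draws) / n
print(mean)  # should be near alpha * theta    = 6
print(var)   # should be near alpha * theta**2 = 12
```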
Suppose that customers arrive at a shop in accordance with a Poisson process at an average rate of 30 per hour. That is, if a minute is our unit, then $\lambda =1/2$ . What is the probability that the shopkeeper will wait more than 5 minutes before both of the first two customers arrive? If X denotes the waiting time in minutes until the second customer arrives, then X has a gamma distribution with $\alpha =2$ , $\theta =1/\lambda =2$ . Hence,
$$P\left(X>5\right)={\displaystyle \underset{5}{\overset{\infty}{\int}}\frac{{x}^{2-1}{e}^{-x/2}}{\Gamma \left(2\right){2}^{2}}dx}={\displaystyle \underset{5}{\overset{\infty}{\int}}\frac{x{e}^{-x/2}}{4}dx}=\frac{1}{4}{\left[-2x{e}^{-x/2}-4{e}^{-x/2}\right]}_{5}^{\infty}=\frac{7}{2}{e}^{-5/2}\approx 0.287.$$
We could also have used the earlier equation with $\lambda =1/\theta $ , which applies because $\alpha $ is an integer: $$P\left(X>x\right)={\displaystyle \sum _{k=0}^{\alpha -1}\frac{{\left(x/\theta \right)}^{k}{e}^{-x/\theta}}{k!}}.$$ Thus, with $x=5$ , $\alpha =2$ , and $\theta =2$ , this is equal to
$$P\left(X>x\right)={\displaystyle \sum _{k=0}^{2-1}\frac{{\left(5/2\right)}^{k}{e}^{-5/2}}{k!}={e}^{-5/2}\left(1+\frac{5}{2}\right)=\left(\frac{7}{2}\right){e}^{-5/2}}.$$
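The closed-form tail sum from the example can be evaluated in a couple of lines of Python, reproducing the value $\left(7/2\right){e}^{-5/2}\approx 0.287$ :

```python
import math

alpha, theta, x = 2, 2.0, 5.0

# P(X > x) as a Poisson tail sum; valid because alpha is a positive integer.
tail = sum((x / theta) ** k * math.exp(-x / theta) / math.factorial(k)
           for k in range(alpha))

print(tail)  # (7/2) * e**(-5/2) ≈ 0.287
```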
Let us now consider a special case of the gamma distribution that plays an important role in statistics: the chi-square distribution, obtained by taking $\theta =2$ and $\alpha =r/2$ , where r is a positive integer called the number of degrees of freedom.
The mean and the variance of this chi-square distribution are
$$\mu =\alpha \theta =\left(\frac{r}{2}\right)2=r$$ and $${\sigma}^{2}=\alpha {\theta}^{2}=\left(\frac{r}{2}\right){2}^{2}=2r.$$
That is, the mean equals the number of degrees of freedom and the variance equals twice the number of degrees of freedom.
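Both facts can be checked by simulation. A standard characterization (stated here without derivation) is that a chi-square variate with r degrees of freedom arises as the sum of r squared standard normal variates; the sketch below uses that with the illustrative value $r=5$ :

```python
import random

random.seed(2)  # fixed seed for reproducibility
r, n = 5, 100_000

# Each draw: sum of r squared standard normals, a chi-square(r) variate.
draws = [sum(random.gauss(0, 1) ** 2 for _ in range(r)) for _ in range(n)]

mean = sum(draws) / n
var = sum((x - mean) ** 2 for x in draws) / n
print(mean)  # near r  = 5  (degrees of freedom)
print(var)   # near 2r = 10 (twice the degrees of freedom)
```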
Figure 2 shows graphs of the chi-square p.d.f. for r = 2, 3, 5, and 8.
Because the chi-square distribution is so important in applications, tables have been prepared giving the values of its distribution function for selected values of r and x .
Let X have a chi-square distribution with r = 5 degrees of freedom. Then, using tabulated values,
$$P\left(1.145\le X\le 12.83\right)=F\left(12.83\right)-F\left(1.145\right)=0.975-0.050=0.925$$
and $$P\left(X>15.09\right)=1-F\left(15.09\right)=1-0.99=\mathrm{0.01.}$$
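When no table is at hand, these probabilities can be recovered by numerically integrating the chi-square p.d.f. (the gamma p.d.f. with $\alpha =r/2$ , $\theta =2$ ). A minimal sketch using only the Python standard library, assuming $r\ge 2$ so the p.d.f. is finite at the origin:

```python
import math

def chisq_pdf(x, r):
    """Chi-square p.d.f.: gamma p.d.f. with alpha = r/2, theta = 2 (assumes r >= 2)."""
    return x ** (r / 2 - 1) * math.exp(-x / 2) / (math.gamma(r / 2) * 2 ** (r / 2))

def chisq_cdf(x, r, n=50_000):
    """F(x) = P(X <= x) by the trapezoid rule on [0, x]."""
    h = x / n
    total = 0.5 * (chisq_pdf(0.0, r) + chisq_pdf(x, r))
    total += sum(chisq_pdf(i * h, r) for i in range(1, n))
    return total * h

r = 5
print(chisq_cdf(12.83, r) - chisq_cdf(1.145, r))  # ≈ 0.975 - 0.050 = 0.925
print(1 - chisq_cdf(15.09, r))                    # ≈ 0.01
```

The computed values match the tabulated entries to the precision of the table.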
If X is ${\chi}^{2}\left(7\right)$ , two constants a and b such that $P\left(a<X<b\right)=0.95$ are a = 1.690 and b = 16.01.
Other constants a and b can be found; the choices above are restricted only by the limited entries in the table.
Probabilities like that in Example 4 are so important in statistical applications that one uses special symbols for a and b . Let $\alpha $ be a positive probability (usually less than 0.5) and let X have a chi-square distribution with r degrees of freedom. Then ${\chi}_{\alpha}^{2}\left(r\right)$ is a number such that $P\left[X\ge {\chi}_{\alpha}^{2}\left(r\right)\right]=\alpha .$
That is, ${\chi}_{\alpha}^{2}\left(r\right)$ is the 100(1- $\alpha $ )th percentile (or upper 100 $\alpha $ percent point) of the chi-square distribution with r degrees of freedom. The 100 $\alpha $ th percentile is the number ${\chi}_{1-\alpha}^{2}\left(r\right)$ such that $P\left[X\le {\chi}_{1-\alpha}^{2}\left(r\right)\right]=\alpha $ . That is, the probability to the right of ${\chi}_{1-\alpha}^{2}\left(r\right)$ is 1- $\alpha $ . See Figure 3.
Let X have a chi-square distribution with seven degrees of freedom. Then, using tabulated values, ${\chi}_{0.05}^{2}\left(7\right)=14.07$ and ${\chi}_{0.95}^{2}\left(7\right)=2.167.$ These are the points indicated on Figure 3.
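These upper percent points can also be computed without a table by inverting a numerically integrated distribution function with bisection. A sketch under the same assumptions as before (standard library only, $r\ge 2$ ); the bracket $\left[0,100\right]$ and the iteration counts are ad hoc choices that comfortably cover this example:

```python
import math

def chisq_pdf(x, r):
    """Chi-square p.d.f. (gamma with alpha = r/2, theta = 2; assumes r >= 2)."""
    return x ** (r / 2 - 1) * math.exp(-x / 2) / (math.gamma(r / 2) * 2 ** (r / 2))

def chisq_cdf(x, r, n=20_000):
    """F(x) by the trapezoid rule on [0, x]."""
    h = x / n
    total = 0.5 * (chisq_pdf(0.0, r) + chisq_pdf(x, r))
    total += sum(chisq_pdf(i * h, r) for i in range(1, n))
    return total * h

def chisq_upper_point(alpha, r):
    """Find c with P(X >= c) = alpha, i.e. F(c) = 1 - alpha, by bisection."""
    lo, hi = 0.0, 100.0
    for _ in range(40):
        mid = (lo + hi) / 2
        if chisq_cdf(mid, r) < 1 - alpha:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

print(chisq_upper_point(0.05, 7))  # ≈ 14.07, the tabulated chi2_{0.05}(7)
print(chisq_upper_point(0.95, 7))  # ≈ 2.167, the tabulated chi2_{0.95}(7)
```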