To derive the multivariate distribution of $\mathbf{W}$, we use the count statistics and the independence properties of the Poisson process. Letting $\Lambda_{t_1}^{t_2} = \int_{t_1}^{t_2} \lambda(\alpha)\,d\alpha$ denote the integrated intensity, the density we seek satisfies $$\int_{w_1}^{w_1+\Delta_1}\cdots\int_{w_n}^{w_n+\Delta_n} p_{\mathbf{W}^{(n)}}(\mathbf{v})\,dv_1\cdots dv_n = \Pr\!\left(W_1\in[w_1, w_1+\Delta_1),\ldots, W_n\in[w_n, w_n+\Delta_n)\right)$$ The expression on the right equals the probability that no events occur in $[t_1, w_1)$, one event occurs in $[w_1, w_1+\Delta_1)$, no event occurs in $[w_1+\Delta_1, w_2)$, etc. Because of the independence of event occurrence in these disjoint intervals, we can multiply together the probabilities of these event occurrences, each of which is given by the count statistics. $$\Pr\!\left(W_1\in[w_1, w_1+\Delta_1),\ldots, W_n\in[w_n, w_n+\Delta_n)\right) = e^{-\Lambda_{t_1}^{w_1}}\,\Lambda_{w_1}^{w_1+\Delta_1}e^{-\Lambda_{w_1}^{w_1+\Delta_1}}\,e^{-\Lambda_{w_1+\Delta_1}^{w_2}}\,\Lambda_{w_2}^{w_2+\Delta_2}e^{-\Lambda_{w_2}^{w_2+\Delta_2}}\cdots\Lambda_{w_n}^{w_n+\Delta_n}e^{-\Lambda_{w_n}^{w_n+\Delta_n}} \approx \prod_{k=1}^{n}\lambda(w_k)\Delta_k\, e^{-\Lambda_{t_1}^{w_n}}$$ for small $\Delta_k$. From this approximation (dividing by $\prod_k \Delta_k$), we find that the joint density of the first $n$ event times equals $$p_{\mathbf{W}^{(n)}}(\mathbf{w}) = \begin{cases}\displaystyle\prod_{k=1}^{n}\lambda(w_k)\,e^{-\Lambda_{t_1}^{w_n}} & t_1\le w_1\le\cdots\le w_n \\ 0 & \text{otherwise}\end{cases}$$
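As an illustrative check (the parameter values here are assumptions, not from the text), the $n=1$ case of this density in the stationary setting reduces to $p_{W_1}(w)=\lambda_0 e^{-\lambda_0 w}$ for $t_1=0$. The sketch below simulates the process only from its defining small-interval property, so the exponential form is a prediction rather than an input:

```python
import numpy as np

# Monte Carlo check of the n = 1 joint event-time density in the
# stationary case (t1 = 0, constant intensity lam0): the formula gives
# p_{W1}(w) = lam0 * exp(-lam0 * w).  We simulate from the defining
# property that an event occurs in a slot of width dt with probability
# lam0*dt, independently across slots; the slot index of the first
# event is then geometric with parameter lam0*dt.
rng = np.random.default_rng(1)
lam0, dt = 2.0, 1e-3

w1 = rng.geometric(lam0 * dt, size=200_000) * dt  # first event times

print(w1.mean())          # density predicts E[W1] = 1/lam0 = 0.5
print((w1 > 1.0).mean())  # density predicts P(W1 > 1) = exp(-2)
```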
For Poisson processes, the sample function density describes the joint distribution of counts and event times within a specified time interval. Thus, it can be written as $$p(\{N_t, t_1\le t< t_2\}) = \Pr\!\left(N_{t_1,t_2}=n \,\middle|\, W_1=w_1,\ldots, W_n=w_n\right)\, p_{\mathbf{W}^{(n)}}(\mathbf{w})$$ The second term in the product equals the distribution derived previously for the time of occurrence statistics. The conditional probability equals the probability that no events occur between $w_n$ and $t_2$; from the Poisson process's count statistics, this probability equals $e^{-\Lambda_{w_n}^{t_2}}$. Consequently, the sample function density for the Poisson process, be it stationary or not, equals $$p(\{N_t, t_1\le t< t_2\}) = \prod_{k=1}^{n}\lambda(w_k)\, e^{-\Lambda_{t_1}^{t_2}}$$
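The sample function density doubles as a likelihood for the intensity. For a stationary process ($\lambda(t)=\lambda_0$) it reduces to $\lambda_0^n e^{-\lambda_0(t_2-t_1)}$, whose logarithm is maximized at $\hat{\lambda}_0 = n/(t_2-t_1)$. A quick sketch (parameter values are illustrative assumptions) confirming that this estimate recovers the rate used in simulation:

```python
import numpy as np

# The stationary sample function density lam0^n * exp(-lam0*(t2 - t1))
# is maximized over lam0 at n / (t2 - t1).  Check that this maximum
# likelihood estimate recovers the simulated rate.
rng = np.random.default_rng(7)
lam0, t1, t2 = 3.0, 0.0, 1_000.0

n = rng.poisson(lam0 * (t2 - t1))   # observed event count
lam0_hat = n / (t2 - t1)            # maximizer of n*log(l) - l*(t2 - t1)
print(lam0_hat)                     # close to the true rate 3.0
```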
From the probability distributions derived on the previous pages, we can discern many structural properties of the Poisson process. These properties set the stage for delineating other point processes from the Poisson. Those processes, as described subsequently, have much more structure and are much more difficult to handle analytically.
The counting process $N_t$ is an independent increment process. For a Poisson process, the numbers of events in disjoint intervals are statistically independent of each other, meaning that we have an independent increment process. When the Poisson process is stationary, increments taken over equi-duration intervals are identically distributed as well as being statistically independent. Two important results obtain from this property. First, in the stationary case the counting process's covariance function $K_N(t, u)$ equals $\lambda_0\min\{t, u\}$. This close relation to the Wiener waveform process indicates the fundamental nature of the Poisson process in the world of point processes. Note, however, that the Poisson counting process is not continuous almost surely. Second, the sequence of counts forms an ergodic process, meaning we can estimate the intensity parameter from observations.
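The covariance formula follows directly from the independent-increment property, and a quick simulation makes it concrete (the particular values $\lambda_0=2$, $t=1$, $u=3$ are assumptions for illustration):

```python
import numpy as np

# Monte Carlo check of K_N(t, u) = lam0 * min(t, u) for a stationary
# Poisson counting process, constructed from its independent-increment
# property: N_u = N_t + (an independent Poisson increment on [t, u)).
rng = np.random.default_rng(2)
lam0, t, u, trials = 2.0, 1.0, 3.0, 400_000

N_t = rng.poisson(lam0 * t, size=trials)
N_u = N_t + rng.poisson(lam0 * (u - t), size=trials)  # independent increment

K = np.cov(N_t, N_u)[0, 1]
print(K)  # theory: lam0 * min(t, u) = 2.0
```

Because the increment on $[t,u)$ is independent of $N_t$, the covariance collapses to $\operatorname{Var}[N_t] = \lambda_0 t$, which is exactly $\lambda_0\min\{t,u\}$.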
The mean and variance of the number of events in an interval can be easily calculated from the Poisson distribution. Alternatively, we can calculate the characteristic function and evaluate its derivatives. The characteristic function of an increment equals $$\Phi_{N_{t_1,t_2}}(\nu) = e^{\left(e^{i\nu}-1\right)\Lambda_{t_1}^{t_2}}$$ The first two moments and variance of an increment of the Poisson process, be it stationary or not, equal $$E\!\left[N_{t_1,t_2}\right] = \Lambda_{t_1}^{t_2},\qquad E\!\left[N_{t_1,t_2}^2\right] = \Lambda_{t_1}^{t_2} + \left(\Lambda_{t_1}^{t_2}\right)^2,\qquad \operatorname{Var}\!\left[N_{t_1,t_2}\right] = \Lambda_{t_1}^{t_2}$$
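The equality of mean and variance holds for nonstationary processes too, with $\Lambda_{t_1}^{t_2}$ replacing $\lambda_0(t_2-t_1)$. A sketch under assumed choices (intensity $\lambda(t)=1+0.5\sin t$ on $[0,2\pi)$, so $\Lambda = 2\pi$; sample paths generated by the standard Lewis–Shedler thinning technique, not from the text):

```python
import numpy as np

# Monte Carlo check that a Poisson increment has mean and variance both
# equal to the integrated intensity Lambda.  Intensity lam(t) =
# 1 + 0.5*sin(t) on [0, 2*pi) integrates to exactly 2*pi.  Paths come
# from thinning a rate-lam_max stationary process.
rng = np.random.default_rng(3)
lam = lambda t: 1.0 + 0.5 * np.sin(t)
lam_max, t2, trials = 1.5, 2 * np.pi, 20_000

counts = np.empty(trials)
for i in range(trials):
    # candidate events from a stationary process of rate lam_max ...
    n_cand = rng.poisson(lam_max * t2)
    t_cand = rng.uniform(0.0, t2, size=n_cand)
    # ... kept with probability lam(t)/lam_max (thinning step)
    counts[i] = np.sum(rng.random(n_cand) < lam(t_cand) / lam_max)

print(counts.mean(), counts.var())  # both should be close to 2*pi
```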
Consider the conditional density $p_{W_n|W_{n-1},\ldots,W_1}(w_n|w_{n-1},\ldots,w_1)$. This density equals the ratio of the event time densities for the $n$- and ($n-1$)-dimensional event time vectors. Simple substitution yields $$p_{W_n|W_{n-1},\ldots,W_1}(w_n|w_{n-1},\ldots,w_1) = \lambda(w_n)\,e^{-\Lambda_{w_{n-1}}^{w_n}},\quad w_n\ge w_{n-1}$$ Note that this density depends only on $w_{n-1}$, not on the earlier event times.
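The ratio calculation can be spot-checked numerically. Here the intensity $\lambda(t)=1+0.5\sin t$ and the evaluation points are assumed for illustration; the integrated intensity has the closed form $\Lambda_a^b=(b-a)+0.5(\cos a-\cos b)$:

```python
import numpy as np

# Numeric spot check of the conditional-density ratio.  With the joint
# density p_{W^(n)}(w) = prod_k lam(w_k) * exp(-Lambda(t1, w_n)), the
# ratio of the n = 2 and n = 1 densities should equal
# lam(w2) * exp(-Lambda(w1, w2)).
lam = lambda t: 1.0 + 0.5 * np.sin(t)
Lam = lambda a, b: (b - a) + 0.5 * (np.cos(a) - np.cos(b))  # integral of lam

t1, w1, w2 = 0.0, 0.7, 1.9
p2 = lam(w1) * lam(w2) * np.exp(-Lam(t1, w2))   # joint density, n = 2
p1 = lam(w1) * np.exp(-Lam(t1, w1))             # joint density, n = 1

lhs = p2 / p1
rhs = lam(w2) * np.exp(-Lam(w1, w2))
print(lhs, rhs)  # the two expressions agree
```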
Exploiting the previous property, the duration of the $n^{\mathrm{th}}$ interevent interval $\tau_n = w_n - w_{n-1}$ does not depend on the lengths of previous (or future) intervals. Consequently, the sequence of interevent intervals forms a "white" sequence. The sequence may not be identically distributed unless the process is stationary. In the stationary case, interevent intervals are truly white, forming an IID sequence, and have an exponential distribution $p_{\tau_n}(\tau) = \lambda_0 e^{-\lambda_0\tau}$, $\tau\ge 0$.
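This whiteness can be checked without assuming it: the sketch below (rate and horizon are illustrative assumptions) generates event times by the conditional-uniform property of the stationary Poisson process, given $N_T = n$, the event times are $n$ sorted uniforms on $[0,T)$, so the exponential, uncorrelated interval law emerges as a prediction:

```python
import numpy as np

# Check that stationary-Poisson interevent intervals form an IID
# exponential ("white") sequence.  Event times are built from the
# conditional-uniform property rather than from exponential draws.
rng = np.random.default_rng(4)
lam0, T = 2.0, 50_000.0

n = rng.poisson(lam0 * T)                 # total event count on [0, T)
w = np.sort(rng.uniform(0.0, T, size=n))  # event times
tau = np.diff(w)                          # interevent intervals

print(tau.mean())         # exponential law predicts 1/lam0 = 0.5
print((tau > 1.0).mean()) # predicts exp(-lam0) = exp(-2)
r = np.corrcoef(tau[:-1], tau[1:])[0, 1]
print(r)                  # whiteness predicts lag-1 correlation near 0
```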
Here, the intensity $\lambda(t)$ equals a sample function drawn from some waveform process. In waveform processes, the analogous concept does not have nearly the impact it does here. Because intensity waveforms must be non-negative, the intensity process must be nonzero-mean and non-Gaussian. We shall assume throughout that the intensity process is stationary for simplicity. This model arises in those situations in which the event occurrence rate clearly varies unpredictably with time. Such processes have the property that the variance-to-mean ratio of the number of events in any interval exceeds one. In the process of deriving this last property, we illustrate the typical way of analyzing doubly stochastic processes: condition on the intensity equaling a particular sample function, use the statistical characteristics of nonstationary Poisson processes, then "average" with respect to the intensity process. To calculate the expected number of events $E\!\left[N_{t_1,t_2}\right]$ in an interval, we use conditional expected values: $$E\!\left[N_{t_1,t_2}\right] = E\!\left[E\!\left[N_{t_1,t_2} \mid \{\lambda(t), t_1\le t< t_2\}\right]\right] = E\!\left[\Lambda_{t_1}^{t_2}\right] = (t_2-t_1)\,E[\lambda(t)]$$
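This condition-then-average recipe can be sketched in a few lines. As an assumed example intensity process (not from the text), each trial draws a constant rate $A$ from a gamma distribution; iterated expectation gives $E[N]=E[A]T$ and $\operatorname{Var}[N]=E[A]T+\operatorname{Var}[A]T^2$, so the variance-to-mean ratio exceeds one:

```python
import numpy as np

# Doubly stochastic sketch: condition on the intensity, apply Poisson
# statistics, then average over the intensity.  With a gamma-distributed
# constant rate A per trial and T = 1,
#   E[N] = E[A] = 2  and  Var[N]/E[N] = 1 + Var[A]/E[A] = 2.
rng = np.random.default_rng(5)
shape, scale, T, trials = 2.0, 1.0, 1.0, 200_000

A = rng.gamma(shape, scale, size=trials)  # random per-trial intensity
N = rng.poisson(A * T)                    # Poisson given the intensity

print(N.mean())            # theory: E[A]*T = 2.0
print(N.var() / N.mean())  # theory: 2.0 -- exceeds one
```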
The approach of sample-function conditioning can also be used to derive the density of the number of events occurring in an interval for a doubly stochastic Poisson process. Conditioned on the occurrence of a sample function, the probability of $n$ events occurring in the interval $[t_1, t_2)$ equals $$\Pr\!\left(N_{t_1,t_2}=n \,\middle|\, \lambda(t), t_1\le t< t_2\right) = \frac{\left(\Lambda_{t_1}^{t_2}\right)^{n}}{n!}\,e^{-\Lambda_{t_1}^{t_2}}$$ Because $\Lambda_{t_1}^{t_2}$ is a random variable, the unconditional distribution equals this conditional probability averaged with respect to this random variable's density. This average is known as the Poisson Transform of the random variable's density.
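A worked instance of the Poisson Transform (the exponential mixing density is an assumed illustrative choice): when $\Lambda$ has density $\mu e^{-\mu a}$, the average $\int_0^\infty \frac{a^n}{n!}e^{-a}\,\mu e^{-\mu a}\,da = \frac{\mu}{(1+\mu)^{n+1}}$ is a geometric law; with $\mu=1$, $\Pr(N=n)=2^{-(n+1)}$. The sketch checks this by sampling:

```python
import numpy as np

# Poisson Transform example: exponentially distributed integrated
# intensity Lambda (mean 1/mu) mixed with the conditional Poisson law
# yields a geometric count distribution; mu = 1 gives P(N=n) = 2^-(n+1).
rng = np.random.default_rng(6)
mu, trials = 1.0, 400_000

a = rng.exponential(1.0 / mu, size=trials)  # Lambda samples
N = rng.poisson(a)                          # counts, conditionally Poisson

for n in range(3):
    print((N == n).mean())  # theory: 0.5, 0.25, 0.125
```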