In those cases where a probability density for the parameters cannot be assigned, the model evaluation problem can be solved in several ways; the methods used depend on the form of the likelihood ratio and the way in which the parameter(s) enter the problem. In the Gaussian problem we have discussed so often, the threshold $\eta$ used in the likelihood ratio test may be unity. In this case, examination of the required computations reveals that implementing the test does not require knowledge of the variance of the observations (see this problem). Thus, if the common variance of the underlying Gaussian distributions is not known, this lack of knowledge has no effect on the optimum decision rule. This happy situation, in which knowledge of the value of a parameter is not required by the optimum decision rule, occurs rarely, but should be checked for before using more complicated procedures.
A second fortuitous situation occurs when the sufficient statistic, as well as its probability density under one of the models, does not depend on the unknown parameter(s). Although the sufficient statistic's threshold $\gamma$ expressed in terms of the likelihood ratio's threshold $\eta$ depends on the unknown parameters, $\gamma$ may be computed as a single value using the Neyman-Pearson criterion if the computation of the false-alarm probability does not involve the unknown parameters.
Continuing the example of the previous section, let's consider the situation where the value of the mean of each observation under model $\mathcal{M}_1$ is not known. The sufficient statistic is the sum of the observations (that quantity doesn't depend on $m$) and the distribution of the observation vector under model $\mathcal{M}_0$ does not depend on $m$ (allowing computation of the false-alarm probability). However, a subtlety emerges; in the derivation of the sufficient statistic, we had to divide by the value of the mean. The critical step occurs once the logarithm of the likelihood ratio is manipulated to obtain $$m\sum_{l=0}^{L-1} r_l \mathop{\gtrless}_{\mathcal{M}_0}^{\mathcal{M}_1} \sigma^2\ln\eta + \frac{Lm^2}{2}$$ Recall that only positively monotonic transformations can be applied; if a negatively monotonic operation is applied to this inequality (such as multiplying both sides by $-1$), the inequality reverses. If the sign of $m$ is known, it can be taken into account explicitly and a sufficient statistic results. If, however, the sign is not known, the above expression cannot be manipulated further and the left side constitutes the sufficient statistic for this problem. The sufficient statistic then depends on the unknown parameter and we cannot develop a decision rule in this case. If the sign is known, we can proceed. Assuming the sign of $m$ is positive, the sufficient statistic is the sum of the observations and the threshold $\gamma$ is found from $$\gamma = \sqrt{L}\,\sigma\,Q^{-1}(P_F)$$ Note that if the variance $\sigma^2$ instead of the mean were unknown, we could not compute the threshold. The difficulty lies not with the sufficient statistic (it doesn't depend on the variance), but with the false-alarm probability, as the expression indicates. Another approach is required to deal with the unknown-variance problem.