As a result of the completeness of the real numbers, any fundamental (Cauchy) sequence converges (i.e., has a limit). Such convergence has certain desirable properties. For example, the limit of a linear combination of sequences is that linear combination of the separate limits, and the limit of a product is the product of the limits.
The notion of convergent and fundamental sequences applies to sequences of real-valued functions with a common domain. For each x in the domain, we have a sequence
$\{{f}_{n}\left(x\right):1\le n\}$ of real numbers. The sequence may converge for some x and fail to converge for others.
A somewhat more restrictive condition (and often a more desirable one) for sequences of functions is uniform convergence. Here the uniformity is over values of the argument $x$. In this case, for any $\epsilon>0$ there exists an $N$ which works for all $x$ (or for some suitable prescribed set of $x$).
These concepts may be applied to a sequence of random variables , which are real-valued functions with domain Ω and argument ω . Suppose $\{{X}_{n}:1\le n\}$ is a sequence of real random variables. For each argument ω we have a sequence $\{{X}_{n}\left(\omega \right):1\le n\}$ of real numbers. It is quite possible that such a sequence converges for some ω and diverges (fails to converge) for others. As a matter of fact, in many important cases the sequence converges for all ω except possibly a set (event) of probability zero. In this case, we say the sequence converges almost surely (abbreviated a.s.). The notion of uniform convergence also applies. In probability theory we have the notion of almost uniform convergence. This is the case that the sequence converges uniformly for all ω except for a set of arbitrarily small probability.
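The "one sequence per ω" picture can be made concrete with a small simulation. The sketch below (an illustration assumed here, not taken from the text) generates many independent sequences of fair coin flips; each row plays the role of one ω, and the running sample averages along that row form the sequence $\{{A}_{n}\left(\omega \right)\}$. The strong law says that on almost every such row the averages settle near the mean 0.5.

```python
import numpy as np

# Each row of `averages` is one "tape": the running sample averages
# A_n(omega) for a single outcome omega, generated from fair coin
# flips (values 0 or 1, mean 0.5).
rng = np.random.default_rng(0)
n_tapes, n_steps = 1000, 10000
flips = rng.integers(0, 2, size=(n_tapes, n_steps))
averages = np.cumsum(flips, axis=1) / np.arange(1, n_steps + 1)

# Almost-sure convergence of the sample average says that on almost every
# tape the averages settle near 0.5.  Inspect the last entry of each tape.
frac_close = float(np.mean(np.abs(averages[:, -1] - 0.5) < 0.05))
print(frac_close)
```

With 10,000 flips per tape, the standard deviation of the final average is about 0.005, so essentially every tape ends within 0.05 of the mean.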
The notion of convergence in probability noted above is a quite different kind of convergence. Rather than deal with the sequence on a pointwise basis, it deals with the random variables as such. In the case of the sample average, the “closeness” to a limit is expressed in terms of the probability that the observed value ${X}_{n}\left(\omega \right)$ should lie close to the value $X\left(\omega \right)$ of the limiting random variable. We may state this precisely as follows:
A sequence $\{{X}_{n}:1\le n\}$ converges to X in probability , designated ${X}_{n}\stackrel{P}{\to}X$ , iff for any $\epsilon>0$ ,

$$\lim_{n\to\infty} P\left(\left|{X}_{n}-X\right|>\epsilon\right)=0$$
There is a corresponding notion of a sequence fundamental in probability.
The following schematic representation may help to visualize the difference between almost-sure convergence and convergence in probability. In setting up the basic probability model, we think in terms of “balls” drawn from a jar or box. Instead of balls, consider for each possible outcome ω a “tape” on which there is the sequence of values ${X}_{1}\left(\omega \right),{X}_{2}\left(\omega \right),{X}_{3}\left(\omega \right),\cdots $ .
It is not difficult to construct examples for which there is convergence in probability but pointwise convergence for no ω . It is easy to confuse these two types of convergence. The kind of convergence noted for the sample average is convergence in probability (a “weak” law of large numbers). What is really desired in most cases is a.s. convergence (a “strong” law of large numbers). It turns out that for a sampling process of the kind used in simple statistics, the convergence of the sample average is almost sure (i.e., the strong law holds). To establish this requires much more detailed and sophisticated analysis than we are prepared to make in this treatment.
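One standard construction of such an example (assumed here; it is not developed in the text above) is the "sliding indicator" sequence on $\Omega = [0,1)$ with uniform probability: for ${2}^{k}\le n<{2}^{k+1}$, let ${X}_{n}$ be the indicator of the subinterval $[(n-{2}^{k})/{2}^{k},\,(n-{2}^{k}+1)/{2}^{k})$. The sketch below checks both halves of the claim numerically.

```python
import numpy as np

# Sliding indicator sequence: for 2**k <= n < 2**(k+1), X_n is the
# indicator of the interval [ (n - 2**k)/2**k, (n - 2**k + 1)/2**k ).
def X(n, omega):
    k = int(np.log2(n))
    lo = (n - 2**k) / 2**k
    return 1.0 if lo <= omega < lo + 1 / 2**k else 0.0

# P(X_n != 0) = 2**(-k) -> 0, so X_n -> 0 in probability ...
prob_nonzero = [2.0 ** -int(np.log2(n)) for n in (4, 16, 64, 256)]

# ... yet for ANY fixed omega the tape contains a 1 in every block
# 2**k <= n < 2**(k+1), so the sequence X_n(omega) converges for no omega.
omega = 0.3
ones_per_block = [sum(X(n, omega) for n in range(2**k, 2 ** (k + 1)))
                  for k in range(2, 8)]
print(prob_nonzero)    # shrinks toward 0
print(ones_per_block)  # exactly one 1 in every block
```

The intervals in each block partition $[0,1)$, so every ω is hit exactly once per block: the tape reads 1 infinitely often and 0 infinitely often, even though the probability of a 1 at step n tends to zero.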
The notion of mean convergence illustrated by the reduction of $\mathrm{Var}\left[{A}_{n}\right]$ with increasing n may be expressed more generally and more precisely as follows. A sequence $\{{X}_{n}:1\le n\}$ converges in the mean of order p to X iff

$$E\left[{\left|{X}_{n}-X\right|}^{p}\right]\to 0\quad\text{as}\quad n\to\infty$$
If the order p is one, we simply say the sequence converges in the mean. For $p=2$ , we speak of mean-square convergence .
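A quick Monte Carlo check (an illustration, not a proof) makes the mean-square case tangible: for i.i.d. terms, $E\left[{\left|{A}_{n}-\mu \right|}^{2}\right]=\mathrm{Var}\left[{A}_{n}\right]={\sigma }^{2}/n$, so the estimated mean-square error of the sample average should fall roughly like $1/n$.

```python
import numpy as np

# Estimate E[|A_n - mu|**2] for the sample average of n i.i.d. standard
# normal terms, at several values of n.  Theory gives sigma**2 / n.
rng = np.random.default_rng(1)
mu, sigma, reps = 0.0, 1.0, 5000
ms_error = {}
for n in (10, 100, 1000):
    A_n = rng.normal(mu, sigma, size=(reps, n)).mean(axis=1)
    ms_error[n] = float(np.mean((A_n - mu) ** 2))
print(ms_error)  # close to sigma**2/n: about 0.1, 0.01, 0.001
```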
The introduction of a new type of convergence raises a number of questions.
Before sketching briefly some of the relationships between convergence types, we consider one important condition known as uniform integrability . According to the property (E9b) for integrals, a random variable X is integrable iff

$$E\left[{I}_{\{\left|X\right|>a\}}\left|X\right|\right]\to 0\quad\text{as}\quad a\to\infty$$
Roughly speaking, to be integrable a random variable cannot be too large on too large a set. We use this characterization of the integrability of a single random variable to define the notion of the uniform integrability of a class.
Definition . An arbitrary class $\{{X}_{t}:t\in T\}$ is uniformly integrable (abbreviated u.i.) with respect to probability measure P iff

$$\sup_{t\in T} E\left[{I}_{\{\left|{X}_{t}\right|>a\}}\left|{X}_{t}\right|\right]\to 0\quad\text{as}\quad a\to\infty$$
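A hypothetical family (constructed here for illustration; it does not appear in the text) shows how uniform integrability can fail even when every member is integrable: let ${X}_{t}$ take the value t with probability $1/t$ and 0 otherwise, for $t=2,3,\cdots$. Each $E\left[\left|{X}_{t}\right|\right]=1$, but the tail expectation stays at 1 for arbitrarily large cutoffs.

```python
# X_t = t with probability 1/t, else 0.  Each member is integrable with
# E[|X_t|] = 1, but for any cutoff a the tail expectation
# E[ I_{|X_t| > a} |X_t| ] equals t * (1/t) = 1 whenever t > a.
def tail_expectation(t, a):
    # exact computation from the two-point distribution of X_t
    return t * (1.0 / t) if t > a else 0.0

for a in (10, 100, 1000):
    sup_tail = max(tail_expectation(t, a) for t in range(2, 5000))
    print(a, sup_tail)  # the supremum stays at 1.0; it never shrinks with a
```

By contrast, any family uniformly bounded by a constant M is u.i., since the tail expectation is zero for every cutoff $a>M$.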
This condition plays a key role in many aspects of theoretical probability.
The relationships between types of convergence are important. Sometimes only one kind can be established. Also, it may be easier to establish one type which implies another of more immediate interest. We simply state informally some of the important relationships. A somewhat more detailed summary is given in PA, Chapter 17. But for a complete treatment it is necessary to consult more advanced treatments of probability and measure.
Relationships between types of convergence for probability measures
Consider a sequence $\{{X}_{n}:1\le n\}$ of random variables.
Various chains of implication can be traced. For example: almost-sure convergence implies convergence in probability, which in turn implies convergence in distribution; convergence in the mean of order p implies convergence in probability; and convergence in probability, together with uniform integrability, implies convergence in the mean.
We do not develop the underlying theory. While much of it could be treated with elementary ideas, a complete treatment requires considerable development of the underlying measure theory. However, it is important to be aware of these various types of convergence, since they are frequently utilized in advanced treatments of applied probability and of statistics.