Kolmogorov's three-series theorem
In probability theory, Kolmogorov's three-series theorem, named after Andrey Kolmogorov, gives a criterion for the almost sure convergence of an infinite series of random variables in terms of the convergence of three different series involving properties of their probability distributions. Kolmogorov's three-series theorem, combined with Kronecker's lemma, can be used to give a relatively easy proof of the strong law of large numbers.[1]
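As a brief illustration of that remark, here is a sketch (under the simplifying assumption, not needed for the general result, that the variables are i.i.d. with finite variance) of how the two facts combine:

```latex
% Sketch: assume X_1, X_2, \ldots are i.i.d. with mean \mu and finite
% variance \sigma^2 (a simplifying assumption). Set W_n := (X_n - \mu)/n.
% Then
\[
  \mathbb{E}(W_n) = 0,
  \qquad
  \sum_{n=1}^{\infty} \operatorname{Var}(W_n)
    = \sigma^2 \sum_{n=1}^{\infty} \frac{1}{n^2} < \infty,
\]
% so the three conditions of the theorem are easily verified (Chebyshev's
% inequality handles condition (i)), and \sum_n W_n converges almost
% surely. Kronecker's lemma, applied with b_n = n, then yields
\[
  \frac{1}{N} \sum_{n=1}^{N} (X_n - \mu) \;\longrightarrow\; 0
  \quad \text{almost surely},
\]
% which is the strong law of large numbers.
```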
Statement of the theorem
Let $(X_n)_{n \in \mathbb{N}}$ be independent random variables. The random series $\sum_{n=1}^{\infty} X_n$ converges almost surely in $\mathbb{R}$ if and only if the following conditions hold for some $A > 0$:
i. $\sum_{n=1}^{\infty} \mathbb{P}(|X_n| \geq A)$ converges;
ii. letting $Y_n := X_n \mathbf{1}_{\{|X_n| \leq A\}}$, the series of expected values $\sum_{n=1}^{\infty} \mathbb{E}(Y_n)$ converges;
iii. $\sum_{n=1}^{\infty} \operatorname{Var}(Y_n)$ converges.
Proof
Sufficiency of conditions ("if")
Condition (i) and the Borel–Cantelli lemma give that almost surely $X_n = Y_n$ for $n$ large, hence $\sum_{n=1}^{\infty} X_n$ converges if and only if $\sum_{n=1}^{\infty} Y_n$ converges. Conditions (ii)–(iii) and Kolmogorov's two-series theorem give the almost sure convergence of $\sum_{n=1}^{\infty} Y_n$.
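Spelled out, the two steps are as follows (with the two-series theorem quoted in its standard form):

```latex
% Step 1 (Borel--Cantelli): X_n and Y_n differ only on \{|X_n| > A\}, so
\[
  \sum_{n=1}^{\infty} \mathbb{P}(X_n \neq Y_n)
    \;\leq\; \sum_{n=1}^{\infty} \mathbb{P}(|X_n| \geq A) \;<\; \infty,
\]
% and almost surely X_n = Y_n for all but finitely many n; the two series
% \sum_n X_n and \sum_n Y_n therefore converge or diverge together.
%
% Step 2 (Kolmogorov's two-series theorem): for independent Y_n, if
\[
  \sum_{n=1}^{\infty} \mathbb{E}(Y_n) \ \text{converges}
  \qquad\text{and}\qquad
  \sum_{n=1}^{\infty} \operatorname{Var}(Y_n) \;<\; \infty,
\]
% then \sum_{n=1}^{\infty} Y_n converges almost surely.
```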
Necessity of conditions ("only if")
Suppose that $\sum_{n=1}^{\infty} X_n$ converges almost surely.
If condition (i) failed for some $A > 0$, then by the second Borel–Cantelli lemma (which applies because the $X_n$ are independent) we would have $|X_n| \geq A$ for infinitely many $n$, almost surely; the terms would then fail to tend to $0$, and the series would diverge. Therefore, we must have condition (i), and indeed it holds for every $A > 0$.
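In symbols, the Borel–Cantelli step reads:

```latex
% Second Borel--Cantelli lemma, using the independence of the X_n:
\[
  \sum_{n=1}^{\infty} \mathbb{P}(|X_n| \geq A) = \infty
  \quad\Longrightarrow\quad
  \mathbb{P}\bigl(|X_n| \geq A \ \text{for infinitely many } n\bigr) = 1,
\]
% in which case X_n does not tend to 0 and \sum_n X_n cannot converge.
```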
We next see that condition (iii) implies condition (ii). Take $A = 1$. Given condition (iii), Kolmogorov's two-series theorem applied to the centered variables $Y_n - \mathbb{E}(Y_n)$ gives the almost sure convergence of $\sum_{n}(Y_n - \mathbb{E}(Y_n))$; and by condition (i), $\sum_{n=1}^{\infty} Y_n$ converges almost surely along with $\sum_{n=1}^{\infty} X_n$. Subtracting the two series shows that $\sum_{n=1}^{\infty} \mathbb{E}(Y_n)$ converges, so condition (ii) is implied.
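The subtraction step is elementary:

```latex
% Both series on the right-hand side converge almost surely, so the
% (deterministic) series on the left converges as well:
\[
  \sum_{n=1}^{N} \mathbb{E}(Y_n)
    \;=\; \sum_{n=1}^{N} Y_n \;-\; \sum_{n=1}^{N} \bigl(Y_n - \mathbb{E}(Y_n)\bigr),
  \qquad N \to \infty.
\]
```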
Thus, it only remains to demonstrate the necessity of condition (iii), and we will have obtained the full result. It is equivalent to check condition (iii) for the symmetrized series $\sum_{n=1}^{\infty} Z_n = \sum_{n=1}^{\infty}(Y_n - Y'_n)$, where for each $n$ the variable $Y'_n$ is an independent copy of $Y_n$; this lets us employ the assumption $\mathbb{E}(Z_n) = 0$, since $(Z_n)$ is a sequence of random variables bounded by $2$, converging almost surely, and with $\operatorname{Var}(Z_n) = 2\operatorname{Var}(Y_n)$. So we wish to check that if $\sum_{n=1}^{\infty} Z_n$ converges, then $\sum_{n=1}^{\infty} \operatorname{Var}(Z_n)$ converges as well. This is a special case of a more general result from martingale theory, with summands equal to the increments of a martingale sequence satisfying the same conditions ($\mathbb{E}(Z_n) = 0$, summands bounded, and the series converging almost surely); the conclusion is that the series of the variances converges.[2][3][4]
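The elementary facts used in this symmetrization can be checked directly:

```latex
% With Y'_n an independent copy of Y_n and Z_n := Y_n - Y'_n (here A = 1,
% so |Y_n| \leq 1):
\[
  \mathbb{E}(Z_n) = \mathbb{E}(Y_n) - \mathbb{E}(Y'_n) = 0,
  \qquad
  |Z_n| \leq |Y_n| + |Y'_n| \leq 2,
\]
\[
  \operatorname{Var}(Z_n)
    = \operatorname{Var}(Y_n) + \operatorname{Var}(Y'_n)
    = 2\operatorname{Var}(Y_n),
\]
% using the independence of Y_n and Y'_n. Moreover \sum_n Z_n converges
% almost surely, being the difference of two almost surely convergent
% series, and convergence of \sum_n \operatorname{Var}(Z_n) is equivalent
% to that of \sum_n \operatorname{Var}(Y_n).
```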
Example
As an illustration of the theorem, consider the example of the harmonic series with random signs:

\[ \sum_{n=1}^{\infty} \pm \frac{1}{n}. \]

Here, "$\pm$" means that each term $\frac{1}{n}$ is taken with a random sign that is either $+1$ or $-1$ with respective probabilities $\tfrac{1}{2}, \tfrac{1}{2}$, and all random signs are chosen independently. Letting $X_n$ in the theorem denote a random variable that takes the values $\frac{1}{n}$ and $-\frac{1}{n}$ with equal probabilities, one can check easily that the conditions of the theorem are satisfied with $A = 1$: the probabilities $\mathbb{P}(|X_n| \geq 1)$ vanish for $n \geq 2$, the expected values $\mathbb{E}(Y_n)$ are all $0$, and the variances $\operatorname{Var}(Y_n) = \frac{1}{n^2}$ have a convergent sum. It follows that the harmonic series with random signs converges almost surely. On the other hand, the analogous series of (for example) square root reciprocals with random signs, namely

\[ \sum_{n=1}^{\infty} \pm \frac{1}{\sqrt{n}}, \]

diverges almost surely, since condition (iii) in the theorem is not satisfied: here $\operatorname{Var}(Y_n) = \frac{1}{n}$, and $\sum_{n=1}^{\infty} \frac{1}{n}$ diverges. Note that this is different from the behavior of the analogous series with alternating signs, $\sum_{n=1}^{\infty} \frac{(-1)^n}{\sqrt{n}}$, which does converge (by the alternating series test).
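Though only the theorem itself decides convergence, a short simulation can make the contrast visible. The following sketch (Python with NumPy; the number of sample paths, the checkpoints, and the seed are arbitrary choices made for this illustration) estimates the spread of the partial sums across many independent sample paths: for $\sum \pm \frac{1}{n}$ it stabilizes near $\sqrt{\pi^2/6} \approx 1.28$, while for $\sum \pm \frac{1}{\sqrt{n}}$ it keeps growing like $\sqrt{\ln N}$, mirroring condition (iii):

```python
import numpy as np

# Monte Carlo sketch (an illustration, not a proof): track the standard
# deviation, across independent sample paths, of the partial sums of the
# random-sign series sum(+-1/n) and sum(+-1/sqrt(n)).
rng = np.random.default_rng(seed=42)
num_paths = 1000
chunk = 10_000
checkpoints = [10**2, 10**3, 10**4, 10**5]

sum_h = np.zeros(num_paths)  # partial sums of +-1/n, one per path
sum_s = np.zeros(num_paths)  # partial sums of +-1/sqrt(n), one per path

start = 1
for N in checkpoints:
    # Extend every path from the previous checkpoint up to N, in chunks
    # to keep memory bounded.
    while start <= N:
        stop = min(start + chunk - 1, N)
        n = np.arange(start, stop + 1, dtype=float)
        # Independent fair signs for each path and each term.
        signs = rng.choice([-1.0, 1.0], size=(num_paths, n.size))
        sum_h += signs @ (1.0 / n)
        sum_s += signs @ (1.0 / np.sqrt(n))
        start = stop + 1
    print(f"N = {N:>7}: std of sum(+-1/n) = {sum_h.std():.3f}, "
          f"std of sum(+-1/sqrt(n)) = {sum_s.std():.3f}")

# Expected pattern: the first column stabilizes near 1.28, the square
# root of sum 1/n^2 = pi^2/6; the second grows like sqrt(H_N) ~ sqrt(ln N),
# reflecting the divergent variance series in condition (iii).
```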
Notes
- ↑ Durrett, Rick. Probability: Theory and Examples. Duxbury Advanced Series, 3rd ed., Thomson Brooks/Cole, 2005, Section 1.8, pp. 60–69.
- ↑ Sun, Rongfeng. Lecture notes. http://www.math.nus.edu.sg/~matsr/ProbI/Lecture4.pdf
- ↑ Loève, M. Probability Theory. Princeton Univ. Press, 1963, Sect. 16.3.
- ↑ Feller, W. An Introduction to Probability Theory and Its Applications, Vol. 2. Wiley, 1971, Sect. IX.9.