Theorem. Let $p, q, r > 0$. If $X \sim \operatorname{Beta}(p,q)$ and $Y \sim \operatorname{Beta}(p+q,r)$ are independent, then $XY \sim \operatorname{Beta}(p, q+r)$.
Proof. Let $T_p, T_q, T_r$ be independent random variables such that $T_\alpha \sim \operatorname{Gamma}(\alpha)$ for each $\alpha \in \{p, q, r\}$. Then it is well known that
$$ \frac{T_p}{T_p + T_q} \sim \operatorname{Beta}(p, q) \qquad\text{and}\qquad T_p + T_q \sim \operatorname{Gamma}(p+q), $$
and moreover these two random variables are independent. Since $T_r$ is independent of both, the ratio $\frac{T_p}{T_p+T_q}$ is also independent of $\frac{T_p+T_q}{T_p+T_q+T_r}$, so we may realize the joint distribution of $X$ and $Y$ via
$$ (X, Y) \stackrel{d}{=} \left( \frac{T_p}{T_p + T_q}, \ \frac{T_p + T_q}{T_p + T_q + T_r} \right). $$
Therefore, since $T_q + T_r \sim \operatorname{Gamma}(q+r)$ is independent of $T_p$, we conclude
$$ XY \stackrel{d}{=} \frac{T_p}{T_p + T_q + T_r} \sim \operatorname{Beta}(p, q+r) $$
as desired. ◻
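As a quick numerical sanity check (not part of the proof), one can compare the empirical distribution of $XY$ with $\operatorname{Beta}(p, q+r)$ by simulation. The following Python sketch, with arbitrarily chosen parameters, uses NumPy and SciPy for the sampling and a Kolmogorov–Smirnov comparison.

import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
p, q, r = 2.0, 3.0, 1.5          # arbitrary positive parameters
n = 200_000                      # number of Monte Carlo samples

x = rng.beta(p, q, size=n)       # X ~ Beta(p, q)
y = rng.beta(p + q, r, size=n)   # Y ~ Beta(p + q, r), independent of X
prod = x * y                     # samples of XY

# Compare the empirical distribution of XY with Beta(p, q + r);
# a small KS statistic / large p-value supports the theorem.
print(stats.kstest(prod, stats.beta(p, q + r).cdf))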
Corollary. Let $p, q > 0$. If $(X_n)_{n \ge 0}$ is a sequence of i.i.d. $\operatorname{Beta}(p,q)$ variables, then
$$ \sum_{n=0}^{\infty} (-1)^n X_0 X_1 \cdots X_n \sim \operatorname{Beta}(p, p+q). $$
This is Problem 6524 of the American Mathematical Monthly, Vol. 93, No. 7, Aug.–Sep. 1986.
Proof. Let $S$ denote the sum. Since $0 < X_n < 1$ almost surely, the terms $X_0 X_1 \cdots X_n$ decrease to $0$, so by the alternating series estimation theorem $0 \le S \le 1$ holds almost surely. Moreover, if $X \sim \operatorname{Beta}(p,q)$ is independent of $(X_n)_{n \ge 0}$, then
$$ S = X_0 \left( 1 - \sum_{n=0}^{\infty} (-1)^n X_1 \cdots X_{n+1} \right) \stackrel{d}{=} X(1 - S), $$
where the last identity holds because the inner sum has the same distribution as $S$ and is independent of $X_0$. Noting that this equation uniquely determines all the moments of $S$, we find that it has a unique distributional solution among $[0,1]$-valued random variables by the uniqueness of the Hausdorff moment problem.
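To make the moment claim explicit, here is one way the recursion can be written out; it uses only the distributional equation above and the independence of $X$ and $S$ on the right-hand side. Taking $n$-th moments gives
$$ \mathbb{E}[S^n] = \mathbb{E}[X^n]\, \mathbb{E}[(1-S)^n] = \mathbb{E}[X^n] \sum_{k=0}^{n} \binom{n}{k} (-1)^k\, \mathbb{E}[S^k], $$
and since $0 < \mathbb{E}[X^n] < 1$, the coefficient $1 - (-1)^n \mathbb{E}[X^n]$ of $\mathbb{E}[S^n]$ is nonzero, so $\mathbb{E}[S^n]$ is determined by $\mathbb{E}[S^0], \dots, \mathbb{E}[S^{n-1}]$.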
Moreover, if $S' \sim \operatorname{Beta}(p, p+q)$ is independent of $X$, then $1 - S' \sim \operatorname{Beta}(p+q, p)$. So by the theorem (applied with $r = p$), $X(1 - S') \sim \operatorname{Beta}(p, p+q)$, i.e. $X(1 - S')$ has the same distribution as $S'$. This shows that $S'$ is a distributional solution of the equation, hence by the uniqueness, $S \stackrel{d}{=} S'$. ◻
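The corollary can likewise be checked by simulation; the sketch below (again with arbitrary choices of $p$, $q$, sample size, and truncation depth) truncates the alternating series after enough terms and compares the result with $\operatorname{Beta}(p, p+q)$.

import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
p, q = 2.0, 3.0                 # arbitrary positive parameters
n, depth = 100_000, 60          # samples and truncation depth of the series

xs = rng.beta(p, q, size=(depth, n))          # rows are X_0, X_1, ..., X_{depth-1}
partial_products = np.cumprod(xs, axis=0)     # X_0, X_0 X_1, X_0 X_1 X_2, ...
signs = (-1.0) ** np.arange(depth)[:, None]   # alternating signs +1, -1, +1, ...
s = np.sum(signs * partial_products, axis=0)  # truncated alternating sum

# Compare the truncated sum with Beta(p, p + q).
print(stats.kstest(s, stats.beta(p, p + q).cdf))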