Tuesday, July 26, 2022

Product of two beta distributions

Theorem. Let $p, q, r > 0$. If $X \sim \operatorname{Beta}(p, q)$ and $Y \sim \operatorname{Beta}(p+q, r)$, then $$ XY \sim \operatorname{Beta}(p, q+r). $$

Proof. Let $T_p, T_q, T_r$ be independent random variables such that $T_{\alpha} \sim \operatorname{Gamma}(\alpha)$ for each $\alpha \in \{p, q, r\}$. Then it is well known that $$ \frac{T_p}{T_p + T_q} \sim \operatorname{Beta}(p, q) \quad\text{and}\quad T_p + T_q \sim \operatorname{Gamma}(p+q), $$ and moreover that these two random variables are independent. Applying this fact twice, the pair $$ \left( \frac{T_p}{T_p + T_q}, \frac{T_p + T_q}{T_p + T_q + T_r} \right) $$ has independent coordinates with marginals $\operatorname{Beta}(p, q)$ and $\operatorname{Beta}(p+q, r)$: the second marginal follows by applying the fact to $T_p + T_q \sim \operatorname{Gamma}(p+q)$ and $T_r$, and independence holds because the first coordinate is independent of the pair $(T_p + T_q, T_r)$. So we may realize the joint distribution of $X$ and $Y$ via $$ (X, Y) \stackrel{\text{d}}= \left( \frac{T_p}{T_p + T_q}, \frac{T_p + T_q}{T_p + T_q + T_r} \right). $$ Therefore we conclude $$ XY \stackrel{\text{d}}= \frac{T_p}{T_p + (T_q + T_r)} \sim \operatorname{Beta}(p, q+r), $$ where the last step uses the fact once more, now with $T_p$ and $T_q + T_r \sim \operatorname{Gamma}(q+r)$. $\square$
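As a quick sanity check, here is a minimal Monte Carlo sketch in Python (assuming NumPy and SciPy; the parameters $p = 2$, $q = 3$, $r = 1.5$ are arbitrary choices for illustration). It tests both the product $XY$ sampled directly and the gamma-ratio construction from the proof against $\operatorname{Beta}(p, q+r)$ with a Kolmogorov–Smirnov test:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Illustrative parameters; any p, q, r > 0 should work.
p, q, r = 2.0, 3.0, 1.5
n = 100_000

# Direct sampling: X ~ Beta(p, q) and Y ~ Beta(p+q, r), independent.
x = rng.beta(p, q, n)
y = rng.beta(p + q, r, n)
print(stats.kstest(x * y, stats.beta(p, q + r).cdf))

# The gamma construction from the proof gives the same law for XY.
tp, tq, tr = rng.gamma(p, size=n), rng.gamma(q, size=n), rng.gamma(r, size=n)
print(stats.kstest(tp / (tp + tq + tr), stats.beta(p, q + r).cdf))
```

Large $p$-values from both tests are consistent with the claimed $\operatorname{Beta}(p, q+r)$ law.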

Corollary. Let $p, q > 0$. If $(X_n)_{n\geq 0}$ is a sequence of i.i.d. $\operatorname{Beta}(p, q)$ variables, then $$ \sum_{n=0}^{\infty} (-1)^n X_0 X_1 \cdots X_n \sim \operatorname{Beta}(p, p+q). $$ This is Problem 6524 of the American Mathematical Monthly, Vol. 93, No. 7, Aug.–Sep. 1986.

Proof. Let $S$ denote the sum. Since each $X_k \in [0, 1]$, the partial products $X_0 X_1 \cdots X_n$ are non-increasing in $n$, and they tend to $0$ almost surely because $\log(X_0 \cdots X_n)$ is a sum of i.i.d. terms with $\mathbb{E}[\log X_0] < 0$. Hence the alternating series converges almost surely, and by the alternating series estimation theorem, $0 \leq S \leq X_0 \leq 1$ holds almost surely. Moreover, if $X \sim \operatorname{Beta}(p, q)$ is independent of $(X_n)_{n\geq 0}$, then, noting that $\sum_{n=0}^{\infty} (-1)^n X_1 \cdots X_{n+1}$ has the same distribution as $S$ and is independent of $X_0$, $$ S = X_0 \left(1 - \sum_{n=0}^{\infty} (-1)^n X_1 \cdots X_{n+1} \right) \stackrel{\text{d}}= X (1 - S). $$ Taking $n$-th moments and expanding $(1-S)^n$ gives $$ \mathbb{E}[S^n] = \mathbb{E}[X^n] \sum_{k=0}^{n} \binom{n}{k} (-1)^k \mathbb{E}[S^k], $$ and since $0 < \mathbb{E}[X^n] < 1$, this determines $\mathbb{E}[S^n]$ recursively from the lower moments. Because any two solutions supported on $[0, 1]$ must then share all moments, the uniqueness of the Hausdorff moment problem shows that the equation $S \stackrel{\text{d}}= X(1-S)$ has at most one distributional solution supported on $[0, 1]$.
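To see the recursion in action, here is a short numerical sketch (same assumptions and illustrative parameters $p = 2$, $q = 3$ as above) that solves the moment recursion and compares the result with the moments of $\operatorname{Beta}(p, p+q)$, anticipating the next step of the proof:

```python
import numpy as np
from scipy.special import comb

p, q = 2.0, 3.0  # illustrative parameters

def beta_moment(a, b, n):
    # E[B^n] for B ~ Beta(a, b) equals prod_{k=0}^{n-1} (a+k)/(a+b+k).
    ks = np.arange(n)
    return np.prod((a + ks) / (a + b + ks))

N = 8
mS = [1.0]  # E[S^0] = 1
for n in range(1, N + 1):
    mX = beta_moment(p, q, n)  # E[X^n]
    # Isolate E[S^n] in E[S^n] = E[X^n] * sum_{k<=n} C(n,k)(-1)^k E[S^k]:
    rhs = mX * sum(comb(n, k) * (-1) ** k * mS[k] for k in range(n))
    mS.append(rhs / (1 - (-1) ** n * mX))

for n in range(1, N + 1):
    print(n, mS[n], beta_moment(p, p + q, n))  # the two columns should agree
```

With $p = 2$, $q = 3$, the recursion reproduces, e.g., $\mathbb{E}[S] = 2/7$, the mean of $\operatorname{Beta}(2, 5)$.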

Moreover, if $S' \sim \operatorname{Beta}(p, p+q)$ is independent of $X$, then $1-S' \sim \operatorname{Beta}(p+q, p)$. So by the theorem, applied with $r = p$, $$ X(1-S') \sim \operatorname{Beta}(p, q+p) \stackrel{\text{d}}= S'. $$ This shows that $S'$ is a distributional solution of the equation supported on $[0, 1]$, hence by the uniqueness, $S \stackrel{\text{d}}= S'$. $\square$
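Finally, the corollary itself can be checked by simulating the alternating series directly. Here is a minimal sketch (same assumptions and illustrative parameters as above); the series is truncated at 60 terms, which is far more than needed since the terms decay geometrically:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

p, q = 2.0, 3.0                    # illustrative parameters
n_samples, n_terms = 100_000, 60   # truncation point for the series

# Partial products X_0, X_0 X_1, X_0 X_1 X_2, ... along axis 0.
x = rng.beta(p, q, size=(n_terms, n_samples))
prods = np.cumprod(x, axis=0)

# Alternating sum sum_n (-1)^n X_0 ... X_n, truncated at n_terms.
signs = (-1.0) ** np.arange(n_terms)[:, None]
s = np.sum(signs * prods, axis=0)

# Compare the empirical law of S with Beta(p, p+q).
print(stats.kstest(s, stats.beta(p, p + q).cdf))
```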
