Chapter 2: Problem 6
Let \(X_{1}, X_{2}, \ldots\) be i.i.d. with \(E X_{i}=0\) and \(E X_{i}^{2}=\sigma^{2} \in(0, \infty)\). Then $$ \sum_{m=1}^{n} X_{m} /\left(\sum_{m=1}^{n} X_{m}^{2}\right)^{1 / 2} \Rightarrow \chi, $$ where \(\chi\) has the standard normal distribution.
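A quick numerical sanity check of the claim (a sketch, not part of the exercise): the choice of distribution below, uniform on \([-1, 1]\) with \(\sigma^{2}=1/3\), is an assumption for illustration. If the self-normalized sum converges to a standard normal, its sample mean and variance should be near 0 and 1 regardless of \(\sigma^{2}\).

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate T_n = S_n / (sum of X_m^2)^{1/2} many times,
# with X_i i.i.d. uniform on [-1, 1] (mean 0, variance 1/3).
n, reps = 10_000, 5_000
X = rng.uniform(-1.0, 1.0, size=(reps, n))
T = X.sum(axis=1) / np.sqrt((X ** 2).sum(axis=1))

# For a standard normal limit, these should be close to 0 and 1;
# note no sigma appears in the normalization.
print(T.mean(), T.var())
```

Note that the normalizer \((\sum X_{m}^{2})^{1/2} \approx \sigma\sqrt{n}\) by the law of large numbers, which is why the limit is standard normal rather than \(N(0,\sigma^{2})\).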
Related exercises:
Suppose \(Y_{n} \geq 0, E Y_{n}^{\alpha} \rightarrow 1\) and \(E Y_{n}^{\beta} \rightarrow 1\) for some \(0<\alpha<\beta\). Show that \(Y_{n} \rightarrow 1\) in probability.
Show that \(\exp \left(-|t|^{\alpha}\right)\) is a characteristic function for \(0<\alpha \leq 1\).
Show that the distribution of a bounded r.v. \(Z\) is infinitely divisible if and only if \(Z\) is constant. Hint: Show \(\operatorname{var}(Z)=0\).
A distribution \(F\) is said to have a density \(f\) if $$ F\left(x_{1}, \ldots, x_{k}\right)=\int_{-\infty}^{x_{1}} \cdots \int_{-\infty}^{x_{k}} f(y) d y_{k} \ldots d y_{1} $$ Show that if \(f\) is continuous, \(\partial^{k} F / \partial x_{1} \ldots \partial x_{k}=f\).
Suppose \(P\left(X_{i}=1\right)=P\left(X_{i}=-1\right)=1 / 2\). Show that if \(a \in(0,1)\) $$ \frac{1}{2 n} \log P\left(S_{2 n} \geq 2 n a\right) \rightarrow-\gamma(a) $$ where \(\gamma(a)=\frac{1}{2}\{(1+a) \log (1+a)+(1-a) \log (1-a)\}\).
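The rate \(\gamma(a)\) in the last exercise can be checked exactly, since \(S_{2n}=2k-2n\) when \(k\) of the \(2n\) steps equal \(+1\), so \(P(S_{2n} \geq 2na)\) is a binomial tail. A sketch (the test value \(a=0.5\) is an arbitrary choice):

```python
import math

def empirical_rate(n, a):
    # P(S_{2n} >= 2na) = sum_{k >= n(1+a)} C(2n, k) / 2^{2n},
    # computed exactly with Python integers, then -(1/2n) log of it.
    kmin = math.ceil(n * (1 + a))
    tail = sum(math.comb(2 * n, k) for k in range(kmin, 2 * n + 1))
    return -(math.log(tail) - 2 * n * math.log(2)) / (2 * n)

def gamma(a):
    return 0.5 * ((1 + a) * math.log(1 + a) + (1 - a) * math.log(1 - a))

# The empirical rate should approach gamma(a) as n grows.
for n in (100, 1000):
    print(n, empirical_rate(n, 0.5), gamma(0.5))
```

The convergence is slow (the error decays roughly like \(\log n / n\), from the polynomial prefactor in Stirling's formula), which is consistent with the \(\frac{1}{2n}\log\) scaling in the statement.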