Chapter 2: Problem 1
Show that \(L(t)=\log t\) is slowly varying but \(t^{\epsilon}\) is not if \(\epsilon \neq 0\).
Short Answer

For every fixed \(c > 0\), \(\log(ct)/\log t = 1 + \log c/\log t \rightarrow 1\) as \(t \rightarrow \infty\), so \(\log t\) is slowly varying. In contrast, \((ct)^{\epsilon}/t^{\epsilon} = c^{\epsilon}\), which differs from 1 for every \(c \neq 1\) when \(\epsilon \neq 0\), so \(t^{\epsilon}\) is not.
Step by step solution
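A direct verification, sketched under the usual definition of slow variation (assumed here, since it is not restated on this page): a positive measurable function \(L\) is slowly varying if \(L(ct)/L(t) \rightarrow 1\) as \(t \rightarrow \infty\) for every fixed \(c > 0\).

Step 1: \(L(t) = \log t\) is slowly varying. For any fixed \(c > 0\),
\[
\frac{L(ct)}{L(t)} = \frac{\log(ct)}{\log t} = \frac{\log c + \log t}{\log t} = 1 + \frac{\log c}{\log t} \longrightarrow 1 \quad \text{as } t \rightarrow \infty,
\]
because \(\log c\) is a fixed constant while \(\log t \rightarrow \infty\).

Step 2: \(t^{\epsilon}\) is not slowly varying when \(\epsilon \neq 0\). Here the ratio does not even depend on \(t\):
\[
\frac{(ct)^{\epsilon}}{t^{\epsilon}} = c^{\epsilon},
\]
and for \(\epsilon \neq 0\) we have \(c^{\epsilon} \neq 1\) whenever \(c \neq 1\) (for instance \(c = e\) gives \(e^{\epsilon} \neq 1\)), so the defining limit fails.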
Key Concepts

Slowly varying function: a positive measurable function \(L\) on \((0, \infty)\) is slowly varying (at infinity) if \(L(ct)/L(t) \rightarrow 1\) as \(t \rightarrow \infty\) for every fixed \(c > 0\).
Related Problems

Let \(X_1, X_2, \ldots\) be i.i.d. with \(EX_i = 0\) and \(0 < \operatorname{var}(X_i) < \infty\), and let \(S_n = X_1 + \cdots + X_n\). (a) Use the central limit theorem and Kolmogorov's zero-one law to conclude that \(\limsup_n S_n/\sqrt{n} = \infty\) a.s. (b) Use an argument by contradiction to show that \(S_n/\sqrt{n}\) does not converge in probability. Hint: consider \(n = m!\).
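One standard route, sketched under the stated hypotheses (write \(\sigma^2 = \operatorname{var}(X_i)\) and let \(\chi\) denote a standard normal):

(a) For any \(K < \infty\), the CLT gives \(P(S_n/\sqrt{n} \geq K) \rightarrow P(\sigma\chi \geq K) > 0\), so by reverse Fatou
\[
P\Big(\limsup_n S_n/\sqrt{n} \geq K\Big) \geq \limsup_n P\big(S_n/\sqrt{n} \geq K\big) > 0.
\]
Changing finitely many \(X_i\) changes \(S_n/\sqrt{n}\) by \(O(1/\sqrt{n}) \rightarrow 0\), so \(\{\limsup_n S_n/\sqrt{n} \geq K\}\) is a tail event, and Kolmogorov's zero-one law upgrades its probability to 1. Intersecting over \(K = 1, 2, \ldots\) gives \(\limsup_n S_n/\sqrt{n} = \infty\) a.s.

(b) Suppose \(S_n/\sqrt{n} \rightarrow Z\) in probability and set \(n_m = m!\), so \(n_m/n_{m+1} \rightarrow 0\). Then
\[
\frac{S_{n_{m+1}} - S_{n_m}}{\sqrt{n_{m+1}}} = \frac{S_{n_{m+1}}}{\sqrt{n_{m+1}}} - \sqrt{\frac{n_m}{n_{m+1}}}\,\frac{S_{n_m}}{\sqrt{n_m}} \rightarrow Z \quad \text{in probability}.
\]
The left side is measurable with respect to \(\sigma(X_i : i > n_m)\), so (passing to an a.s. convergent subsequence) \(Z\) agrees a.s. with a tail-measurable random variable and is therefore a.s. constant by the zero-one law. But the CLT gives \(S_n/\sqrt{n} \Rightarrow \sigma\chi\), which is nondegenerate: contradiction.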
Let \(X_1, X_2, \ldots\) be i.i.d. with characteristic function \(\varphi\). (i) If \(\varphi'(0) = ia\) and \(S_n = X_1 + \cdots + X_n\), then \(S_n/n \rightarrow a\) in probability. (ii) If \(S_n/n \rightarrow a\) in probability, then \(\varphi(t/n)^n \rightarrow e^{iat}\). Use this to conclude that \(\varphi'(0) = ia\), so the weak law holds if and only if \(\varphi'(0)\) exists. This is due to E.J.G. Pitman (1956). The last exercise, in combination with Exercise 5.4 from Chapter 1, shows that \(\varphi'(0)\) may exist when \(E|X| = \infty\).
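Part (i) is a direct computation; a sketch, assuming only that \(\varphi'(0) = ia\) exists: then \(\varphi(h) = 1 + iah + o(h)\) as \(h \rightarrow 0\), so for each fixed \(t\)
\[
E e^{itS_n/n} = \varphi(t/n)^n = \Big(1 + \frac{iat}{n} + o(1/n)\Big)^{\!n} \longrightarrow e^{iat},
\]
the characteristic function of the point mass at \(a\); hence \(S_n/n \Rightarrow a\), and weak convergence to a constant implies convergence in probability. The converse (ii) is the substantive half: one must extract \(\varphi'(0) = ia\) from \(\varphi(t/n)^n \rightarrow e^{iat}\), and that is Pitman's contribution.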
If \(X_1, X_2, \ldots\) are independent and have characteristic function \(\exp(-|t|^{\alpha})\), then \((X_1 + \cdots + X_n)/n^{1/\alpha}\) has the same distribution as \(X_1\).
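This is a one-line characteristic function computation, using \(\varphi(t) = \exp(-|t|^{\alpha})\) and independence:
\[
E\exp\!\Big(it\,\frac{X_1 + \cdots + X_n}{n^{1/\alpha}}\Big) = \varphi\big(t/n^{1/\alpha}\big)^n = \exp\big(-n\,|t|^{\alpha}/n\big) = \exp(-|t|^{\alpha}) = \varphi(t),
\]
and since characteristic functions determine distributions, \((X_1 + \cdots + X_n)/n^{1/\alpha} \stackrel{d}{=} X_1\). (This is the defining stability property of the symmetric stable laws of index \(\alpha\).)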
Suppose \(X_n \Rightarrow X\), \(Y_n \geq 0\), and \(Y_n \Rightarrow c\), where \(c > 0\) is a constant. Then \(X_n Y_n \Rightarrow cX\). This result is true without the assumptions \(Y_n \geq 0\) and \(c > 0\); we have imposed them only to make the proof less tedious.
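A sketch of one standard proof (the converging-together/Slutsky argument, not necessarily the book's): since \(c\) is constant, \(Y_n \Rightarrow c\) implies \(Y_n \rightarrow c\) in probability. Write \(X_nY_n = cX_n + X_n(Y_n - c)\). Then \(cX_n \Rightarrow cX\) by continuous mapping. For the remainder, \((X_n)\) is tight, so given \(\varepsilon, \delta > 0\) choose \(M\) with \(\sup_n P(|X_n| > M) < \delta\); then
\[
P\big(|X_n(Y_n - c)| > \varepsilon\big) \leq P(|X_n| > M) + P\big(|Y_n - c| > \varepsilon/M\big) < 2\delta
\]
for all large \(n\), so \(X_n(Y_n - c) \rightarrow 0\) in probability and the converging-together lemma gives \(X_nY_n \Rightarrow cX\).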
If \(X_{i}, i=1,2\) are independent and have normal distributions with mean 0 and variance \(\sigma_{i}^{2}\), then \(X_{1}+X_{2}\) has a normal distribution with mean 0 and variance \(\sigma_{1}^{2}+\sigma_{2}^{2}\).
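Via characteristic functions (an \(N(0, \sigma^2)\) variable has \(\varphi(t) = e^{-\sigma^2 t^2/2}\)):
\[
E e^{it(X_1 + X_2)} = E e^{itX_1}\,E e^{itX_2} = e^{-\sigma_1^2 t^2/2}\,e^{-\sigma_2^2 t^2/2} = e^{-(\sigma_1^2 + \sigma_2^2)t^2/2},
\]
where the first equality uses independence; by uniqueness of characteristic functions, \(X_1 + X_2\) is \(N(0, \sigma_1^2 + \sigma_2^2)\).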