Chapter 2: Problem 60
Calculate the moment generating function of the uniform distribution on \((0,1)\). Obtain \(E[X]\) and \(\operatorname{Var}[X]\) by differentiating.
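One way to carry out the computation (a standard sketch): for \(X\) uniform on \((0,1)\),

$$ \phi(t)=E\left[e^{t X}\right]=\int_{0}^{1} e^{t x}\, d x=\frac{e^{t}-1}{t}, \quad t \neq 0, $$

with \(\phi(0)=1\). Expanding \(e^{t}\) as a power series gives \(\phi(t)=\sum_{n=0}^{\infty} \frac{t^{n}}{(n+1) !}\), so \(\phi^{\prime}(0)=\frac{1}{2!}=\frac{1}{2}\) and \(\phi^{\prime \prime}(0)=\frac{2}{3!}=\frac{1}{3}\). Hence \(E[X]=\frac{1}{2}\) and \(\operatorname{Var}[X]=E\left[X^{2}\right]-E[X]^{2}=\frac{1}{3}-\frac{1}{4}=\frac{1}{12}\).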
A coin, having probability \(p\) of landing heads, is flipped until a head appears for the \(r\)th time. Let \(N\) denote the number of flips required. Calculate \(E[N]\). Hint: There is an easy way of doing this. It involves writing \(N\) as the sum of \(r\) geometric random variables.
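As a quick sanity check on the identity \(E[N]=r / p\) that the hint leads to, here is a short simulation sketch (the parameter choices \(p=0.3\), \(r=4\) and all function names are ours, purely for illustration):

```python
import random

def flips_until_rth_head(p, r, rng):
    """Flip a p-coin until the r-th head; return the number of flips N."""
    flips, heads = 0, 0
    while heads < r:
        flips += 1
        if rng.random() < p:
            heads += 1
    return flips

def estimate_mean_flips(p=0.3, r=4, trials=100_000, seed=1):
    """Monte Carlo estimate of E[N]; should be close to r / p."""
    rng = random.Random(seed)
    total = sum(flips_until_rth_head(p, r, rng) for _ in range(trials))
    return total / trials
```

With these parameters the estimate should land close to \(r/p = 4/0.3 \approx 13.3\), in line with writing \(N\) as a sum of \(r\) geometric random variables, each with mean \(1/p\).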
Show that the sum of independent identically distributed exponential random variables has a gamma distribution.
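One way to see the claim numerically: the sum of \(n\) i.i.d. Exponential(\(\lambda\)) variables should match the Gamma(\(n, \lambda\)) moments, mean \(n/\lambda\) and variance \(n/\lambda^{2}\). A minimal simulation sketch (parameter choices and names are ours):

```python
import math
import random

def gamma_via_exponential_sum(n, lam, rng):
    """Sum n i.i.d. Exponential(lam) draws, via inverse-transform sampling."""
    return sum(-math.log(1.0 - rng.random()) / lam for _ in range(n))

def sample_moments(n=3, lam=2.0, trials=100_000, seed=7):
    """Empirical mean and variance of the simulated sum."""
    rng = random.Random(seed)
    xs = [gamma_via_exponential_sum(n, lam, rng) for _ in range(trials)]
    mean = sum(xs) / trials
    var = sum((x - mean) ** 2 for x in xs) / trials
    return mean, var
```

For \(n=3\), \(\lambda=2\), the Gamma(3, 2) distribution has mean \(3/2\) and variance \(3/4\), and the sample moments should sit close to those values. (This checks consistency of moments, not the full density; the actual exercise asks for a proof, e.g. by convolution or by MGFs.)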
Suppose \(X\) has a binomial distribution with parameters 6 and \(\frac{1}{2}\). Show that \(X=3\) is the most likely outcome.
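The claim can be checked directly from the binomial pmf \(P\{X=k\}=\binom{6}{k}\left(\frac{1}{2}\right)^{6}\) (a quick sketch; helper names are ours):

```python
from math import comb

def binom_pmf(n, k, p):
    """P{X = k} for X ~ Binomial(n, p)."""
    return comb(n, k) * p ** k * (1 - p) ** (n - k)

# pmf of Binomial(6, 1/2) is [1, 6, 15, 20, 15, 6, 1] / 64, peaking at k = 3
pmf = [binom_pmf(6, k, 0.5) for k in range(7)]
mode = max(range(7), key=pmf.__getitem__)  # mode == 3, pmf[3] == 20/64 == 0.3125
```

The binomial coefficients \(\binom{6}{k}\) are symmetric and maximized at \(k=3\), which is what the ratio test \(P\{X=k\}/P\{X=k-1\}\) argument in the exercise formalizes.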
Suppose that \(X\) and \(Y\) are independent binomial random variables with parameters \((n, p)\) and \((m, p)\). Argue probabilistically (no computations necessary) that \(X+Y\) is binomial with parameters \((n+m, p)\).
Show that $$ \lim _{n \rightarrow \infty} e^{-n} \sum_{k=0}^{n} \frac{n^{k}}{k !}=\frac{1}{2} $$ Hint: Let \(X_{n}\) be Poisson with mean \(n\). Use the central limit theorem to show that \(P\left\{X_{n} \leqslant n\right\} \rightarrow \frac{1}{2}\).
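For a numerical look at the convergence, the quantity \(e^{-n} \sum_{k=0}^{n} n^{k}/k!\) is exactly \(P\{X_{n} \leqslant n\}\) and can be evaluated in log space to avoid overflow (a sketch; the function name is ours):

```python
from math import exp, lgamma, log

def poisson_cdf_at_mean(n):
    """P{X <= n} for X ~ Poisson(n), i.e. e^{-n} * sum_{k=0}^{n} n^k / k!."""
    # Each term is exp(k log n - n - log k!); terms far below the peak
    # underflow to 0.0, which is harmless for the sum.
    return sum(exp(k * log(n) - n - lgamma(k + 1)) for k in range(n + 1))
```

For moderate \(n\) the value sits slightly above \(1/2\) and drifts toward it as \(n\) grows, consistent with the central-limit-theorem argument in the hint.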