Q 6E


Suppose that \(X\) is a random variable for which the p.d.f. or the p.f. is \(f\left( {x|\theta } \right)\), where the value of the parameter \(\theta \) is unknown but must lie in an open interval \(\Omega \). Let \({I_0}\left( \theta  \right)\) denote the Fisher information in \(X\). Suppose now that the parameter \(\theta \) is replaced by a new parameter \(\mu \), where \(\theta  = \psi \left( \mu  \right)\) and \(\psi \) is a differentiable function. Let \({I_1}\left( \mu  \right)\) denote the Fisher information in \(X\) when the parameter is regarded as \(\mu \). Show that \({I_1}\left( \mu  \right) = {\left( {{\psi ^{'}}\left( \mu  \right)} \right)^{2}}{I_0}\left( {\psi \left( \mu  \right)} \right)\).

Short Answer

Expert verified

\({I_1}\left( \mu \right) = {\left({{\psi^{'}}\left( \mu \right)} \right)^{2}}{I_0} \left( {\psi \left( \mu \right)} \right)\)

Step by step solution

01

Given information

Let \(X\) be a random variable for which the p.d.f. is \(f\left( {x|\theta } \right)\), where \(\theta \) is unknown but must lie in an open interval \(\Omega \).

Let \({I_0}\left( \theta  \right)\) be the Fisher information in \(X\).

The parameter \(\theta \) is replaced by a new parameter \(\mu \), where \(\theta  = \psi \left( \mu  \right)\) and \(\psi \) is a differentiable function.

Let \({I_1}\left( \mu  \right)\) denote the Fisher information in \(X\) when the parameter is regarded as \(\mu \).

02

Fisher information

Write \(\lambda \left( {x|\theta } \right) = \log f\left( {x|\theta } \right)\). The Fisher information \(I\left( \theta  \right)\) in the random variable \(X\) is defined as

\(I\left( \theta  \right) = {E_\theta }\left\{ {{{\left( {{\lambda ^{'}}\left( {x|\theta } \right)} \right)}^{2}}} \right\}\)

or, equivalently,

\(I\left( \theta  \right) =  - {E_\theta }\left\{ {{\lambda ^{''}}\left( {x|\theta } \right)} \right\}\)
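The equivalence of these two expressions can be checked symbolically for a concrete family. A minimal sketch using sympy, with the exponential distribution \(f\left( {x|\theta } \right) = \theta {e^{ - \theta x}}\) as an assumed example (not part of the original problem), for which both forms give \(I\left( \theta  \right) = 1/{\theta ^2}\):

```python
import sympy as sp

x, theta = sp.symbols('x theta', positive=True)

# Exponential density f(x|theta) = theta * exp(-theta * x), an assumed example
f = theta * sp.exp(-theta * x)
lam = sp.log(f)  # lambda(x|theta) = log f(x|theta)

# First form: I(theta) = E[(lambda'(x|theta))^2]
score = sp.diff(lam, theta)
I_score = sp.simplify(sp.integrate(score**2 * f, (x, 0, sp.oo)))

# Second form: I(theta) = -E[lambda''(x|theta)]
I_curv = sp.simplify(-sp.integrate(sp.diff(lam, theta, 2) * f, (x, 0, sp.oo)))

print(I_score, I_curv)  # both simplify to 1/theta**2
```

Both expectations are computed by integrating against the density over \(\left( {0,\infty } \right)\), and agree, as the general theory requires.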

03

Verifying \({I_1}\left( \mu  \right) = {\left( {{\psi ^{'}}\left( \mu  \right)} \right)^{2}}{I_0}\left( {\psi \left( \mu  \right)} \right)\)

Let \(g\left( {x|\mu } \right)\) be the p.d.f. of \(X\) when \(\mu \) is the parameter.

Then \(g\left( {x|\mu } \right) = f\left( {x|\psi \left( \mu  \right)} \right)\).

Taking logarithms on both sides,

\(\begin{align}\log g\left( {x|\mu } \right) &= \log f\left( {x|\psi \left( \mu \right)} \right)\\ &= \lambda \left( {x|\psi \left( \mu \right)} \right)\end{align}\)

Differentiating \(\lambda \left( {x|\psi \left( \mu  \right)} \right)\) with respect to \(\mu \) by the chain rule,

\(\frac{\partial }{{\partial \mu }}\log g\left( {x|\mu } \right) = \frac{\partial }{{\partial \mu }}\lambda \left( {x|\psi \left( \mu  \right)} \right) = {\lambda ^{'}}\left( {x|\psi \left( \mu  \right)} \right){\psi ^{'}}\left( \mu  \right)\)

Then the Fisher information in \(X\) when the parameter is regarded as \(\mu \) is

\(\begin{align}{I_1}\left( \mu  \right) &= {E_\mu }\left\{ {{{\left( {\frac{\partial }{{\partial \mu }}\log g\left( {x|\mu } \right)} \right)}^{2}}} \right\}\\ &= {E_\mu }\left\{ {{{\left( {{\lambda ^{'}}\left( {x|\psi \left( \mu  \right)} \right){\psi ^{'}}\left( \mu  \right)} \right)}^{2}}} \right\}\\ &= {\left( {{\psi ^{'}}\left( \mu  \right)} \right)^{2}}{E_\mu }\left\{ {{{\left( {{\lambda ^{'}}\left( {x|\psi \left( \mu  \right)} \right)} \right)}^{2}}} \right\}\end{align}\)

Since \({E_\mu }\left\{ {{{\left( {{\lambda ^{'}}\left( {x|\psi \left( \mu  \right)} \right)} \right)}^{2}}} \right\} = {I_0}\left( {\psi \left( \mu  \right)} \right)\), it follows that

\({I_1}\left( \mu  \right) = {\left( {{\psi ^{'}}\left( \mu  \right)} \right)^{2}}{I_0}\left( {\psi \left( \mu  \right)} \right)\)

Hence proved.
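The identity can also be verified symbolically for a concrete case. A minimal sketch using sympy, assuming the exponential family \(f\left( {x|\theta } \right) = \theta {e^{ - \theta x}}\) (for which \({I_0}\left( \theta  \right) = 1/{\theta ^2}\)) and the reparameterization \(\theta  = \psi \left( \mu  \right) = {e^\mu }\); neither choice is part of the original problem:

```python
import sympy as sp

x = sp.symbols('x', positive=True)
mu = sp.symbols('mu', real=True)

psi = sp.exp(mu)            # assumed reparameterization theta = psi(mu)
g = psi * sp.exp(-psi * x)  # g(x|mu) = f(x|psi(mu)) for the exponential family

# I_1(mu) = E[(d/dmu log g(x|mu))^2], computed directly
score = sp.diff(sp.log(g), mu)
I1 = sp.simplify(sp.integrate(score**2 * g, (x, 0, sp.oo)))

# (psi'(mu))^2 * I_0(psi(mu)), with I_0(theta) = 1/theta^2 for this family
rhs = sp.simplify(sp.diff(psi, mu)**2 / psi**2)

print(I1, rhs)  # both sides simplify to 1
```

Here \({\psi ^{'}}\left( \mu  \right) = {e^\mu }\) exactly cancels \({I_0}\left( {{e^\mu }} \right) = {e^{ - 2\mu }}\), so the Fisher information in the new parameterization is constant, a standard consequence of the identity just proved.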


Most popular questions from this chapter

Suppose that X has the t distribution with m degrees of freedom (m > 2). Show that Var(X) = m/(m − 2).

Hint: To evaluate \({\bf{E}}\left( {{{\bf{X}}^{\bf{2}}}} \right)\), restrict the integral to the positive half of the real line and change the variable from x to

\({\bf{y = }}\frac{{\frac{{{{\bf{x}}^{\bf{2}}}}}{{\bf{m}}}}}{{{\bf{1 + }}\frac{{{{\bf{x}}^{\bf{2}}}}}{{\bf{m}}}}}\)

Compare the integral with the p.d.f. of a beta distribution. Alternatively, use Exercise 21 in Sec. 5.7.

When the motion of a microscopic particle in a liquid or a gas is observed, it is seen that the motion is irregular because the particle frequently collides with other particles. The probability model for this motion, which is called Brownian motion, is as follows: A coordinate system is chosen in the liquid or gas. Suppose that the particle is at the origin of this coordinate system at time t = 0, and let (X, Y, Z) denote the particle's coordinates at any time t > 0. The random variables X, Y, and Z are i.i.d. Each has a normal distribution with mean 0 and variance \({\sigma ^2}t\). Find the probability that at time t = 2, the particle will lie within a sphere whose centre is at the origin and whose radius is 4σ.

In the situation of Example 8.5.11, suppose that we observe\({{\bf{X}}_{\bf{1}}}{\bf{ = 4}}{\bf{.7}}\;{\bf{and}}\;{{\bf{X}}_{\bf{2}}}{\bf{ = 5}}{\bf{.3}}\).

  1. Find the 50% confidence interval described in Example 8.5.11.
  2. Find the interval of possible θ values consistent with the observed data.
  3. Is the 50% confidence interval larger or smaller than the set of possible θ values?
  4. Calculate the value of the random variable \({\bf{Z = }}{{\bf{Y}}_{\bf{2}}}{\bf{ - }}{{\bf{Y}}_{\bf{1}}}\) as described in Example 8.5.11.
  5. Use Eq. (8.5.15) to compute the conditional probability that \(\left| {{{{\bf{\bar X}}}_{\bf{2}}}{\bf{ - \theta }}} \right|{\bf{ < 0}}{\bf{.1}}\) given Z is equal to the value calculated in part (d).

For the conditions of Exercise 2, how large a random sample must be taken in order that \({\bf{P}}\left( {{\bf{|}}{{{\bf{\bar X}}}_{\bf{n}}}{\bf{ - \theta |}} \le {\bf{0}}{\bf{.1}}} \right) \ge {\bf{0}}{\bf{.95}}\) for every possible value of θ?

Question: Suppose that a random variable X has the geometric distribution with an unknown parameter p. (See Sec. 5.5.) Find a statistic \({\bf{\delta }}\left( {\bf{X}} \right)\) that will be an unbiased estimator of \(\frac{{\bf{1}}}{{\bf{p}}}\).
