Q9E

Use the data on fish prices in Table 11.6 on page 707. Suppose that we assume only that the distribution of fish prices in 1970 and 1980 is a continuous joint distribution with finite variances. We are interested in the properties of the sample correlation coefficient. Construct 1000 nonparametric bootstrap samples for solving this exercise.

a. Approximate the bootstrap estimate of the variance of the sample correlation.

b. Approximate the bootstrap estimate of the bias of the sample correlation.

c. Compute simulation standard errors of each of the above bootstrap estimates.

Short Answer


a. 0.0741.

b. -0.9456.

c. 0.0031 for the variance estimate; 0.0086 for the bias estimate.

Step by step solution

01

(a) Approximate the bootstrap estimate of the variance of the sample correlation.

The statistic of interest is the sample correlation. The data are given in one of the previous exercises and consist of n = 14 observations for two years, 1970 and 1980. The goal is to create \(\nu = 1000\) bootstrap samples and to find the bootstrap estimate of the variance of the sample correlation. First, the sample correlation is

\(R = \frac{{\sum\limits_{i = 1}^n {\left( {{X_i} - \bar X} \right)} \left( {{Y_i} - \bar Y} \right)}}{{\sqrt {\sum\limits_{i = 1}^n {{{\left( {{X_i} - \bar X} \right)}^2}} \sum\limits_{i = 1}^n {{{\left( {{Y_i} - \bar Y} \right)}^2}} } }}\)

However, one can use the built-in function cor in R to compute the sample correlation. The code below generates the \(\nu = 1000\) bootstrap samples and their sample correlations. After generating the sample correlations of the bootstrap samples, the bootstrap estimate of the variance of the sample correlation is simply the sample variance of these bootstrap sample correlations.

Hence, compute \({R^{(i)}},i = 1,2, \ldots ,\nu \) and then find the variance of the \(\nu = 1000\) sample correlations. The simulation gives an approximate value of 0.0741. This is the bootstrap estimate of the variance of the sample correlation.

#Read data (the file path and the sep/dec arguments are truncated in the original)
data = read.delim(file = "C:/Users/.../Exercise.txt", header = TRUE)
Year1970 = matrix(unlist(data[1]))
Year1980 = matrix(unlist(data[2]))

#The number of bootstrap samples
nu = 1000
Correlation.1970.1980 = numeric(nu)

#Generate sample correlations
for (i in (1:nu)) {
  #Generate the bootstrap sample with replacement
  Sample.1970 = sample(Year1970, length(Year1970), replace = TRUE)
  #Generate the bootstrap sample with replacement
  Sample.1980 = sample(Year1980, length(Year1980), replace = TRUE)
  #Correlation
  Correlation.1970.1980[i] = cor(Sample.1970, Sample.1980)
}
Bootstrap.Estimate = var(Correlation.1970.1980)

02

(b) Approximate the bootstrap estimate of the bias of the sample correlation.

In this case, one way to get the bootstrap estimate of the bias is to subtract the correlation between the two initial samples from each \({R^{(i)}},i = 1,2, \ldots ,\nu \). Then average these values to obtain the bootstrap estimate of the bias. Using the following code, one obtains a bootstrap estimate of the bias of approximately -0.9456.

#Read data (the file path is truncated in the original)
data = read.delim(file = "C:/Users/.../Exercise.txt", header = TRUE)
Year1970 = matrix(unlist(data[1]))
Year1980 = matrix(unlist(data[2]))

#The number of bootstrap samples
nu = 1000
Correlation.1970.1980.bias = numeric(nu)

#Correlation of the original samples
x = cor(Year1970, Year1980)

#Generate sample correlations
for (i in (1:nu)) {
  #Generate the bootstrap sample with replacement
  Sample.1970 = sample(Year1970, length(Year1970), replace = TRUE)
  #Generate the bootstrap sample with replacement
  Sample.1980 = sample(Year1980, length(Year1980), replace = TRUE)
  #Deviation of the bootstrap correlation from the original correlation
  Correlation.1970.1980.bias[i] = cor(Sample.1970, Sample.1980) - x
}
Bootstrap.Estimate = mean(Correlation.1970.1980.bias)
Estimate.Of.The.Simulation.SD = sqrt(var(Correlation.1970.1980.bias) / nu)

03

(c) Compute simulation standard errors of the above bootstrap estimates.

Let's first find the solution for (a), that is, the simulation standard error of the bootstrap estimate of the variance. Denote by \({R^{(i)}},i = 1,2, \ldots ,\nu \) the bootstrap sample correlation coefficients. To estimate the variance, one uses the sample variance

\(Z = \frac{1}{\nu }\sum\limits_{i = 1}^\nu {{{\left( {{R^{(i)}} - \bar R} \right)}^2}} \)

The goal is now to find the simulation variance of this \(Z\). The delta method can be used. By denoting

\({W^{(i)}} = {\left( {{R^{(i)}}} \right)^2}\)

the sample variance Z may be written as

\(Z = \frac{1}{\nu }\sum\limits_{i = 1}^\nu {{{\left( {{R^{(i)}} - \bar R} \right)}^2}} = \bar W - {\bar R^2}\)
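The identity follows by expanding the square and using the definitions of \(\bar W\) and \(\bar R\):

\(\frac{1}{\nu }\sum\limits_{i = 1}^\nu {{{\left( {{R^{(i)}} - \bar R} \right)}^2}} = \frac{1}{\nu }\sum\limits_{i = 1}^\nu {{{\left( {{R^{(i)}}} \right)}^2}} - {\bar R^2} = \bar W - {\bar R^2}\)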

Let V be the sample variance of \({W^{(i)}},i = 1,2, \ldots ,\nu \). Lastly, the required covariance between the bootstrap sample correlations \({R^{(i)}},i = 1,2, \ldots ,\nu \) and their squares \({W^{(i)}},i = 1,2, \ldots ,\nu \) is given by

\(C = \frac{1}{\nu }\sum\limits_{i = 1}^\nu {\left( {{R^{(i)}} - \bar R} \right)} \left( {{W^{(i)}} - \bar W} \right)\)

Finally, the variance of the necessary Z is

\(\hat \sigma _Z^2 = \frac{1}{\nu }\left( {4{{\bar R}^2}Z - 4\bar RC + V} \right)\)

This is the formula used in the code at the end. Note that the standard deviation is the square root of the variance. The result given by the code below is 0.0031.

The solution for part (b) is more straightforward: it is only required to find the sample variance of the deviations \({R^{(i)}},i = 1,2, \ldots ,\nu \) minus the original sample correlation, divide by \(\nu \), and take the square root, as given in the last line of code of part (b). The simulation standard error is 0.0086.

#Read data (the file path is truncated in the original)
data = read.delim(file = "C:/Users/.../Exercise.txt", header = TRUE)
Year1970 = matrix(unlist(data[1]))
Year1980 = matrix(unlist(data[2]))

#The number of bootstrap samples
nu = 1000
R = numeric(nu)
W = numeric(nu)

#Generate sample correlations
for (i in (1:nu)) {
  #Generate the bootstrap samples with replacement
  Sample.1970 = sample(Year1970, length(Year1970), replace = TRUE)
  Sample.1980 = sample(Year1980, length(Year1980), replace = TRUE)
  #Correlation and its square
  R[i] = cor(Sample.1970, Sample.1980)
  W[i] = R[i]^2
}

Z = mean(W) - mean(R)^2
V = var(W)

C.for.sum = numeric(nu)
for (i in (1:nu)) {
  C.for.sum[i] = (R[i] - mean(R)) * (W[i] - mean(W))
}
C = mean(C.for.sum)

Bootstrap.Estimate.Variance = (4 * mean(R)^2 * Z - 4 * mean(R) * C + V) / nu
Bootstrap.Estimate.SD = sqrt(Bootstrap.Estimate.Variance)
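Since the fish-price data file is not reproduced here, the delta-method formula can be sanity-checked numerically; the Python sketch below uses synthetic values standing in for the bootstrap correlations \({R^{(i)}}\) (the array R is illustrative, not the real data). Using the \(1/\nu \) form for every moment, the delta-method variance coincides exactly with the sample variance of the linearisation \({W^{(i)}} - 2\bar R{R^{(i)}}\) divided by \(\nu \):

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative stand-in for the nu bootstrap correlations R^(i);
# the real values would come from resampling the fish-price data.
nu = 1000
R = rng.uniform(-1.0, 1.0, size=nu)

W = R**2
Rbar, Wbar = R.mean(), W.mean()

Z = Wbar - Rbar**2                    # sample variance of R (1/nu form)
V = ((W - Wbar)**2).mean()            # sample variance of W (1/nu form)
C = ((R - Rbar) * (W - Wbar)).mean()  # sample covariance of R and W

# Delta-method variance of Z
sigma2_Z = (4 * Rbar**2 * Z - 4 * Rbar * C + V) / nu

# The same quantity is the variance of the linearisation W - 2*Rbar*R,
# divided by nu
lin = W - 2 * Rbar * R
sigma2_lin = ((lin - lin.mean())**2).mean() / nu

print(sigma2_Z, sigma2_lin)
```

Note that the R code above uses var(), which divides by \(\nu - 1\) rather than \(\nu \); for \(\nu = 1000\) the difference is negligible.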

