Q29E

When sample sizes are equal \(\left( J_i = J \right)\), the parameters \(\alpha_1, \alpha_2, \ldots, \alpha_I\) of the alternative parameterization are restricted by \(\Sigma\alpha_i = 0\). For unequal sample sizes, the most natural restriction is \(\Sigma J_i\alpha_i = 0\). Use this to show that

\(E\left( MSTr \right) = \sigma^2 + \frac{1}{I - 1}\Sigma J_i\alpha_i^2\)

What is \(E\left( MSTr \right)\) when \(H_0\) is true? (This expectation is correct if \(\Sigma J_i\alpha_i = 0\) is replaced by the restriction \(\Sigma\alpha_i = 0\) (or any other single linear restriction on the \(\alpha_i\)'s used to reduce the model to \(I\) independent parameters), but \(\Sigma J_i\alpha_i = 0\) simplifies the algebra and yields natural estimates for the model parameters (in particular, \(\hat\alpha_i = \bar X_{i\cdot} - \bar X_{\cdot\cdot}\)).)

Short Answer


The expected value of MSTr is

\(E(MSTr) = \frac{1}{I - 1}\left((I - 1)\sigma^2 + \sum_{i=1}^{I} J_i\alpha_i^2\right) = \sigma^2 + \frac{1}{I - 1}\sum_{i=1}^{I} J_i\alpha_i^2\)

When \(H_0\) is true, \(E(MSTr) = \sigma^2\).

Step by step solution

01

Set up the notation

Denote

\(x_{i\cdot} = \sum_{j=1}^{J_i} x_{ij}, \qquad x_{\cdot\cdot} = \sum_{i=1}^{I}\sum_{j=1}^{J_i} x_{ij}, \qquad n = \sum_{i=1}^{I} J_i.\)

The treatment mean square is

\(MSTr = \frac{1}{I - 1}\,SSTr,\)

where the treatment sum of squares is

\(SSTr = \sum_{i=1}^{I}\sum_{j=1}^{J_i}\left(\bar x_{i\cdot} - \bar x_{\cdot\cdot}\right)^2 = \sum_{i=1}^{I}\frac{1}{J_i}\,x_{i\cdot}^2 - \frac{1}{n}\,x_{\cdot\cdot}^2.\)
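As a quick numerical check of these two expressions for SSTr, here is a minimal sketch (the group data below are hypothetical, chosen only for illustration) that evaluates SSTr from the definition and from the computational formula and then forms MSTr:

```python
import numpy as np

# Hypothetical unequal-size groups (illustration only, not textbook data)
groups = [np.array([4.1, 5.0, 3.8]),
          np.array([6.2, 5.9, 6.5, 6.1]),
          np.array([2.9, 3.3])]

I = len(groups)
J = np.array([len(g) for g in groups])          # J_i
n = J.sum()                                      # n = sum of the J_i
x_i_dot = np.array([g.sum() for g in groups])    # x_{i.}
x_dot_dot = x_i_dot.sum()                        # x_{..}
grand_mean = x_dot_dot / n

# SSTr from the definition: sum over all observations of (group mean - grand mean)^2
sstr_def = sum(j * (g.mean() - grand_mean) ** 2 for g, j in zip(groups, J))

# SSTr from the computational formula: sum x_{i.}^2 / J_i - x_{..}^2 / n
sstr_comp = (x_i_dot ** 2 / J).sum() - x_dot_dot ** 2 / n

mstr = sstr_comp / (I - 1)
print(sstr_def, sstr_comp, mstr)  # the two SSTr values should agree
```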

02

Expected value of SSTr

Let's find the expected value of the treatment sum of squares using the computational formula above:

\(\begin{aligned}
E\left( SSTr \right) &= E\left(\sum_{i=1}^{I}\frac{1}{J_i}\,x_{i\cdot}^2 - \frac{1}{n}\,x_{\cdot\cdot}^2\right)\\
&= E\left(\sum_{i=1}^{I}\frac{1}{J_i}\left(J_i\,\bar x_{i\cdot}\right)^2 - \frac{1}{n}\left(n\,\bar x_{\cdot\cdot}\right)^2\right)\\
&= E\left(\sum_{i=1}^{I} J_i\,\bar x_{i\cdot}^2 - n\,\bar x_{\cdot\cdot}^2\right)\\
&= \sum_{i=1}^{I} J_i\,E\left(\bar x_{i\cdot}^2\right) - n\,E\left(\bar x_{\cdot\cdot}^2\right)
\end{aligned}\)

\(\begin{aligned}
&= \sum_{i=1}^{I} J_i\left(V\left(\bar x_{i\cdot}\right) + \left(E\left(\bar x_{i\cdot}\right)\right)^2\right) - n\left(V\left(\bar x_{\cdot\cdot}\right) + \left(E\left(\bar x_{\cdot\cdot}\right)\right)^2\right)\\
&= \sum_{i=1}^{I} J_i\left(\frac{\sigma^2}{J_i} + \mu_i^2\right) - n\left(\frac{\sigma^2}{n} + \left(\frac{1}{n}\sum_{i=1}^{I} J_i\mu_i\right)^2\right)\\
&= \sum_{i=1}^{I} J_i\,\frac{\sigma^2}{J_i} + \sum_{i=1}^{I} J_i\mu_i^2 - n\,\frac{\sigma^2}{n} - n\cdot\frac{1}{n^2}\left(\sum_{i=1}^{I} J_i\mu_i\right)^2\\
&= I\sigma^2 + \sum_{i=1}^{I} J_i\left(\mu + \alpha_i\right)^2 - \sigma^2 - \frac{1}{n}\left(\sum_{i=1}^{I} J_i\left(\mu + \alpha_i\right)\right)^2
\end{aligned}\)

\(\begin{aligned}
&= (I - 1)\sigma^2 + \mu^2\sum_{i=1}^{I} J_i + 2\mu\sum_{i=1}^{I} J_i\alpha_i + \sum_{i=1}^{I} J_i\alpha_i^2 - \frac{1}{n}\left(\mu\sum_{i=1}^{I} J_i + \sum_{i=1}^{I} J_i\alpha_i\right)^2\\
&= (I - 1)\sigma^2 + n\mu^2 + 2\mu\cdot 0 + \sum_{i=1}^{I} J_i\alpha_i^2 - \frac{1}{n}\left(n\mu + 0\right)^2\\
&= (I - 1)\sigma^2 + n\mu^2 + \sum_{i=1}^{I} J_i\alpha_i^2 - n\mu^2\\
&= (I - 1)\sigma^2 + \sum_{i=1}^{I} J_i\alpha_i^2
\end{aligned}\)

Here the second equality uses \(\sum J_i = n\) together with the restriction \(\sum J_i\alpha_i = 0\). Dividing by \(I - 1\), the expected value of MSTr is

\(E(MSTr) = \frac{1}{I - 1}\left((I - 1)\sigma^2 + \sum_{i=1}^{I} J_i\alpha_i^2\right) = \sigma^2 + \frac{1}{I - 1}\sum_{i=1}^{I} J_i\alpha_i^2.\)

When \(H_0: \alpha_1 = \alpha_2 = \cdots = \alpha_I = 0\) is true, every \(\alpha_i = 0\), so \(E(MSTr) = \sigma^2\).
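The result can also be sanity-checked by simulation. The sketch below uses hypothetical values of \(\mu\), \(\sigma\), \(J_i\), and \(\alpha_i\) (recentred so that \(\Sigma J_i\alpha_i = 0\)), averages MSTr over many simulated data sets, and compares the average with \(\sigma^2 + \frac{1}{I-1}\Sigma J_i\alpha_i^2\):

```python
import numpy as np

rng = np.random.default_rng(0)

mu, sigma = 10.0, 2.0                     # hypothetical model parameters
J = np.array([3, 5, 4])                   # unequal sample sizes J_i
alpha = np.array([1.0, -1.0, 0.5])        # group effects (illustrative)
alpha -= (J * alpha).sum() / J.sum()      # enforce the restriction sum J_i * alpha_i = 0
I, n = len(J), J.sum()

def mstr_once():
    # simulate x_ij = mu + alpha_i + eps_ij and compute MSTr for one data set
    groups = [mu + a + sigma * rng.standard_normal(j) for a, j in zip(alpha, J)]
    grand_mean = np.concatenate(groups).mean()
    sstr = sum(j * (g.mean() - grand_mean) ** 2 for g, j in zip(groups, J))
    return sstr / (I - 1)

sim_mean = np.mean([mstr_once() for _ in range(20000)])
theory = sigma ** 2 + (J * alpha ** 2).sum() / (I - 1)
print(sim_mean, theory)   # the simulated average should be close to the theoretical value
```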


Most popular questions from this chapter

Repeat Exercise supposing that \(\bar x_{2\cdot} = 502.8\) in addition to \(\bar x_{3\cdot} = 427.5\).

Four types of mortars, namely ordinary cement mortar (OCM), polymer impregnated mortar (PIM), resin mortar (RM), and polymer cement mortar (PCM), were subjected to a compression test to measure strength (MPa). Three strength observations for each mortar type are given in the article "Polymer Mortar Composite Matrices For Maintenance-Free Highly Durable Ferrocement" and are reproduced here. Construct an ANOVA table. Using a \(.05\) significance level, determine whether the data suggest that the true mean strength is not the same for all four mortar types. If you determine that the true mean strengths are not all equal, use Tukey's method to identify the significant differences.

\(\begin{array}{lccc}
\text{OCM} & 32.15 & 35.53 & 34.20\\
\text{PIM} & 126.32 & 126.80 & 134.79\\
\text{RM} & 117.91 & 115.02 & 114.58\\
\text{PCM} & 29.09 & 30.87 & 29.80
\end{array}\)
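A minimal computational sketch, assuming SciPy is available, of how the F test and Tukey comparisons for these data could be carried out (an illustration only, not the cited article's analysis; scipy.stats.tukey_hsd requires a recent SciPy version):

```python
import numpy as np
from scipy import stats

# Strength observations (MPa) for the four mortar types, as tabulated above
ocm = np.array([32.15, 35.53, 34.20])
pim = np.array([126.32, 126.80, 134.79])
rm  = np.array([117.91, 115.02, 114.58])
pcm = np.array([29.09, 30.87, 29.80])
samples = [ocm, pim, rm, pcm]

# ANOVA sums of squares formed by hand for the ANOVA table
all_obs = np.concatenate(samples)
grand_mean = all_obs.mean()
sstr = sum(len(s) * (s.mean() - grand_mean) ** 2 for s in samples)
sse = sum(((s - s.mean()) ** 2).sum() for s in samples)
df_tr, df_e = len(samples) - 1, len(all_obs) - len(samples)
print("SSTr =", sstr, " SSE =", sse, " MSTr =", sstr / df_tr, " MSE =", sse / df_e)

# F test at the .05 level
f_stat, p_value = stats.f_oneway(*samples)
print("F =", f_stat, " p =", p_value)

# Pairwise comparisons with Tukey's HSD
print(stats.tukey_hsd(*samples))
```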

Consider the accompanying data on plant growth after the application of five different types of growth hormone.

\(\begin{array}{lcccc}
1: & 13 & 17 & 7 & 14\\
2: & 21 & 13 & 20 & 17\\
3: & 18 & 15 & 20 & 17\\
4: & 7 & 11 & 18 & 10\\
5: & 6 & 11 & 15 & 8
\end{array}\)

  1. Perform an F test at level \(\alpha = .05\) (a computational sketch follows this list).
  2. What happens when Tukey's procedure is applied?
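A minimal sketch, assuming SciPy is available, of how the F test in part 1 could be run on these data (the Tukey step can be added the same way as in the mortar example above):

```python
from scipy import stats

# Plant growth observations for the five hormone types, as tabulated above
hormone = {
    1: [13, 17, 7, 14],
    2: [21, 13, 20, 17],
    3: [18, 15, 20, 17],
    4: [7, 11, 18, 10],
    5: [6, 11, 15, 8],
}

f_stat, p_value = stats.f_oneway(*hormone.values())
print("F =", f_stat, " p =", p_value)   # reject H0 at alpha = .05 if p < .05
```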

Exercise \(10.7\) described an experiment in which \(26\) resistivity observations were made on each of six different concrete mixtures. The article cited there gave the following sample means: \(14.18,\ 17.94,\ 18.00,\ 25.74,\ 27.67\). Apply Tukey's method with a simultaneous confidence level of \(95\% \) to identify significant differences, and describe your findings \((use\ MSE = 13.929)\).

The critical flicker frequency \(\left( cff \right)\) is the highest frequency at which a person can detect the flicker in a flickering light source. At frequencies above the cff, the light source appears to be continuous even though it is actually flickering. An investigation carried out to see whether true average cff depends on iris color yielded the following data (based on the article "The Effects of Iris Color on Critical Flicker Frequency").

Iris color (one row per color, with \(J_i\), \(x_{i\cdot}\), and \(\bar x_{i\cdot}\) in the last three columns):

\(\begin{array}{lcccccccc|ccc}
 & & & & & & & & & J_i & x_{i\cdot} & \bar x_{i\cdot}\\
1.\ \text{Brown} & 26.8 & 27.9 & 23.7 & 25.0 & 26.3 & 24.8 & 25.7 & 24.5 & 8 & 204.7 & 25.59\\
2.\ \text{Green} & 26.4 & 24.2 & 28.0 & 26.9 & 29.1 & & & & 5 & 134.6 & 26.92\\
3.\ \text{Blue} & 25.7 & 27.2 & 29.9 & 28.5 & 29.4 & 28.3 & & & 6 & 169.0 & 28.17
\end{array}\)

\(n = 19, \quad x_{\cdot\cdot} = 508.3\)

  1. State and test the relevant hypotheses at significance level \(.05\). (Hint: \(\sum\sum x_{ij}^2 = 13659.67\), \(CF = 13598.36\); a computational sketch follows this list.)
  2. Investigate differences between the iris colors with respect to mean cff.
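For part 1, a minimal sketch that works only from the summary quantities and the hint above (assuming SciPy is available for the F distribution):

```python
from scipy import stats

# Summary quantities from the table and the hint above
J      = {"brown": 8, "green": 5, "blue": 6}
x_i    = {"brown": 204.7, "green": 134.6, "blue": 169.0}
sum_sq = 13659.67          # sum of all x_ij^2 (from the hint)
cf     = 13598.36          # correction factor x_..^2 / n (from the hint)
n, I   = sum(J.values()), len(J)

sst  = sum_sq - cf                                   # total sum of squares
sstr = sum(x_i[c] ** 2 / J[c] for c in J) - cf       # treatment sum of squares
sse  = sst - sstr                                    # error sum of squares
f_stat  = (sstr / (I - 1)) / (sse / (n - I))
p_value = stats.f.sf(f_stat, I - 1, n - I)
print("F =", f_stat, " p =", p_value)   # compare p with .05
```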