Q13E


a. Show that a constant d can be added to (or subtracted from) each \(x_{ij}\) without affecting any of the ANOVA sums of squares.

b. Suppose that each \(x_{ij}\) is multiplied by a nonzero constant c. How does this affect the ANOVA sums of squares? How does this affect the values of the F statistics \(F_A\) and \(F_B\)? What effect does "coding" the data by \(y_{ij} = c\,x_{ij} + d\) have on the conclusions resulting from the ANOVA procedures?

Short Answer

Expert verified

a. Adding a constant leaves every sum of squares unchanged: \(SS_y = SS_x\) for every sum of squares. b. Multiplying by c multiplies every sum of squares by \(c^2\): \(SS_y = c^2\,SS_x\). The F statistics are unchanged, so the coding \(y_{ij} = c\,x_{ij} + d\) does not affect the conclusions of the ANOVA procedures.

Step by step solution

01

Find the average of measurements of factors A and B

(a) Denote by \(\bar X_{i.}\) the average of the measurements obtained when factor A is held at level i, by \(\bar X_{.j}\) the average of the measurements obtained when factor B is held at level j, and by \(\bar X_{..}\) the grand mean. Observed values are written with lowercase \(x\) instead of uppercase X; symbols without the overbar denote the corresponding totals.

02

Find the sum of squares with degree of freedom

The sums of squares are

\(\begin{aligned}SST &= \sum_{i=1}^{I}\sum_{j=1}^{J}\left(x_{ij} - \bar x_{..}\right)^2, & SSA &= \sum_{i=1}^{I}\sum_{j=1}^{J}\left(\bar x_{i.} - \bar x_{..}\right)^2,\\SSB &= \sum_{i=1}^{I}\sum_{j=1}^{J}\left(\bar x_{.j} - \bar x_{..}\right)^2, & SSE &= \sum_{i=1}^{I}\sum_{j=1}^{J}\left(x_{ij} - \bar x_{i.} - \bar x_{.j} + \bar x_{..}\right)^2,\end{aligned}\)

with degrees of freedom

\(\begin{aligned}df_T &= IJ - 1\\df_A &= I - 1\\df_B &= J - 1\\df_E &= (I - 1)(J - 1).\end{aligned}\)

03

Examine the differences appearing in each sum of squares

Every sum of squares is built from one of the differences

\(x_{ij} - \bar x_{..},\qquad \bar x_{i.} - \bar x_{..},\qquad \bar x_{.j} - \bar x_{..},\)

so it suffices to see what each difference becomes after the transformation \(y_{ij} = x_{ij} + d\). Adding d shifts every average by the same constant, so the first two differences are unchanged:

\(y_{ij} - \bar y_{..} = x_{ij} - \bar x_{..},\qquad \bar y_{i.} - \bar y_{..} = \bar x_{i.} - \bar x_{..}.\)

Finally, the third difference, analogously, is

\(\bar y_{.j} - \bar y_{..} = \bar x_{.j} - \bar x_{..}.\)
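This shift invariance is easy to check numerically. The sketch below is a minimal illustration, assuming NumPy; the helper `anova_sums` is hypothetical, not part of the text. It computes all four sums of squares for a random \(I \times J\) data matrix and confirms they are unchanged after adding a constant d to every observation.

```python
import numpy as np

def anova_sums(x):
    """Two-factor ANOVA with one observation per cell.
    Returns (SST, SSA, SSB, SSE) for an I x J data matrix x."""
    grand = x.mean()                        # grand mean x-bar..
    row = x.mean(axis=1, keepdims=True)     # row means x-bar_i.
    col = x.mean(axis=0, keepdims=True)     # column means x-bar_.j
    sst = ((x - grand) ** 2).sum()
    ssa = x.shape[1] * ((row - grand) ** 2).sum()
    ssb = x.shape[0] * ((col - grand) ** 2).sum()
    sse = ((x - row - col + grand) ** 2).sum()
    return sst, ssa, ssb, sse

rng = np.random.default_rng(42)
x = rng.normal(size=(3, 4))   # arbitrary data with I = 3, J = 4
d = 7.5                       # arbitrary shift

# Every average shifts by d, so every difference (and every SS) is unchanged.
same = np.allclose(anova_sums(x), anova_sums(x + d))
print(same)  # True
```

Any I, J, and d give the same result, since the differences themselves are shift-invariant.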

04

Conclude that every sum of squares is unchanged

Because each of the above differences is unchanged, \(SST_y = SST_x\), \(SSA_y = SSA_x\), and \(SSB_y = SSB_x\). For the error sum of squares, in the same manner one can show that

\(y_{ij} - \bar y_{i.} - \bar y_{.j} + \bar y_{..} = x_{ij} - \bar x_{i.} - \bar x_{.j} + \bar x_{..},\)

so \(SSE_y = SSE_x\) as well: adding (or subtracting) a constant d affects none of the ANOVA sums of squares.

05

Examine the differences after multiplying by c

(b) As before, every sum of squares is built from the differences

\(x_{ij} - \bar x_{..},\qquad \bar x_{i.} - \bar x_{..},\qquad \bar x_{.j} - \bar x_{..},\)

so consider what they become after the transformation \(y_{ij} = c\,x_{ij}\). Every average is also multiplied by c, so

\(y_{ij} - \bar y_{..} = c\left(x_{ij} - \bar x_{..}\right),\qquad \bar y_{i.} - \bar y_{..} = c\left(\bar x_{i.} - \bar x_{..}\right).\)

Finally, the third difference, analogously, is

\(\bar y_{.j} - \bar y_{..} = c\left(\bar x_{.j} - \bar x_{..}\right).\)

06

Conclude that every sum of squares is multiplied by \(c^2\)

Since each difference is multiplied by c, squaring and summing multiplies each sum of squares by \(c^2\): \(SST_y = c^2\,SST_x\), \(SSA_y = c^2\,SSA_x\), and \(SSB_y = c^2\,SSB_x\). For the error sum of squares, in the same manner one can show that

\(y_{ij} - \bar y_{i.} - \bar y_{.j} + \bar y_{..} = c\left(x_{ij} - \bar x_{i.} - \bar x_{.j} + \bar x_{..}\right),\)

so \(SSE_y = c^2\,SSE_x\).
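The \(c^2\) scaling can likewise be verified numerically. The sketch below is again an illustration assuming NumPy, with the same hypothetical `anova_sums` helper: it checks that every sum of squares of \(c\,x\) equals \(c^2\) times the corresponding sum of squares of \(x\).

```python
import numpy as np

def anova_sums(x):
    """Return (SST, SSA, SSB, SSE) for an I x J matrix, one obs per cell."""
    grand = x.mean()
    row = x.mean(axis=1, keepdims=True)
    col = x.mean(axis=0, keepdims=True)
    sst = ((x - grand) ** 2).sum()
    ssa = x.shape[1] * ((row - grand) ** 2).sum()
    ssb = x.shape[0] * ((col - grand) ** 2).sum()
    sse = ((x - row - col + grand) ** 2).sum()
    return sst, ssa, ssb, sse

rng = np.random.default_rng(0)
x = rng.normal(size=(3, 4))
c = 2.5

# Each difference is multiplied by c, so each squared term picks up c**2.
scaled = anova_sums(c * x)
expected = tuple(c ** 2 * s for s in anova_sums(x))
ok = np.allclose(scaled, expected)
print(ok)  # True
```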

07

Show that the F statistic for factor A is unchanged

\(\begin{aligned}F_{A,x} &= \frac{\frac{1}{I-1}\,SSA}{\frac{1}{(I-1)(J-1)}\,SSE} = \frac{\frac{1}{I-1}\sum\sum\left(\bar x_{i.} - \bar x_{..}\right)^2}{\frac{1}{(I-1)(J-1)}\sum\sum\left(x_{ij} - \bar x_{i.} - \bar x_{.j} + \bar x_{..}\right)^2}\\ &= \frac{\frac{1}{I-1}\,c^2\sum\sum\left(\bar x_{i.} - \bar x_{..}\right)^2}{\frac{1}{(I-1)(J-1)}\,c^2\sum\sum\left(x_{ij} - \bar x_{i.} - \bar x_{.j} + \bar x_{..}\right)^2}\\ &= \frac{\frac{1}{I-1}\sum\sum\left(c\left(\bar x_{i.} - \bar x_{..}\right)\right)^2}{\frac{1}{(I-1)(J-1)}\sum\sum\left(c\left(x_{ij} - \bar x_{i.} - \bar x_{.j} + \bar x_{..}\right)\right)^2}\\ &= \frac{\frac{1}{I-1}\sum\sum\left(\bar y_{i.} - \bar y_{..}\right)^2}{\frac{1}{(I-1)(J-1)}\sum\sum\left(y_{ij} - \bar y_{i.} - \bar y_{.j} + \bar y_{..}\right)^2} = F_{A,y}\end{aligned}\)

It has been proved that \(F_{A,x} = F_{A,y}\); thus, the F statistic for factor A is the same for both data sets.

08

Show that the F statistic for factor B is unchanged and conclude

\(\begin{aligned}F_{B,x} &= \frac{\frac{1}{J-1}\,SSB}{\frac{1}{(I-1)(J-1)}\,SSE} = \frac{\frac{1}{J-1}\sum\sum\left(\bar x_{.j} - \bar x_{..}\right)^2}{\frac{1}{(I-1)(J-1)}\sum\sum\left(x_{ij} - \bar x_{i.} - \bar x_{.j} + \bar x_{..}\right)^2}\\ &= \frac{\frac{1}{J-1}\,c^2\sum\sum\left(\bar x_{.j} - \bar x_{..}\right)^2}{\frac{1}{(I-1)(J-1)}\,c^2\sum\sum\left(x_{ij} - \bar x_{i.} - \bar x_{.j} + \bar x_{..}\right)^2}\\ &= \frac{\frac{1}{J-1}\sum\sum\left(c\left(\bar x_{.j} - \bar x_{..}\right)\right)^2}{\frac{1}{(I-1)(J-1)}\sum\sum\left(c\left(x_{ij} - \bar x_{i.} - \bar x_{.j} + \bar x_{..}\right)\right)^2}\\ &= \frac{\frac{1}{J-1}\sum\sum\left(\bar y_{.j} - \bar y_{..}\right)^2}{\frac{1}{(I-1)(J-1)}\sum\sum\left(y_{ij} - \bar y_{i.} - \bar y_{.j} + \bar y_{..}\right)^2} = F_{B,y}\end{aligned}\)

It has been proved that \(F_{B,x} = F_{B,y}\); thus, the F statistic for factor B is the same for both data sets.

The transformation \(y_{ij} = c\,x_{ij} + d\) therefore multiplies every sum of squares by \(c^2\), and the common factor \(c^2\) cancels in \(F_A\) and \(F_B\); coding the data has no effect on the conclusions resulting from the ANOVA procedures.
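As a final numeric check, the sketch below (assuming NumPy; `f_stats` is a hypothetical helper, not from the text) computes \(F_A\) and \(F_B\) directly and confirms they are identical for the original data x and the coded data \(y = c\,x + d\), even for negative c.

```python
import numpy as np

def f_stats(x):
    """Return (F_A, F_B) for a two-factor ANOVA with one observation per cell."""
    I, J = x.shape
    grand = x.mean()
    row = x.mean(axis=1, keepdims=True)    # x-bar_i.
    col = x.mean(axis=0, keepdims=True)    # x-bar_.j
    ssa = J * ((row - grand) ** 2).sum()
    ssb = I * ((col - grand) ** 2).sum()
    sse = ((x - row - col + grand) ** 2).sum()
    mse = sse / ((I - 1) * (J - 1))
    return ssa / (I - 1) / mse, ssb / (J - 1) / mse

rng = np.random.default_rng(1)
x = rng.normal(size=(4, 5))
c, d = -3.0, 10.0
y = c * x + d                  # coded data

# d drops out of every difference and c**2 cancels in each ratio.
unchanged = np.allclose(f_stats(x), f_stats(y))
print(unchanged)  # True
```

Since the F statistics drive the rejection decisions, this is exactly the statement that coding leaves the ANOVA conclusions intact.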


