Thursday, July 5, 2012

Sample Questions for Linear Models

1. State and prove the Gauss-Markov theorem.
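For reference, one standard statement, in the notation of the questions below and assuming $E(\underline{\epsilon})=\underline{0}$ and $Cov(\underline{\epsilon})=\sigma^2 I_n$: if $\underline{\lambda}^{'}\underline{\beta}$ is estimable and $\hat{\underline{\beta}}$ is any solution of the normal equations $X^{'}X\underline{\beta}=X^{'}\underline{Y}$, then $\underline{\lambda}^{'}\hat{\underline{\beta}}$ is the BLUE of $\underline{\lambda}^{'}\underline{\beta}$, i.e.
$$Var(\underline{\lambda}^{'}\hat{\underline{\beta}})\le Var(\underline{a}^{'}\underline{Y})$$
for every linear unbiased estimator $\underline{a}^{'}\underline{Y}$ of $\underline{\lambda}^{'}\underline{\beta}$.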
2. In the Gauss-Markov setup of the linear model $\underline{Y}=X\underline{\beta}+\underline{\epsilon}$, obtain
a. Any two functions belonging to the error space.
b. An expression for the regression sum of squares (SSR) and the error sum of squares (SSE).
c. The expression for the SSE subject to $m$ linearly independent conditions $\underline{\lambda}^{'}_{(i)}\underline{\beta}=d_i$, $i=1,2,\ldots,m$, where each $\underline{\lambda}^{'}_{(i)}\underline{\beta}$ is an estimable parametric function.
OR
The expression for the conditional error sum of squares. Explain how it can be used in testing the related hypothesis.
d. The various sums of squares as quadratic forms (see the sketch below).
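A common set of answers for parts (b) and (d), assuming $S=X^{'}X$ and any generalized inverse $S^{-}$ (the matrix $XS^{-}X^{'}$ does not depend on the choice of $S^{-}$):
$$SSR=\hat{\underline{\beta}}^{'}X^{'}\underline{Y}=\underline{Y}^{'}XS^{-}X^{'}\underline{Y},\qquad SSE=\underline{Y}^{'}\underline{Y}-\hat{\underline{\beta}}^{'}X^{'}\underline{Y}=\underline{Y}^{'}(I-XS^{-}X^{'})\underline{Y},$$
where both $XS^{-}X^{'}$ and $I-XS^{-}X^{'}$ are symmetric and idempotent.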
3. In the Gauss-Markov setup of the linear model $\underline{Y}=X\underline{\beta}+\underline{\epsilon}$, show that
a. Any solution to the normal equations minimizes the residual sum of squares.
b. The system of normal equations is always consistent.
c. The necessary and sufficient condition for estimability of the parametric function $\underline{\lambda}^{'}\underline{\beta}$ is $\underline{\lambda}^{'}=\underline{\lambda}^{'}H$, where $H=S^{-}S$ and $S=X^{'}X$; equivalently, $\underline{\lambda}^{'}=\underline{a}^{'}X$ for some vector $\underline{a}$ (a sketch for this part and part (b) follows the question).
d. Every linear parametric function $\underline{\lambda}^{'}\underline{\beta}$ is estimable if and only if the rank of $X$ equals the number of parameters.
e. Any linear function $\underline{a}^{'}\underline{y}$ of the observations such that $\underline{a}$ is a linear combination of the columns of $X$ is the BLUE of its expected value.
f. The covariance between any function belonging to the error space and any BLUE is zero.
g. The distribution of SSR is non-central chi-square. State clearly the assumptions made.
h. The distribution of $SSE/\sigma^2$ is chi-square. Further show that SSE is distributed independently of SSR.
i. The expected value of the residual sum of squares is $(n-r)\sigma^2$, where $r$ is the rank of $X$ and $n$ is the total number of observations.
j.  The BLUE of an estimable parametric function is unique almost surely.
k. The BLUE of an estimable parametric function is independent of the residuals.
l. The residuals are not independently distributed.
m. The variance-covariance matrix of $m$ linearly independent estimable parametric functions $\Lambda\hat{\underline\beta}$ is $\Lambda S^{-}\Lambda^{'}\sigma^2$, where $S=X^{'}X$. Further show that it is non-singular, stating the assumptions needed to develop the result.
n. The difference between the conditional and unconditional error sums of squares is a quadratic form in $\underline{y}$ with an idempotent matrix.
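A sketch for parts (b) and (c) above, with $S=X^{'}X$ and $H=S^{-}S$: the normal equations $S\underline{\beta}=X^{'}\underline{Y}$ are consistent because $X^{'}\underline{Y}$ lies in the column space of $X^{'}$, which coincides with the column space of $S$. For estimability, $\underline{\lambda}^{'}\underline{\beta}$ is estimable iff some $\underline{a}^{'}\underline{Y}$ satisfies $E(\underline{a}^{'}\underline{Y})=\underline{a}^{'}X\underline{\beta}=\underline{\lambda}^{'}\underline{\beta}$ for all $\underline{\beta}$, i.e. iff $\underline{\lambda}^{'}=\underline{a}^{'}X$ ($\underline{\lambda}$ lies in the row space of $X$), which holds iff $\underline{\lambda}^{'}H=\underline{\lambda}^{'}$.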
4. Discuss the role of functions belonging to the error space in obtaining an unbiased estimator of the error variance in the Gauss-Markov model.
5. Define the following
a. Estimation space and estimability of a parametric function
b. BLUE
c. Sum of squares due to the hypothesis $H_0$
d. Conditional error sum of squares
e. Full rank model
6. For the linear model $\underline{Y}=X\underline{\beta}+\underline{\epsilon}$ with $E(\underline{\epsilon})=\underline{0}$ and $Cov(\underline{\epsilon})=\sigma^2 D$, where $D$ is a symmetric positive definite matrix, derive the BLUE of a linear parametric function, its variance, and an expression for SSE. Reduce this to the standard form and hence obtain $Cov(\hat{\underline\beta})$ when $D=\operatorname{diag}(w_1,w_2,\cdots,w_n)$ is non-singular.
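A sketch of the reduction, assuming $D$ is known and positive definite so that $D^{-1/2}$ exists: setting $\underline{Z}=D^{-1/2}\underline{Y}$ gives $\underline{Z}=D^{-1/2}X\underline{\beta}+D^{-1/2}\underline{\epsilon}$ with $Cov(D^{-1/2}\underline{\epsilon})=\sigma^2 I$, which is the standard Gauss-Markov setup. The normal equations become $X^{'}D^{-1}X\underline{\beta}=X^{'}D^{-1}\underline{Y}$, and in the full-rank case
$$\hat{\underline{\beta}}=(X^{'}D^{-1}X)^{-1}X^{'}D^{-1}\underline{Y},\qquad Cov(\hat{\underline{\beta}})=\sigma^2(X^{'}D^{-1}X)^{-1}.$$
For diagonal $D=\operatorname{diag}(w_1,\ldots,w_n)$ this is weighted least squares with weights $1/w_i$.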
7. Obtain the condition for estimability of a linear parametric function and hence give one set of linearly independent estimable parametric functions, with their variances and covariances, for
a. One-way classification model
b. Two-way classification model
8. For a completely randomized design, obtain, using the normal equations, the best linear unbiased estimator of an elementary treatment contrast and its variance.
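A sketch under the usual CRD model $y_{ij}=\mu+\tau_i+\varepsilon_{ij}$, $j=1,2,\ldots,n_i$: any solution of the normal equations satisfies $\hat{\mu}+\hat{\tau}_i=\bar{y}_{i.}$, so the BLUE of an elementary contrast $\tau_i-\tau_{i^{'}}$ is $\bar{y}_{i.}-\bar{y}_{i^{'}.}$, with
$$Var(\bar{y}_{i.}-\bar{y}_{i^{'}.})=\sigma^2\left(\frac{1}{n_i}+\frac{1}{n_{i^{'}}}\right).$$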
9. Derive the F test for testing the equality of the effects of
a. all treatments in one-way classification model
b. all treatments/blocks in two-way classification model
   Express the various sums of squares used in the above tests in an ANOVA table.
10. Consider the model $y_i=\mu_i+\varepsilon_i$, $i=1,2,\cdots,n$, where the parameters $\mu_i$ are subject to the condition $\sum_{i=1}^n{\mu_i}=0$. Obtain the normal equations and their solutions. Is $\mu_i$ estimable?
11. Given that $y_i$, $i=1,2,\cdots,n$, are independent normal variates with common mean $\mu$ and variance $\sigma^2$, write the model in the Gauss-Markov setup and obtain the test for $H_0:\mu=0$.
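A sketch: here $X=\underline{1}_n$, so the normal equation is $n\mu=\sum_{i=1}^n y_i$, giving $\hat{\mu}=\bar{y}$ and $SSE=\sum_{i=1}^n(y_i-\bar{y})^2$ with $n-1$ degrees of freedom. Under $H_0:\mu=0$,
$$F=\frac{n\bar{y}^2}{SSE/(n-1)}\sim F_{1,\,n-1},$$
which is equivalent to the usual one-sample $t$ test with $t=\sqrt{n}\,\bar{y}/s$.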
12. For a one-way model with 4 treatments $T_1,T_2,T_3,T_4$ having $n_1,n_2,n_3,n_4$ observations respectively, obtain the following
a. The normal equations and their solution
b. The variance of the BLUE of a contrast in the treatment effects.
13. Consider $E(Y_1)=\beta_1+\beta_2-\beta_3$, $E(Y_2)=E(Y_4)=\beta_2-\beta_4$, $E(Y_3)=\beta_1+\beta_2$ and $Cov(\underline{Y})=\sigma^2 I_4$
a. Check whether the above model is a full-rank or a non-full-rank model.
b. Obtain the rank of the estimation space and the rank of the error space.
c. Obtain one solution of the normal equations and hence the BLUE of $\beta_1+\beta_2$, if it is an estimable parametric function (see the sketch below).
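One way to work this question, assuming the observations are stacked in the order $Y_1,Y_2,Y_3,Y_4$ so that the design matrix $X$ has rows $(1,1,-1,0)$, $(0,1,0,-1)$, $(1,1,0,0)$, $(0,1,0,-1)$: the second and fourth rows coincide while the remaining three rows are linearly independent, so rank$(X)=3<4$ and the model is not of full rank. The estimation space has rank 3 and the error space has rank $4-3=1$. Since $\beta_1+\beta_2=E(Y_3)$ and $(0,0,1,0)^{'}$ equals the sum of the first and third columns of $X$, $Y_3$ is itself the BLUE of $\beta_1+\beta_2$ (by question 3(e)).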
14. Let $y_{ij}=\mu_i+\varepsilon_{ij}$, $i=1,2,3$, $j=1,2$, where $\varepsilon_{ij}\sim N(0,\sigma^2)$. Obtain $SSH_0$ for testing the hypothesis $H_0:\mu_1=2\mu_2=3\mu_3$.
15. Let $E(Y_1)=2\mu$, $E(Y_2)=\mu$ and
$$Cov(\underline{Y})=\sigma^{2}\begin{pmatrix} 2 & 1 \\ 1 & 2 \end{pmatrix}.$$

Obtain an unbiased estimator of $\sigma^{2}$.
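One route, using a function belonging to the error space: $E(Y_1-2Y_2)=2\mu-2\mu=0$, and
$$Var(Y_1-2Y_2)=2\sigma^2+4(2\sigma^2)-4(\sigma^2)=6\sigma^2,$$
so $E\left[(Y_1-2Y_2)^2\right]=6\sigma^2$ and $(Y_1-2Y_2)^2/6$ is an unbiased estimator of $\sigma^2$.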
