## Linear Models in Statistics

*Linear models made easy with this unique introduction.*

*Linear Models in Statistics* discusses classical linear models from a matrix algebra perspective, making the subject easily accessible to readers encountering linear models for the first time. It provides a solid foundation from which to explore the literature and correctly interpret the output of computer packages, and it brings together a number of approaches to regression and analysis of variance from which more experienced practitioners will also benefit.

With an emphasis on broad coverage of essential topics, *Linear Models in Statistics* carefully develops the basic theory of regression and analysis of variance, illustrating it with examples from a wide range of disciplines. Other features of this remarkable work include:

* Easy-to-read proofs and clear explanations of concepts and procedures
* Special topics such as multiple regression with random x's and the effect of each variable on R²
* Advanced topics such as mixed and generalized linear models as well as logistic and nonlinear regression
* The use of real data sets in examples, with all data sets available over the Internet
* Numerous theoretical and applied problems, with answers in an appendix
* A thorough review of the requisite matrix algebra
* Graphs, charts, and tables as well as extensive references
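The matrix-algebra approach the book emphasizes can be illustrated with a minimal least-squares sketch. This is not an example from the book; the data are simulated, and the names (`beta_hat`, `r_squared`) are illustrative. The fit corresponds to the normal equations β̂ = (X′X)⁻¹X′y, solved here via a numerically stable least-squares routine:

```python
import numpy as np

# Simulated data for illustration (the book itself uses real data sets):
# a design matrix X with an intercept column and two predictors.
rng = np.random.default_rng(0)
n = 50
X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])
beta_true = np.array([1.0, 2.0, -0.5])
y = X @ beta_true + rng.normal(scale=0.1, size=n)

# Least squares estimator: solves the normal equations (X'X) beta = X'y.
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)

# Residuals and the coefficient of determination R^2.
residuals = y - X @ beta_hat
ss_res = residuals @ residuals
ss_tot = ((y - y.mean()) ** 2).sum()
r_squared = 1 - ss_res / ss_tot
print(beta_hat, r_squared)
```

With the small noise level used here, the estimated coefficients land close to `beta_true` and R² is near 1, matching the intuition that least squares recovers the linear structure when the model is correctly specified.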


### Contents

| Section | Page |
|---|---|
| Matrix Algebra | 5 |
| Random Vectors and Matrices | 62 |
| Multivariate Normal Distribution | 77 |
| Copyright | |

15 other sections not shown.

