
Matrix Algebra Useful for Statistics

Book | Hardcover
512 pages
2017 | 2nd edition
John Wiley & Sons Inc (publisher)
978-1-118-93514-9 (ISBN)
CHF 205.10 incl. VAT
  • Ships in 10-15 days
  • Free shipping
  • Payment on invoice available
A thoroughly updated guide to matrix algebra and its uses in statistical analysis, featuring SAS®, MATLAB®, and R throughout

This Second Edition addresses matrix algebra that is useful in the statistical analysis of data as well as within statistics as a whole. The material is presented in an explanatory style rather than a formal theorem-proof format and is self-contained. Featuring numerous applied illustrations, numerical examples, and exercises, the book has been updated to include the use of SAS, MATLAB, and R for the execution of matrix computations. In addition, André I. Khuri, who has extensive research and teaching experience in the field, joins this new edition as co-author. The Second Edition also:



  • Contains new coverage of vector spaces and linear transformations and discusses computational aspects of matrices
  • Covers the analysis of balanced linear models using direct products of matrices
  • Analyzes multiresponse linear models in which several responses can be of interest
  • Includes extensive use of SAS, MATLAB, and R throughout
  • Contains over 400 examples and exercises to reinforce understanding, along with select solutions
  • Includes plentiful new illustrations depicting the importance of geometry, as well as historical interludes

Matrix Algebra Useful for Statistics, Second Edition is an ideal textbook for advanced undergraduate and first-year graduate courses in statistics and related disciplines. The book is also appropriate as a reference for independent readers who use statistics and wish to improve their knowledge of matrix algebra.
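To give a flavor of the kind of matrix computations the book carries out in SAS/IML, MATLAB, and R (Chapters 15-17), and of the linear-model, eigenvalue, and generalized-inverse material in Parts I and II, here is a minimal sketch in Python/NumPy. It is an illustration only and is not taken from the book: the choice of Python, the invented data, and all variable names are assumptions of this listing, not the authors'.

import numpy as np

rng = np.random.default_rng(0)

# A small design matrix X and response y for a full-rank linear model
# y = X*beta + e (cf. Chapter 11); the data are invented for illustration.
n, p = 12, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, p - 1))])
beta_true = np.array([1.0, 2.0, -0.5])
y = X @ beta_true + rng.normal(scale=0.3, size=n)

# Ordinary least squares: solve the normal equations X'X b = X'y.
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)

# Eigenvalues and eigenvectors of the symmetric matrix X'X (cf. Chapter 6).
eigvals, eigvecs = np.linalg.eigh(X.T @ X)

# Moore-Penrose generalized inverse via the singular-value decomposition
# (cf. Sections 7.4 and 8.1-8.2); for a full-column-rank X, pinv(X) @ y
# reproduces the least-squares estimate.
X_pinv = np.linalg.pinv(X)
assert np.allclose(X_pinv @ y, beta_hat)

print("beta_hat =", beta_hat)
print("eigenvalues of X'X =", eigvals)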

The late Shayle R. Searle, PhD, was Professor Emeritus of Biometry at Cornell University. He was the author of Linear Models for Unbalanced Data and Linear Models and co-author of Generalized, Linear, and Mixed Models, Second Edition; Matrix Algebra for Applied Economics; and Variance Components, all published by Wiley. Dr. Searle received the Alexander von Humboldt Senior Scientist Award, and he was an honorary fellow of the Royal Society of New Zealand.

André I. Khuri, PhD, is Professor Emeritus of Statistics at the University of Florida. He is the author of Advanced Calculus with Applications in Statistics, Second Edition and co-author of Statistical Tests for Mixed Linear Models, both published by Wiley. Dr. Khuri is a member of numerous academic associations, among them the American Statistical Association and the Institute of Mathematical Statistics.


Preface xvii

Preface to the First Edition xix

Introduction xxi

About the Companion Website xxxi

Part I Definitions, Basic Concepts, and Matrix Operations 1

1 Vector Spaces, Subspaces, and Linear Transformations 3

1.1 Vector Spaces 3

1.1.1 Euclidean Space 3

1.2 Base of a Vector Space 5

1.3 Linear Transformations 7

1.3.1 The Range and Null Spaces of a Linear Transformation 8

Reference 9

Exercises 9

2 Matrix Notation and Terminology 11

2.1 Plotting of a Matrix 14

2.2 Vectors and Scalars 16

2.3 General Notation 16

Exercises 17

3 Determinants 21

3.1 Expansion by Minors 21

3.1.1 First- and Second-Order Determinants 22

3.1.2 Third-Order Determinants 23

3.1.3 n-Order Determinants 24

3.2 Formal Definition 25

3.3 Basic Properties 27

3.3.1 Determinant of a Transpose 27

3.3.2 Two Rows the Same 28

3.3.3 Cofactors 28

3.3.4 Adding Multiples of a Row (Column) to a Row (Column) 30

3.3.5 Products 30

3.4 Elementary Row Operations 34

3.4.1 Factorization 35

3.4.2 A Row (Column) of Zeros 36

3.4.3 Interchanging Rows (Columns) 36

3.4.4 Adding a Row to a Multiple of a Row 36

3.5 Examples 37

3.6 Diagonal Expansion 39

3.7 The Laplace Expansion 42

3.8 Sums and Differences of Determinants 44

3.9 A Graphical Representation of a 3 × 3 Determinant 45

References 46

Exercises 47

4 Matrix Operations 51

4.1 The Transpose of a Matrix 51

4.1.1 A Reflexive Operation 52

4.1.2 Vectors 52

4.2 Partitioned Matrices 52

4.2.1 Example 52

4.2.2 General Specification 54

4.2.3 Transposing a Partitioned Matrix 55

4.2.4 Partitioning Into Vectors 55

4.3 The Trace of a Matrix 55

4.4 Addition 56

4.5 Scalar Multiplication 58

4.6 Equality and the Null Matrix 58

4.7 Multiplication 59

4.7.1 The Inner Product of Two Vectors 59

4.7.2 A Matrix–Vector Product 60

4.7.3 A Product of Two Matrices 62

4.7.4 Existence of Matrix Products 65

4.7.5 Products With Vectors 65

4.7.6 Products With Scalars 68

4.7.7 Products With Null Matrices 68

4.7.8 Products With Diagonal Matrices 68

4.7.9 Identity Matrices 69

4.7.10 The Transpose of a Product 69

4.7.11 The Trace of a Product 70

4.7.12 Powers of a Matrix 71

4.7.13 Partitioned Matrices 72

4.7.14 Hadamard Products 74

4.8 The Laws of Algebra 74

4.8.1 Associative Laws 74

4.8.2 The Distributive Law 75

4.8.3 Commutative Laws 75

4.9 Contrasts With Scalar Algebra 76

4.10 Direct Sum of Matrices 77

4.11 Direct Product of Matrices 78

4.12 The Inverse of a Matrix 80

4.13 Rank of a Matrix—Some Preliminary Results 82

4.14 The Number of LIN Rows and Columns in a Matrix 84

4.15 Determination of the Rank of a Matrix 85

4.16 Rank and Inverse Matrices 87

4.17 Permutation Matrices 87

4.18 Full-Rank Factorization 89

4.18.1 Basic Development 89

4.18.2 The General Case 91

4.18.3 Matrices of Full Row (Column) Rank 91

References 92

Exercises 92

5 Special Matrices 97

5.1 Symmetric Matrices 97

5.1.1 Products of Symmetric Matrices 97

5.1.2 Properties of AA′ and A′A 98

5.1.3 Products of Vectors 99

5.1.4 Sums of Outer Products 100

5.1.5 Elementary Vectors 101

5.1.6 Skew-Symmetric Matrices 101

5.2 Matrices Having All Elements Equal 102

5.3 Idempotent Matrices 104

5.4 Orthogonal Matrices 106

5.4.1 Special Cases 107

5.5 Parameterization of Orthogonal Matrices 109

5.6 Quadratic Forms 110

5.7 Positive Definite Matrices 113

References 114

Exercises 114

6 Eigenvalues and Eigenvectors 119

6.1 Derivation of Eigenvalues 119

6.1.1 Plotting Eigenvalues 121

6.2 Elementary Properties of Eigenvalues 122

6.2.1 Eigenvalues of Powers of a Matrix 122

6.2.2 Eigenvalues of a Scalar-by-Matrix Product 123

6.2.3 Eigenvalues of Polynomials 123

6.2.4 The Sum and Product of Eigenvalues 124

6.3 Calculating Eigenvectors 125

6.3.1 Simple Roots 125

6.3.2 Multiple Roots 126

6.4 The Similar Canonical Form 128

6.4.1 Derivation 128

6.4.2 Uses 130

6.5 Symmetric Matrices 131

6.5.1 Eigenvalues All Real 132

6.5.2 Symmetric Matrices Are Diagonable 132

6.5.3 Eigenvectors Are Orthogonal 132

6.5.4 Rank Equals Number of Nonzero Eigenvalues for a Symmetric Matrix 135

6.6 Eigenvalues of Orthogonal and Idempotent Matrices 135

6.6.1 Eigenvalues of Symmetric Positive Definite and Positive Semidefinite Matrices 136

6.7 Eigenvalues of Direct Products and Direct Sums of Matrices 138

6.8 Nonzero Eigenvalues of AB and BA 140

References 141

Exercises 141

7 Diagonalization of Matrices 145

7.1 Proving the Diagonability Theorem 145

7.1.1 The Number of Nonzero Eigenvalues Never Exceeds Rank 145

7.1.2 A Lower Bound on r(A − λ_k I) 146

7.1.3 Proof of the Diagonability Theorem 147

7.1.4 All Symmetric Matrices Are Diagonable 147

7.2 Other Results for Symmetric Matrices 148

7.2.1 Non-Negative Definite (n.n.d.) 148

7.2.2 Simultaneous Diagonalization of Two Symmetric Matrices 149

7.3 The Cayley–Hamilton Theorem 152

7.4 The Singular-Value Decomposition 153

References 157

Exercises 157

8 Generalized Inverses 159

8.1 The Moore–Penrose Inverse 159

8.2 Generalized Inverses 160

8.2.1 Derivation Using the Singular-Value Decomposition 161

8.2.2 Derivation Based on Knowing the Rank 162

8.3 Other Names and Symbols 164

8.4 Symmetric Matrices 165

8.4.1 A General Algorithm 166

8.4.2 The Matrix X′X 166

References 167

Exercises 167

9 Matrix Calculus 171

9.1 Matrix Functions 171

9.1.1 Function of Matrices 171

9.1.2 Matrices of Functions 174

9.2 Iterative Solution of Nonlinear Equations 174

9.3 Vectors of Differential Operators 175

9.3.1 Scalars 175

9.3.2 Vectors 176

9.3.3 Quadratic Forms 177

9.4 Vec and Vech Operators 179

9.4.1 Definitions 179

9.4.2 Properties of Vec 180

9.4.3 Vec-Permutation Matrices 180

9.4.4 Relationships Between Vec and Vech 181

9.5 Other Calculus Results 181

9.5.1 Differentiating Inverses 181

9.5.2 Differentiating Traces 182

9.5.3 Derivative of a Matrix with Respect to Another Matrix 182

9.5.4 Differentiating Determinants 183

9.5.5 Jacobians 185

9.5.6 Aitken’s Integral 187

9.5.7 Hessians 188

9.6 Matrices with Elements That Are Complex Numbers 188

9.7 Matrix Inequalities 189

References 193

Exercises 194

Part II Applications of Matrices in Statistics 199

10 Multivariate Distributions and Quadratic Forms 201

10.1 Variance-Covariance Matrices 202

10.2 Correlation Matrices 203

10.3 Matrices of Sums of Squares and Cross-Products 204

10.3.1 Data Matrices 204

10.3.2 Uncorrected Sums of Squares and Products 204

10.3.3 Means, and the Centering Matrix 205

10.3.4 Corrected Sums of Squares and Products 205

10.4 The Multivariate Normal Distribution 207

10.5 Quadratic Forms and χ²-Distributions 208

10.5.1 Distribution of Quadratic Forms 209

10.5.2 Independence of Quadratic Forms 210

10.5.3 Independence and Chi-Squaredness of Several Quadratic Forms 211

10.5.4 The Moment and Cumulant Generating Functions for a Quadratic Form 211

10.6 Computing the Cumulative Distribution Function of a Quadratic Form 213

10.6.1 Ratios of Quadratic Forms 214

References 215

Exercises 215

11 Matrix Algebra of Full-Rank Linear Models 219

11.1 Estimation of β by the Method of Least Squares 220

11.1.1 Estimating the Mean Response and the Prediction Equation 223

11.1.2 Partitioning of Total Variation Corrected for the Mean 225

11.2 Statistical Properties of the Least-Squares Estimator 226

11.2.1 Unbiasedness and Variances 226

11.2.2 Estimating the Error Variance 227

11.3 Multiple Correlation Coefficient 229

11.4 Statistical Properties under the Normality Assumption 231

11.5 Analysis of Variance 233

11.6 The Gauss–Markov Theorem 234

11.6.1 Generalized Least-Squares Estimation 237

11.7 Testing Linear Hypotheses 237

11.7.1 The Use of the Likelihood Ratio Principle in Hypothesis Testing 239

11.7.2 Confidence Regions and Confidence Intervals 241

11.8 Fitting Subsets of the x-Variables 246

11.9 The Use of the R(⋅|⋅) Notation in Hypothesis Testing 247

References 249

Exercises 249

12 Less-Than-Full-Rank Linear Models 253

12.1 General Description 253

12.2 The Normal Equations 256

12.2.1 A General Form 256

12.2.2 Many Solutions 257

12.3 Solving the Normal Equations 257

12.3.1 Generalized Inverses of X′X 258

12.3.2 Solutions 258

12.4 Expected Values and Variances 259

12.5 Predicted y-Values 260

12.6 Estimating the Error Variance 261

12.6.1 Error Sum of Squares 261

12.6.2 Expected Value 262

12.6.3 Estimation 262

12.7 Partitioning the Total Sum of Squares 262

12.8 Analysis of Variance 263

12.9 The R(⋅|⋅) Notation  265

12.10 Estimable Linear Functions 266

12.10.1 Properties of Estimable Functions 267

12.10.2 Testable Hypotheses 268

12.10.3 Development of a Test Statistic for H0 269

12.11 Confidence Intervals 272

12.12 Some Particular Models 272

12.12.1 The One-Way Classification 272

12.12.2 Two-Way Classification, No Interactions, Balanced Data 273

12.12.3 Two-Way Classification, No Interactions, Unbalanced Data 276

12.13 The R(⋅|⋅) Notation (Continued)  277

12.14 Reparameterization to a Full-Rank Model 281

References 282

Exercises 282

13 Analysis of Balanced Linear Models Using Direct Products of Matrices 287

13.1 General Notation for Balanced Linear Models 289

13.2 Properties Associated with Balanced Linear Models 293

13.3 Analysis of Balanced Linear Models 298

13.3.1 Distributional Properties of Sums of Squares 298

13.3.2 Estimates of Estimable Linear Functions of the Fixed Effects 301

References 307

Exercises 308

14 Multiresponse Models 313

14.1 Multiresponse Estimation of Parameters 314

14.2 Linear Multiresponse Models 316

14.3 Lack of Fit of a Linear Multiresponse Model 318

14.3.1 The Multivariate Lack of Fit Test 318

References 323

Exercises 324

Part III Matrix Computations and Related Software 327

15 SAS/IML 329

15.1 Getting Started 329

15.2 Defining a Matrix 329

15.3 Creating a Matrix 330

15.4 Matrix Operations 331

15.5 Explanations of SAS Statements Used Earlier in the Text 354

References 357

Exercises 358

16 Use of MATLAB in Matrix Computations 363

16.1 Arithmetic Operators 363

16.2 Mathematical Functions 364

16.3 Construction of Matrices 365

16.3.1 Submatrices 365

16.4 Two- and Three-Dimensional Plots 371

16.4.1 Three-Dimensional Plots 374

References 378

Exercises 379

17 Use of R in Matrix Computations 383

17.1 Two- and Three-Dimensional Plots 396

17.1.1 Two-Dimensional Plots 397

17.1.2 Three-Dimensional Plots 404

References 408

Exercises 408

Appendix 413

Index 475

Publication date
Series: Wiley Series in Probability and Statistics
Place of publication: New York
Language: English
Dimensions: 180 x 258 mm
Weight: 1021 g
Subject area: Mathematics / Computer Science > Mathematics > Algebra
Subject area: Mathematics / Computer Science > Mathematics > Statistics
Subject area: Mathematics / Computer Science > Mathematics > Probability / Combinatorics
ISBN-10: 1-118-93514-4 / 1118935144
ISBN-13: 978-1-118-93514-9 / 9781118935149
Condition: New