Examples and Problems in Mathematical Statistics
John Wiley & Sons Inc (publisher)
978-1-118-60550-9 (ISBN)
Provides the necessary skills to solve problems in mathematical statistics through theory, concrete examples, and exercises
With a clear and detailed approach to the fundamentals of statistical theory, Examples and Problems in Mathematical Statistics uniquely bridges the gap between theory and application and presents numerous problem-solving examples that illustrate the related notations and proven results.
Written by an established authority in probability and mathematical statistics, the book opens each chapter with a theoretical presentation that introduces the topic and its important results to aid overall comprehension. Examples are then provided, followed by problems and, finally, solutions to some of the earlier problems. In addition, Examples and Problems in Mathematical Statistics features:
Over 160 practical and interesting real-world examples from a variety of fields including engineering, mathematics, and statistics to help readers become proficient in theoretical problem solving
More than 430 unique exercises with select solutions
Key statistical inference topics, such as probability theory, statistical distributions, sufficient statistics, information in samples, testing statistical hypotheses, statistical estimation, confidence and tolerance intervals, large sample theory, and Bayesian analysis
Recommended for graduate-level courses in probability and statistical inference, Examples and Problems in Mathematical Statistics is also an ideal reference for applied statisticians and researchers.
SHELEMYAHU ZACKS, PhD, is Distinguished Professor in the Department of Mathematical Sciences at Binghamton University. He has published several books and more than 170 journal articles on the design and analysis of experiments, statistical control of stochastic processes, statistical decision theory, sequential analysis, reliability, statistical methods in logistics, and sampling from finite populations. A Fellow of the American Statistical Association, the Institute of Mathematical Statistics, and the American Association for the Advancement of Science, Dr. Zacks is the author of Stage-Wise Adaptive Designs, also published by Wiley.
Preface xv
List of Random Variables xvii
List of Abbreviations xix
1 Basic Probability Theory 1
Part I: Theory 1
1.1 Operations on Sets 1
1.2 Algebra and σ-Fields 2
1.3 Probability Spaces 4
1.4 Conditional Probabilities and Independence 6
1.5 Random Variables and Their Distributions 8
1.6 The Lebesgue and Stieltjes Integrals 12
1.6.1 General Definition of Expected Value: The Lebesgue Integral 12
1.6.2 The Stieltjes–Riemann Integral 17
1.6.3 Mixtures of Discrete and Absolutely Continuous Distributions 19
1.6.4 Quantiles of Distributions 19
1.6.5 Transformations 20
1.7 Joint Distributions, Conditional Distributions and Independence 21
1.7.1 Joint Distributions 21
1.7.2 Conditional Expectations: General Definition 23
1.7.3 Independence 26
1.8 Moments and Related Functionals 26
1.9 Modes of Convergence 35
1.10 Weak Convergence 39
1.11 Laws of Large Numbers 41
1.11.1 The Weak Law of Large Numbers (WLLN) 41
1.11.2 The Strong Law of Large Numbers (SLLN) 42
1.12 Central Limit Theorem 44
1.13 Miscellaneous Results 47
1.13.1 Law of the Iterated Logarithm 48
1.13.2 Uniform Integrability 48
1.13.3 Inequalities 52
1.13.4 The Delta Method 53
1.13.5 The Symbols o_p and O_p 55
1.13.6 The Empirical Distribution and Sample Quantiles 55
Part II: Examples 56
Part III: Problems 73
Part IV: Solutions to Selected Problems 93
2 Statistical Distributions 106
Part I: Theory 106
2.1 Introductory Remarks 106
2.2 Families of Discrete Distributions 106
2.2.1 Binomial Distributions 106
2.2.2 Hypergeometric Distributions 107
2.2.3 Poisson Distributions 108
2.2.4 Geometric, Pascal, and Negative Binomial Distributions 108
2.3 Some Families of Continuous Distributions 109
2.3.1 Rectangular Distributions 109
2.3.2 Beta Distributions 111
2.3.3 Gamma Distributions 111
2.3.4 Weibull and Extreme Value Distributions 112
2.3.5 Normal Distributions 113
2.3.6 Normal Approximations 114
2.4 Transformations 118
2.4.1 One-to-One Transformations of Several Variables 118
2.4.2 Distribution of Sums 118
2.4.3 Distribution of Ratios 118
2.5 Variances and Covariances of Sample Moments 120
2.6 Discrete Multivariate Distributions 122
2.6.1 The Multinomial Distribution 122
2.6.2 Multivariate Negative Binomial 123
2.6.3 Multivariate Hypergeometric Distributions 124
2.7 Multinormal Distributions 125
2.7.1 Basic Theory 125
2.7.2 Distribution of Subvectors and Distributions of Linear Forms 127
2.7.3 Independence of Linear Forms 129
2.8 Distributions of Symmetric Quadratic Forms of Normal Variables 130
2.9 Independence of Linear and Quadratic Forms of Normal Variables 132
2.10 The Order Statistics 133
2.11 t-Distributions 135
2.12 F-Distributions 138
2.13 The Distribution of the Sample Correlation 142
2.14 Exponential Type Families 144
2.15 Approximating the Distribution of the Sample Mean: Edgeworth and Saddlepoint Approximations 146
2.15.1 Edgeworth Expansion 147
2.15.2 Saddlepoint Approximation 149
Part II: Examples 150
Part III: Problems 167
Part IV: Solutions to Selected Problems 181
3 Sufficient Statistics and the Information in Samples 191
Part I: Theory 191
3.1 Introduction 191
3.2 Definition and Characterization of Sufficient Statistics 192
3.2.1 Introductory Discussion 192
3.2.2 Theoretical Formulation 194
3.3 Likelihood Functions and Minimal Sufficient Statistics 200
3.4 Sufficient Statistics and Exponential Type Families 202
3.5 Sufficiency and Completeness 203
3.6 Sufficiency and Ancillarity 205
3.7 Information Functions and Sufficiency 206
3.7.1 The Fisher Information 206
3.7.2 The Kullback–Leibler Information 210
3.8 The Fisher Information Matrix 212
3.9 Sensitivity to Changes in Parameters 214
3.9.1 The Hellinger Distance 214
Part II: Examples 216
Part III: Problems 230
Part IV: Solutions to Selected Problems 236
4 Testing Statistical Hypotheses 246
Part I: Theory 246
4.1 The General Framework 246
4.2 The Neyman–Pearson Fundamental Lemma 248
4.3 Testing One-Sided Composite Hypotheses in MLR Models 251
4.4 Testing Two-Sided Hypotheses in One-Parameter Exponential Families 254
4.5 Testing Composite Hypotheses with Nuisance Parameters—Unbiased Tests 256
4.6 Likelihood Ratio Tests 260
4.6.1 Testing in Normal Regression Theory 261
4.6.2 Comparison of Normal Means: The Analysis of Variance 265
4.7 The Analysis of Contingency Tables 271
4.7.1 The Structure of Multi-Way Contingency Tables and the Statistical Model 271
4.7.2 Testing the Significance of Association 271
4.7.3 The Analysis of 2 × 2 Tables 273
4.7.4 Likelihood Ratio Tests for Categorical Data 274
4.8 Sequential Testing of Hypotheses 275
4.8.1 The Wald Sequential Probability Ratio Test 276
Part II: Examples 283
Part III: Problems 298
Part IV: Solutions to Selected Problems 307
5 Statistical Estimation 321
Part I: Theory 321
5.1 General Discussion 321
5.2 Unbiased Estimators 322
5.2.1 General Definition and Example 322
5.2.2 Minimum Variance Unbiased Estimators 322
5.2.3 The Cramér–Rao Lower Bound for the One-Parameter Case 323
5.2.4 Extension of the Cramér–Rao Inequality to Multiparameter Cases 326
5.2.5 General Inequalities of the Cramér–Rao Type 327
5.3 The Efficiency of Unbiased Estimators in Regular Cases 328
5.4 Best Linear Unbiased and Least-Squares Estimators 331
5.4.1 BLUEs of the Mean 331
5.4.2 Least-Squares and BLUEs in Linear Models 332
5.4.3 Best Linear Combinations of Order Statistics 334
5.5 Stabilizing the LSE: Ridge Regressions 335
5.6 Maximum Likelihood Estimators 337
5.6.1 Definition and Examples 337
5.6.2 MLEs in Exponential Type Families 338
5.6.3 The Invariance Principle 338
5.6.4 MLE of the Parameters of Tolerance Distributions 339
5.7 Equivariant Estimators 341
5.7.1 The Structure of Equivariant Estimators 341
5.7.2 Minimum MSE Equivariant Estimators 343
5.7.3 Minimum Risk Equivariant Estimators 343
5.7.4 The Pitman Estimators 344
5.8 Estimating Equations 346
5.8.1 Moment-Equations Estimators 346
5.8.2 General Theory of Estimating Functions 347
5.9 Pretest Estimators 349
5.10 Robust Estimation of the Location and Scale Parameters of Symmetric Distributions 349
Part II: Examples 353
Part III: Problems 381
Part IV: Solutions of Selected Problems 393
6 Confidence and Tolerance Intervals 406
Part I: Theory 406
6.1 General Introduction 406
6.2 The Construction of Confidence Intervals 407
6.3 Optimal Confidence Intervals 408
6.4 Tolerance Intervals 410
6.5 Distribution Free Confidence and Tolerance Intervals 412
6.6 Simultaneous Confidence Intervals 414
6.7 Two-Stage and Sequential Sampling for Fixed Width Confidence Intervals 417
Part II: Examples 421
Part III: Problems 429
Part IV: Solutions to Selected Problems 433
7 Large Sample Theory for Estimation and Testing 439
Part I: Theory 439
7.1 Consistency of Estimators and Tests 439
7.2 Consistency of the MLE 440
7.3 Asymptotic Normality and Efficiency of Consistent Estimators 442
7.4 Second-Order Efficiency of BAN Estimators 444
7.5 Large Sample Confidence Intervals 445
7.6 Edgeworth and Saddlepoint Approximations to the Distribution of the MLE: One-Parameter Canonical Exponential Families 446
7.7 Large Sample Tests 448
7.8 Pitman’s Asymptotic Efficiency of Tests 449
7.9 Asymptotic Properties of Sample Quantiles 451
Part II: Examples 454
Part III: Problems 475
Part IV: Solutions of Selected Problems 479
8 Bayesian Analysis in Testing and Estimation 485
Part I: Theory 485
8.1 The Bayesian Framework 486
8.1.1 Prior, Posterior, and Predictive Distributions 486
8.1.2 Noninformative and Improper Prior Distributions 487
8.1.3 Risk Functions and Bayes Procedures 489
8.2 Bayesian Testing of Hypotheses 491
8.2.1 Testing Simple Hypotheses 491
8.2.2 Testing Composite Hypotheses 493
8.2.3 Bayes Sequential Testing of Hypotheses 495
8.3 Bayesian Credibility and Prediction Intervals 501
8.3.1 Credibility Intervals 501
8.3.2 Prediction Intervals 501
8.4 Bayesian Estimation 502
8.4.1 General Discussion and Examples 502
8.4.2 Hierarchical Models 502
8.4.3 The Normal Dynamic Linear Model 504
8.5 Approximation Methods 506
8.5.1 Analytical Approximations 506
8.5.2 Numerical Approximations 508
8.6 Empirical Bayes Estimators 513
Part II: Examples 514
Part III: Problems 549
Part IV: Solutions of Selected Problems 557
9 Advanced Topics in Estimation Theory 563
Part I: Theory 563
9.1 Minimax Estimators 563
9.2 Minimum Risk Equivariant, Bayes Equivariant, and Structural Estimators 565
9.2.1 Formal Bayes Estimators for Invariant Priors 566
9.2.2 Equivariant Estimators Based on Structural Distributions 568
9.3 The Admissibility of Estimators 570
9.3.1 Some Basic Results 570
9.3.2 The Inadmissibility of Some Commonly Used Estimators 575
9.3.3 Minimax and Admissible Estimators of the Location Parameter 582
9.3.4 The Relationship of Empirical Bayes and Stein-Type Estimators of the Location Parameter in the Normal Case 584
Part II: Examples 585
Part III: Problems 592
Part IV: Solutions of Selected Problems 596
References 601
Author Index 613
Subject Index 617
Publication date | April 8, 2014
---|---
Series | Wiley Series in Probability and Statistics
Place of publication | New York
Language | English
Dimensions | 163 x 241 mm
Weight | 1052 g
Subject area | Mathematics / Computer Science ► Mathematics ► Applied Mathematics
 | Mathematics / Computer Science ► Mathematics ► Statistics
 | Technology ► Electrical Engineering / Power Engineering
ISBN-10 | 1-118-60550-0 / 1118605500
ISBN-13 | 978-1-118-60550-9 / 9781118605509
Condition | New