
Experimental Methods for the Analysis of Optimization Algorithms (eBook)

eBook Download: PDF
2010
XXII, 457 pages
Springer Berlin (publisher)
978-3-642-02538-9 (ISBN)

€96.29 incl. VAT
(CHF 93.95)
eBook sales are handled by Lehmanns Media GmbH (Berlin); the price is in euros incl. VAT.
  • Download available immediately

In operations research and computer science it is common practice to evaluate the performance of optimization algorithms on the basis of computational results, and the experimental approach should follow accepted principles that guarantee the reliability and reproducibility of results. However, computational experiments differ from those in other sciences, and the last decade has seen considerable methodological research devoted to understanding the particular features of such experiments and assessing the related statistical methods.

This book consists of methodological contributions on different scenarios of experimental analysis. The first part overviews the main issues in the experimental analysis of algorithms, and discusses the experimental cycle of algorithm development; the second part treats the characterization by means of statistical distributions of algorithm performance in terms of solution quality, runtime and other measures; and the third part collects advanced methods from experimental design for configuring and tuning algorithms on a specific class of instances with the goal of using the least amount of experimentation. The contributor list includes leading scientists in algorithm design, statistical design, optimization and heuristics, and most chapters provide theoretical background and are enriched with case studies.

This book is written for researchers and practitioners in operations research and computer science who wish to improve the experimental assessment of optimization algorithms and, consequently, their design.

Foreword 5
Preface 8
Acknowledgments 8
Contents 10
List of Contributors 18
Chapter 1 Introduction 22
1.1 Optimization Algorithms 22
1.2 Analysis of Algorithms 24
1.2.1 Theoretical Analysis 24
1.2.2 Experimental Analysis 24
1.3 Bridging the Gap Between Theoretical and Empirical Analysis 26
1.4 The Need for Statistics 28
1.5 Book Contents 29
References 33
Part I Overview 35
Chapter 2 The Future of Experimental Research 36
2.1 Introduction 36
2.2 Experimental Goals in Computer Science 38
2.2.1 Improving the Performance 39
2.2.2 Understanding 40
2.3 Problems 40
2.3.1 Problems Related to the Experimental Setup 41
2.3.2 Problems Related to the Significance of Experimental Results 41
2.3.3 Problems Related to High-Level Theory 44
2.4 The New Experimentalism 46
2.5 Experimental Models 47
2.5.1 The Role of Models in Science 47
2.5.2 Representational Models 48
2.5.3 A Framework of Models 49
2.5.4 Mayo’s Learning Model 51
2.5.5 Sequential Parameter Optimization 53
2.5.6 The Large n Problem Revisited 57
2.6 Pitfalls of Experimentation with Randomized Algorithms 57
2.6.1 Floor and Ceiling Effects 58
2.6.2 Confounded Effects 59
2.6.3 Fairness in Parameter Settings 59
2.6.4 Failure Reasons and Prevention 60
2.7 Tools: Measures and Reports 61
2.7.1 Measures 61
2.7.2 Reporting Experiments 63
2.8 Methodology, Open Issues, and Development 64
2.9 Summary 65
References 66
Chapter 3 Design and Analysis of Computational Experiments: Overview 69
3.1 Introduction 69
3.2 Classic Designs and Metamodels 72
3.3 Screening: Sequential Bifurcation 75
3.4 Kriging 76
3.4.1 The Kriging Predictor Variance 78
3.4.2 Designs for Kriging 79
3.5 Optimization 81
3.5.1 Response Surface Methodology (RSM) 81
3.5.2 Kriging and Mathematical Programming 83
3.5.3 Taguchian Robust Optimization 85
3.6 Conclusions 87
References 88
Chapter 4 The Generation of Experimental Data for Computational Testing in Optimization 91
4.1 Introduction 91
4.2 A Protocol 94
4.2.1 Generation Principles and Properties 96
4.2.2 Features of the Model 99
4.2.3 Validating the Data 99
4.3 Applications to Optimization 101
4.3.1 Generalized Assignment and Knapsack 102
4.3.2 Supply Chains 103
4.3.3 Scheduling 105
4.3.4 Graphs and Networks 106
4.3.5 Routing 107
4.3.6 Data Mining 108
4.3.7 Stochastic Programming 109
4.3.8 Intractable Problems 110
4.4 Concluding Remarks 112
References 112
Chapter 5 The Attainment-Function Approach to Stochastic Multiobjective Optimizer Assessment and Comparison 120
5.1 Introduction 120
5.2 Statistics and the Attainment-Function Approach 122
5.2.1 Stochastic Optimizers as Statistical Estimators 122
5.2.2 Optimizer Performance as Statistical Estimator Performance 123
5.2.3 Performance Assessment via Estimation and Hypothesis Testing 125
5.3 Multiobjective Optimizer Outcomes 126
5.3.1 Random Nondominated Point Sets 126
5.3.2 Alternative View: The Attained Set 126
5.4 Multiobjective Optimizer Performance 127
5.4.1 Distribution of a General Random Closed Set: The Capacity Functional 128
5.4.2 Distribution of a Random Nondominated Point Set: The K-th-Order Attainment Function 128
5.5 Partial Aspects of Multiobjective Optimizer Performance 130
5.5.1 Distribution Location: The First-Order Attainment Function 130
5.5.2 Distribution Spread: The Variance Function 133
5.5.3 Inter-Point Dependence Structures: Second and Higher-Order Attainment Functions 134
5.6 Multiobjective Optimizer Performance Assessment: Estimation 136
5.7 Multiobjective Optimizer Performance Comparison: Hypothesis Testing 138
5.7.1 Two-Sided Test Problem 139
5.7.2 Permutation Test Procedure 140
5.7.3 Multistage Testing 141
5.7.4 One-Sided Tests 143
5.8 Discussion and Future Perspectives 144
References 145
Chapter 6 Algorithm Engineering: Concepts and Practice 148
6.1 Why Algorithm Engineering? 148
6.1.1 Early Days and the Pen-and-Paper Era 149
6.1.2 Errors 150
6.2 The Algorithm Engineering Cycle 152
6.3 Current Topics and Issues 154
6.3.1 Properties of and Structures in the Input 155
6.3.2 Large Datasets 156
6.3.3 Memory Efficiency 157
6.3.4 Distributed Systems and Parallelism 160
6.3.5 Approximations and Heuristic Algorithms 161
6.3.6 Succinct Data Structures 162
6.3.7 Time-Critical Settings 163
6.3.8 Robustness 163
6.4 Success Stories 164
6.4.1 Shortest Path Computation 164
6.4.2 Full-Text Indexes 167
6.5 Summary and Outlook 170
References 171
Part II Characterizing Algorithm Performance 176
Chapter 7 Algorithm Survival Analysis 177
7.1 Introduction 177
7.2 Modeling Runtime Distributions 178
7.2.1 Basic Quantities and Concepts 179
7.2.2 Censoring 181
7.2.3 Estimation in Survival Analysis 182
7.2.4 Competing Risks 185
7.3 Model-Based Algorithm Selection 187
7.3.1 RTD of an Algorithm Portfolio 189
7.3.2 Model-Based Time Allocators 190
7.3.3 Algorithms as Competing Risks 191
7.4 Experiments 191
7.5 Related Work 195
7.6 Summary and Outlook 196
References 197
Chapter 8 On Applications of Extreme Value Theory in Optimization 201
8.1 Introduction 201
8.2 Extreme Value Theory 202
8.2.1 Extreme Value Theory for Minima 202
8.2.2 Peaks Over Threshold Method (POT) for Minima 205
8.2.3 Assessment of an Optimizer 207
8.3 Experiments Using Random Search 208
8.3.1 Samples with Simulations Near the Optimum 209
8.3.2 Samples with Simulations Away from the Optimum 210
8.4 Analytical Results 211
8.5 Experiments Using Evolution Strategies 215
8.6 Summary 221
References 222
Chapter 9 Exploratory Analysis of Stochastic Local Search Algorithms in Biobjective Optimization 224
9.1 Introduction 224
9.2 Stochastic Local Search for Multiobjective Problems 225
9.3 Examination of the Attainment Surfaces 227
9.3.1 The eafplot.pl Program 228
9.3.2 Example Application of eafplot.pl 229
9.4 Examining the Differences Between EAFs 230
9.5 Examples 232
9.5.1 Effect of Problem Structure 232
9.5.2 Differences in Algorithmic Performance 233
9.5.3 Biased Behavior 234
9.6 Summary and Outlook 235
References 236
Part III Algorithm Configuration and Tuning 238
Chapter 10 Mixed Models for the Analysis of Optimization Algorithms 239
10.1 Introduction 239
10.2 Experimental Designs and Statistical Modeling 242
10.2.1 Case (-, q(-), r): Random-Effects Design 243
10.2.2 Case (N, q(-), r): Mixed-Effects Design 247
10.2.3 Case (-, q(M), r): Nested-Effects Design 250
10.2.4 Case (N, q(M), r): General Mixed-Effects Design 251
10.3 An Application Example in Optimization Heuristic Design 254
10.3.1 Definitions and Problem Formulation 254
10.3.2 Local Search Algorithms 255
10.3.3 Problem Instances 256
10.4 Numerical Examples 256
10.4.1 Case (-, q(-), r): Random-Effects Design 257
10.4.2 Case (N, q(-), r): Mixed-Effects Design 261
10.4.3 Case (-, q(M), r): Nested Design 270
10.4.4 Case (N, q(M), r): General Design 272
10.5 Summary and Outlook 274
References 276
Chapter 11 Tuning an Algorithm Using Design of Experiments 279
11.1 Introduction 279
11.2 Research Questions Addressed with DOE 280
11.3 Experiment Designs 280
11.3.1 Full and 2^k Factorial Designs 281
11.3.2 Fractional Factorial Designs 281
11.3.3 Response Surface Designs 284
11.3.4 Efficiency of Fractional Factorial Designs 285
11.4 Error, Significance, Power, and Replicates 285
11.5 Benchmarking the Experimental Testbed 286
11.6 Case Study 287
11.6.1 Problem Instances 287
11.6.2 Stopping Criterion 288
11.6.3 Response Variables 288
11.6.4 Factors, Levels and Ranges 288
11.6.5 Model Fitting 291
11.6.6 Results 293
11.6.7 Discussion 298
11.6.8 Summary 299
References 299
Chapter 12 Using Entropy for Parameter Analysis of Evolutionary Algorithms 301
12.1 Introduction and Background 301
12.2 Evolutionary Algorithms 302
12.3 EA Design, EA Parameters 305
12.4 Shannon and Differential Entropy 308
12.4.1 Using Success Ranges for Relevance Estimation 308
12.4.2 Shannon Entropy 309
12.4.3 Using the Shannon Entropy for Relevance Estimation 309
12.4.4 Differential Entropy 311
12.4.5 Joint Entropy 312
12.5 Estimating Entropy 312
12.5.1 REVAC: the Algorithm 314
12.5.2 REVAC: the Data Generated 315
12.6 Case Study 316
12.6.1 Experimental Setup 317
12.6.2 Entropy of Parameters 318
12.6.3 Entropy of Operators 319
12.6.4 Entropy of EAs 320
12.7 Conclusions 321
Acknowledgement 322
References 322
Chapter 13 F-Race and Iterated F-Race: An Overview 325
13.1 Introduction 325
13.2 The Algorithm Configuration Problem 327
13.2.1 The Algorithm Configuration Problem 327
13.2.2 Types of Parameters 329
13.3 F-Race 330
13.3.1 The Racing Approach 330
13.3.2 The Peculiarity of F-Race 331
13.4 The Sampling Strategy for F-Race 334
13.4.1 Full Factorial Design 334
13.4.2 Random Sampling Design 334
13.4.3 Iterated F-Race 335
13.4.4 An Example Iterated F-Race Algorithm 337
13.5 Case Studies 339
13.5.1 Case Study 1: MMAS under Four Parameters 340
13.5.2 Case Study 2: MMAS under Seven Parameters 341
13.5.3 Case Study 3: ACOTSP under 12 Parameters 342
13.6 A Review of F-Race Applications 343
13.7 Summary and Outlook 346
References 346
Chapter 14 The Sequential Parameter Optimization Toolbox 351
14.1 Introduction 351
14.2 Applications 352
14.2.1 Bioinformatics 352
14.2.2 Water-Resource Management 353
14.2.3 Mechanical Engineering 353
14.2.4 Biogas 353
14.2.5 Shipbuilding 354
14.2.6 Fuzzy Operator Trees 354
14.3 Objectives 354
14.4 Elements of the SPOT Framework 355
14.4.1 The General SPOT Scheme 355
14.4.2 SPOT Tasks 356
14.4.3 Running SPOT 358
14.5 Statistical Considerations 359
14.5.1 Sequential Models 359
14.5.2 Residuals and Variance 362
14.5.3 Standardized Variables and Transformations 362
14.5.4 Design Considerations and the Region of Interest 363
14.6 A Case Study: Simple Regression Applied to Evolution Strategies 364
14.6.1 Pre-experimental Planning 364
14.6.2 Performing the First Regression Analysis 365
14.6.3 Steepest Descent 368
14.7 Additional Model Considerations 371
14.8 The Automated Mode 374
14.9 Summary 374
References 375
Chapter 15 Sequential Model-Based Parameter Optimization: An Experimental Investigation of Automated and Interactive Approaches 377
15.1 Introduction 378
15.2 Target Algorithms and Experimental Setup 381
15.3 Existing Methods for Sequential Model-Based Optimization of Noisy Functions 383
15.3.1 General Gaussian Process Regression 383
15.3.2 A Common Framework for Sequential Model-Based Optimization 385
15.3.3 Empirical Comparison of SKO and SPO 390
15.4 Model Quality 394
15.4.1 Choosing the Initial Design 394
15.4.2 Transforming Performance Data 396
15.5 Sequential Experimental Design 397
15.5.1 Intensification Mechanism 398
15.5.2 Expected Improvement Criterion 405
15.5.3 Overall Evaluation 407
15.6 Interactive Exploration of Parameter Space 408
15.6.1 Using SPOT Interactively 409
15.6.2 Further Interactive Tuning Results 416
15.6.3 Comparison of Solutions Found Automatically and Interactively 421
15.6.4 Discussion of the Interactive Approach 422
15.7 Conclusions and Future Work 423
Appendix 424
References 424
Appendix 429
Appendix A A Brief Introduction to Inferential Statistics 430
A.1 Introduction 430
A.1.1 Random Variables 432
A.1.2 Examples of Statistical Models 434
A.2 Point Estimation 439
A.3 Hypothesis Testing 444
A.4 Confidence Intervals 453
A.5 Regression and Modeling 455
A.5.1 Linear Regression 455
A.5.2 Model Fitting 460
References 463
Index 465

Publication date (per publisher): 2 November 2010
Additional info: XXII, 457 p.
Place of publication: Berlin
Language: English
Subject areas: Mathematics / Computer Science > Computer Science > Programming Languages / Tools
Mathematics / Computer Science > Mathematics > Statistics
Mathematics / Computer Science > Mathematics > Probability / Combinatorics
Technology
Economics > Business Administration / Management > Planning / Organization
Keywords: Algorithm Engineering • algorithms • combinatorial optimization • Complexity • evolutionary algorithms • experimental analysis • Experiment designs • extreme value theory • F-race • Heuristics • Inferential statistics • Local Search • Multiobjective Optimization • OPT • Optimization • Scheduling • Sequential parameter optimization (SPO)
ISBN-10: 3-642-02538-2 / 3642025382
ISBN-13: 978-3-642-02538-9 / 9783642025389
PDF (watermarked)
Size: 5.3 MB

DRM: Digital watermark
This eBook contains a digital watermark and is therefore personalized to you. If the eBook is passed on to third parties without authorization, it can be traced back to its source.

File format: PDF (Portable Document Format)
With its fixed page layout, PDF is particularly well suited to technical books with columns, tables, and figures. A PDF can be displayed on almost any device, but is only of limited use on small displays (smartphone, eReader).

System requirements:
PC/Mac: You can read this eBook on a PC or Mac. You need a PDF viewer, e.g. Adobe Reader or Adobe Digital Editions.
eReader: This eBook can be read with (almost) all eBook readers; it is not, however, compatible with the Amazon Kindle.
Smartphone/Tablet: Whether Apple or Android, you can read this eBook. You need a PDF viewer, e.g. the free Adobe Digital Editions app.

Additional feature: Online reading
In addition to downloading it, you can also read this eBook online in your web browser.

Buying eBooks from abroad
For tax law reasons we can only sell eBooks within Germany and Switzerland. Regrettably, we cannot fulfill eBook orders from other countries.
