
Automatic Differentiation: Applications, Theory, and Implementations (eBook)

eBook Download: PDF
2006
XVIII, 370 pages
Springer Berlin (publisher)
978-3-540-28438-3 (ISBN)

€213.99 incl. VAT
(CHF 208.95)
eBook sales are handled by Lehmanns Media GmbH (Berlin) at the price in euros incl. VAT.
  • Download available immediately

This collection covers the state of the art in automatic differentiation theory and practice. Practitioners and students will learn about advances in automatic differentiation techniques and strategies for the implementation of robust and powerful tools. Computational scientists and engineers will benefit from the discussion of applications, which provide insight into effective strategies for using automatic differentiation for design optimization, sensitivity analysis, and uncertainty quantification.



Written for: Computational scientists

Keywords: automatic differentiation, optimization, sensitivity analysis.

Preface 5
Contents 7
List of Contributors 11
Perspectives on Automatic Differentiation: Past, Present, and Future? 18
1 The Algorithmic Approach 19
2 Transformation of Algorithms 20
3 Development of AD 21
4 Present Tasks and Future Prospects 28
5 Beyond AD 31
Backwards Differentiation in AD and Neural Nets: Past Links and New Opportunities 32
1 Introduction and Summary 32
2 Motivations and Early History 34
3 Types of Differentiation Capability We Have Developed 42
Solutions of ODEs with Removable Singularities 52
1 Introduction 52
2 Notation and Some Polynomial Algebra 53
3 Elementary Functions 53
4 Other Functions 59
5 Higher Order Equations 61
6 Open Questions 62
Automatic Propagation of Uncertainties 64
1 Introduction 64
2 Linear Models 65
3 Contrast with Interval Analysis 68
4 Nonlinear Models 69
5 Implementation with Automatic Differentiation 71
6 Validation of Uncertainty Models 73
7 Way Ahead 75
High-Order Representation of Poincaré Maps 76
1 Introduction 76
2 Overview of DA Tools 77
3 Description of the Method 78
4 Examples 80
Computation of Matrix Permanent with Automatic Differentiation 84
1 Introduction 84
2 Formulation 85
3 Methods 87
4 Algorithms 89
5 Discussions and Comments 92
6 Conclusion 93
Computing Sparse Jacobian Matrices Optimally 94
1 Introduction 94
2 Optimal Matrix Compression and Restoration 96
3 Schur Complement Approach 98
4 Combined Determination 100
5 Using Recurring Sparsity Structure in Rows 101
6 Numerical Experiments 103
7 Concluding Remarks 103
Application of AD-based Quasi-Newton Methods to Stiff ODEs 106
1 Introduction 106
2 Quasi-Newton Approximations 108
3 Implementation Details 112
4 Numerical Results 112
5 Conclusions and Outlook 115
Reduction of Storage Requirement by Checkpointing for Time-Dependent Optimal Control Problems in ODEs 116
1 Introduction 116
2 Quasilinearization Techniques 118
3 Nested Reversal Schedules 121
4 Numerical Example 126
5 Conclusion and Outlook 127
Improving the Performance of the Vertex Elimination Algorithm for Derivative Calculation 128
1 Introduction 128
2 Heuristics 130
3 Performance Analysis 131
4 A Statement Reordering Scheme 133
5 A Greedy List Scheduling Algorithm 135
6 Conclusions and Further Work 137
Acknowledgements 137
Flattening Basic Blocks 138
1 The Problem 138
2 Variable Identification 141
3 Removing Ambiguity by Splitting 142
4 Practical Solution 143
5 Splitting into Edge Subgraphs 146
6 Outlook 148
7 Conclusions 150
The Adjoint Data-Flow Analyses: Formalization, Properties, and Applications 152
1 Introduction 152
2 Adjoints by Automatic Differentiation 153
3 Classical Data-Flow Analyses 154
4 Adjoint Data-Flow Analyses 155
5 Application 160
6 Conclusion 162
Semiautomatic Differentiation for Efficient Gradient Computations 164
1 Introduction 164
2 Action on a Mesh 165
3 Some AD Alternatives 166
4 The RAD Package for Reverse AD 169
5 Test Results 170
6 Implications for Source Transformation 174
7 Concluding Remarks 174
Acknowledgment 175
Computing Adjoints with the NAGWare Fortran 95 Compiler 176
1 Aims of the CompAD Project 176
2 Compiler AD – A Motivating Example 177
3 Linearization of the Computational Graph 179
4 Putting AD into the Compiler 181
5 Case Study: Seeding in Forward and Reverse Mode 183
6 Summary, Conclusion, and Outlook 186
Extension of TAPENADE toward Fortran 95 188
1 Introduction 188
2 Nesting of Modules and Subprograms 189
3 Derived Types 190
4 Overloading 191
5 Array Features 193
6 Conclusion 195
A Macro Language for Derivative Definition in ADiMat 198
1 Introduction 198
2 MATLAB in the Context of an AD Tool 199
3 The Macro Language 200
4 Exploiting Structure of a Given Code 205
5 Conclusion and Future Work 205
Transforming Equation-Based Models in Process Engineering 206
1 Introduction 206
2 Dynamic Optimization 207
3 The Intermediate Format CapeML 208
4 ADiCape: Automatic Differentiation of CapeML 210
5 The Overall Structure of the System 213
6 Concluding Remarks and Directions for Future Work 215
Simulation and Optimization of the Tevatron Accelerator 216
1 Introduction 216
2 The Tevatron Accelerator – Machine Description 217
3 The Model, Criteria and Parameters to Control 217
4 Map Methods 218
5 Different Optimization Schemes and Proposals 221
6 Transfer Map Comparison 226
7 Conclusions 226
Periodic Orbits of Hybrid Systems and Parameter Estimation via AD 228
1 Hybrid Systems 230
2 Taylor Series Integration 231
3 Periodic Orbits 232
4 Parameter Estimation 234
5 Software 235
6 Applications 235
7 Conclusions 240
Implementation of Automatic Differentiation Tools for Multicriteria IMRT Optimization 242
1 Introduction 242
2 Methods and Materials 244
3 Results 248
4 Conclusions 251
Application of Targeted Automatic Differentiation to Large-Scale Dynamic Optimization 252
1 Introduction 252
2 Directional Second Order Adjoint Method and AD 254
3 Dynamic Optimization and dSOA 259
4 Conclusions 263
Automatic Differentiation: A Tool for Variational Data Assimilation and Adjoint Sensitivity Analysis for Flood Modeling 266
1 Introduction 267
2 The Adjoint Method 267
3 River Hydraulics 268
4 Catchment Hydrology 274
5 Conclusion 279
Development of an Adjoint for a Complex Atmospheric Model, the ARPS, using TAF 280
1 Introduction 280
2 The Advanced Regional Prediction System 281
3 Transformation of Algorithms in Fortran (TAF) 282
4 Code Generation and Testing 282
5 Testing Results 286
6 Conclusion 289
Tangent Linear and Adjoint Versions of NASA/GMAO’s Fortran 90 Global Weather Forecast Model 292
1 Introduction 292
2 Finite-volume General Circulation Model 293
3 Applying TAF to fvGCM 293
4 Parallelisation 294
5 Linearising around an External Trajectory 297
6 Performance of Generated Code 298
7 Application Example 299
8 Conclusions 301
Efficient Sensitivities for the Spin-Up Phase 302
1 Introduction 302
2 Spin-up Sensitivities 304
3 Implementation 305
4 Numerical Example 307
5 Performance Analysis 308
6 Summary and Outlook 310
Acknowledgements 310
Streamlined Circuit Device Model Development with fREEDA and ADOL-C 312
1 Introduction 312
2 Background 313
3 Transient Circuit Simulation Device Modelling 316
4 Selected Modelling Examples 321
5 Conclusion 324
Adjoint Differentiation of a Structural Dynamics Solver 326
1 Introduction 326
2 Optimisation of the Boom Structure 328
3 Differentiation of the BEAM3D Code 330
4 Performance Issues 335
5 Conclusions 336
Acknowledgements 336
A Bibliography of Automatic Differentiation 338
Comments 338
References 340
Index 372

Backwards Differentiation in AD and Neural Nets: Past Links and New Opportunities (p. 15)

Paul J. Werbos
National Science Foundation, Arlington, VA, USA

pwerbos@nsf.gov

Summary.
Backwards calculation of derivatives – sometimes called the reverse mode, the full adjoint method, or backpropagation – has been developed and applied in many fields. This paper reviews several strands of history, advanced capabilities and types of application – particularly those which are crucial to the development of brain-like capabilities in intelligent control and artificial intelligence.

Key words: Reverse mode, backpropagation, intelligent control, reinforcement learning, neural networks, MLP, recurrent networks, approximate dynamic programming, adjoint, implicit systems

1 Introduction and Summary
Backwards differentiation or "the reverse accumulation of derivatives" has been used in many different fields, under different names, for different purposes. This paper will review that part of the history and concepts which I experienced directly. More importantly, it will describe how reverse differentiation could have more impact across a much wider range of applications.

Backwards differentiation has been used in four main ways known to me:

1. In automatic differentiation (AD), a field well covered by the rest of this book. In AD, reverse differentiation is usually called the "reverse method" or the "adjoint method." However, the term "adjoint method" has actually been used to describe two different generations of methods. Only the newer generation, which Griewank has called "the true adjoint method," captures the full power of the method.

2. In neural networks, where it is normally called "backpropagation" [532, 541, 544]. Surveys have shown that backpropagation is used in a majority of the real-world applications of artificial neural networks (ANNs). This is the stream of work that I know best, and may even claim to have originated.

3. In hand-coded "adjoint" or "dual" subroutines developed for specific models and applications, e.g., [534, 535, 539, 540].

4. In circuit design. Because the calculations of the reverse method are all local, it is possible to insert circuits onto a chip which calculate derivatives backwards physically on the same chip which calculates the quantities being differentiated. Professor Robert Newcomb at the University of Maryland, College Park, is one of the people who has implemented such "adjoint circuits."

Some of us believe that local calculations of this kind must exist in the brain, because the computational capabilities of the brain require some use of derivatives and because mechanisms have been found in the brain which fit this idea.
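The local, backwards accumulation that makes such adjoint circuits possible is the same mechanism used in software reverse-mode AD. As a minimal sketch, a toy tape-based reverse mode in Python (all class and function names here are illustrative, not taken from any of the tools in this volume):

```python
# Toy reverse-mode ("backwards") differentiation: each Var records the
# operations that produced it, so derivatives can be accumulated backwards.
import math

class Var:
    def __init__(self, value, parents=()):
        self.value = value
        self.parents = parents  # list of (parent Var, local partial derivative)
        self.grad = 0.0

    def __add__(self, other):
        return Var(self.value + other.value, [(self, 1.0), (other, 1.0)])

    def __mul__(self, other):
        return Var(self.value * other.value,
                   [(self, other.value), (other, self.value)])

    def sin(self):
        return Var(math.sin(self.value), [(self, math.cos(self.value))])

def backward(output):
    """Accumulate d(output)/d(node) from the output back to every input."""
    output.grad = 1.0
    order, seen = [], set()
    def visit(v):                      # build a topological order of the tape
        if id(v) not in seen:
            seen.add(id(v))
            for parent, _ in v.parents:
                visit(parent)
            order.append(v)
    visit(output)
    for v in reversed(order):          # one backwards sweep; all steps local
        for parent, partial in v.parents:
            parent.grad += partial * v.grad

# f(x, y) = x*y + sin(x); hence df/dx = y + cos(x), df/dy = x
x, y = Var(2.0), Var(3.0)
f = x * y + x.sin()
backward(f)
print(x.grad, y.grad)  # df/dx = 3 + cos(2) ≈ 2.584, df/dy = 2.0
```

One sweep of the function followed by one backwards sweep yields all partial derivatives at once, which is exactly the property that makes the reverse mode attractive both in AD tools and in backpropagation for neural networks.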

These four strands of research could benefit greatly from greater collaboration. For example – the AD community may well have the deepest understanding of how to actually calculate derivatives and to build robust dual subroutines, but the neural network community has worked hard to find many ways of using backpropagation in a wide variety of applications.

The gap between the AD community and the neural network community reminds me of a split I once saw between some people making aircraft engines and people making aircraft bodies.

Publication date (per publisher) 3 Feb 2006
Series Lecture Notes in Computational Science and Engineering
Additional information XVIII, 370 p. 108 illus.
Place of publication Berlin
Language English
Subject areas Mathematics / Computer Science – Computer Science – Software Development
Mathematics / Computer Science – Mathematics
Engineering – Electrical Engineering / Energy Technology
Keywords algorithms • automatic differentiation • Calculus • Modeling • Optimization • Quasi-Newton method • Sensitivity Analysis
ISBN-10 3-540-28438-9 / 3540284389
ISBN-13 978-3-540-28438-3 / 9783540284383
PDF (watermarked)
Size: 6.8 MB

DRM: digital watermark
This eBook contains a digital watermark and is therefore personalized to you. If the eBook is improperly passed on to third parties, it can be traced back to its source.

File format: PDF (Portable Document Format)
With its fixed page layout, PDF is particularly suitable for technical books with columns, tables, and figures. A PDF can be displayed on almost any device, but is only suitable to a limited extent for small displays (smartphone, eReader).

System requirements:
PC/Mac: You can read this eBook on a PC or Mac. You will need a PDF viewer, e.g. Adobe Reader or Adobe Digital Editions.
eReader: This eBook can be read on (almost) all eBook readers. However, it is not compatible with the Amazon Kindle.
Smartphone/Tablet: Whether Apple or Android, you can read this eBook. You will need a PDF viewer, e.g. the free Adobe Digital Editions app.

Additional feature: online reading
In addition to downloading it, you can also read this eBook online in a web browser.

Buying eBooks from abroad
For tax law reasons we can sell eBooks only within Germany and Switzerland. Regrettably, we cannot fulfil eBook orders from other countries.
