Random Matrices -  Madan Lal Mehta

Random Matrices (eBook)

eBook Download: PDF
2004 | 3rd edition
706 pages
Elsevier Science (publisher)
978-0-08-047411-3 (ISBN)
€113.85 incl. VAT (CHF 109.95)
"This book gives a coherent and detailed description of analytical methods devised to study random matrices. These methods are critical to the understanding of various fields in mathematics and mathematical physics, such as nuclear excitations, ultrasonic resonances of structural materials, chaotic systems, and the zeros of the Riemann and other zeta functions. More generally, they apply to the characteristic energies of any sufficiently complicated system and have found, since the publication of the second edition, many new applications in active research areas such as quantum gravity, traffic and communications networks, and stock movement in the financial markets.

This revised and enlarged third edition reflects the latest developments in the field and conveys a greater experience with results previously formulated. For example, the theory of skew-orthogonal and bi-orthogonal polynomials, parallel to that of the widely known and used orthogonal polynomials, is explained here for the first time.

  • Presentation of many new results in one place for the first time.
  • First-time coverage of skew-orthogonal and bi-orthogonal polynomials and their use in the evaluation of some multiple integrals.
  • Fredholm determinants and Painlevé equations.
  • The three Gaussian ensembles (unitary, orthogonal, and symplectic), their n-point correlations, spacing probabilities.
  • Fredholm determinants and inverse scattering theory.
  • Probability densities of random determinants."

Front Cover 1
Random Matrices 4
Copyright Page 5
Contents 6
Preface to the Third Edition 14
Preface to the Second Edition 16
Preface to the First Edition 18
Chapter 1. Introduction 20
1.1. Random Matrices in Nuclear Physics 20
1.2. Random Matrices in Other Branches of Knowledge 24
1.3. A Summary of Statistical Facts about Nuclear Energy Levels 27
1.4. Definition of a Suitable Function for the Study of Level Correlations 29
1.5. Wigner Surmise 32
1.6. Electromagnetic Properties of Small Metallic Particles 34
1.7. Analysis of Experimental Nuclear Levels 35
1.8. The Zeros of the Riemann Zeta Function 35
1.9. Things Worth Consideration, But Not Treated in This Book 49
Chapter 2. Gaussian Ensembles. The Joint Probability Density Function for the Matrix Elements 52
2.1. Preliminaries 52
2.2. Time-Reversal Invariance 53
2.3. Gaussian Orthogonal Ensemble 55
2.4. Gaussian Symplectic Ensemble 57
2.5. Gaussian Unitary Ensemble 61
2.6. Joint Probability Density Function for the Matrix Elements 62
2.7. Gaussian Ensemble of Hermitian Matrices With Unequal Real and Imaginary Parts 67
2.8. Anti-Symmetric Hermitian Matrices 67
Summary of Chapter 2 68
Chapter 3. Gaussian Ensembles. The Joint Probability Density Function for the Eigenvalues 69
3.1. Orthogonal Ensemble 69
3.2. Symplectic Ensemble 73
3.3. Unitary Ensemble 75
3.4. Ensemble of Anti-Symmetric Hermitian Matrices 78
3.5. Gaussian Ensemble of Hermitian Matrices With Unequal Real and Imaginary Parts 79
3.6. Random Matrices and Information Theory 79
Summary of Chapter 3 81
Chapter 4. Gaussian Ensembles. Level Density 82
4.1. The Partition Function 82
4.2. The Asymptotic Formula for the Level Density. Gaussian Ensembles 84
4.3. The Asymptotic Formula for the Level Density. Other Ensembles 86
Summary of Chapter 4 88
Chapter 5. Orthogonal, Skew-Orthogonal and Bi-Orthogonal Polynomials 90
5.1. Quaternions, Pfaffians, Determinants 91
5.2. Average Value of ∏_{j=1}^N f(x_j); Orthogonal and Skew-Orthogonal Polynomials
5.3. Case β = 2; Orthogonal Polynomials
5.4. Case β = 4; Skew-Orthogonal Polynomials of Quaternion Type
5.5. Case β = 1; Skew-Orthogonal Polynomials of Real Type
5.6. Average Value of ∏_{j=1}^N ψ(x_j, y_j); Bi-Orthogonal Polynomials
5.7. Correlation Functions 108
5.8. Proof of Theorem 5.7.1 112
5.9. Spacing Functions 120
5.10. Determinantal Representations 120
5.11. Integral Representations 122
5.12. Properties of the Zeros 125
5.13. Orthogonal Polynomials and the Riemann–Hilbert Problem 126
5.14. A Remark (Balian) 127
Summary of Chapter 5 127
Chapter 6. Gaussian Unitary Ensemble 129
6.1. Generalities 130
6.2. The n-Point Correlation Function 137
6.3. Level Spacings 141
6.4. Several Consecutive Spacings 146
6.5. Some Remarks 153
Summary of Chapter 6 163
Chapter 7. Gaussian Orthogonal Ensemble 165
7.1. Generalities 166
7.2. Correlation and Cluster Functions 167
7.3. Level Spacings. Integration Over Alternate Variables 173
7.4. Several Consecutive Spacings: n = 2r 176
7.5. Several Consecutive Spacings: n = 2r – 1 181
7.6. Bounds for the Distribution Function of the Spacings 187
Summary of Chapter 7 191
Chapter 8. Gaussian Symplectic Ensemble 194
8.1. A Quaternion Determinant 194
8.2. Correlation and Cluster Functions 196
8.3. Level Spacings 198
Summary of Chapter 8 200
Chapter 9. Gaussian Ensembles: Brownian Motion Model 201
9.1. Stationary Ensembles 201
9.2. Nonstationary Ensembles 202
9.3. Some Ensemble Averages 206
Summary of Chapter 9 208
Chapter 10. Circular Ensembles 210
10.1. Orthogonal Ensemble 211
10.2. Symplectic Ensemble 213
10.3. Unitary Ensemble 215
10.4. The Joint Probability Density of the Eigenvalues 216
Summary of Chapter 10 220
Chapter 11. Circular Ensembles (Continued) 222
11.1. Unitary Ensemble. Correlation and Cluster Functions 222
11.2. Unitary Ensemble. Level Spacings 224
11.3. Orthogonal Ensemble. Correlation and Cluster Functions 226
11.4. Orthogonal Ensemble. Level Spacings 232
11.5. Symplectic Ensemble. Correlation and Cluster Functions 235
11.6. Relation Between Orthogonal and Symplectic Ensembles 237
11.7. Symplectic Ensemble. Level Spacings 238
11.8. Brownian Motion Model 240
Summary of Chapter 11 242
Chapter 12. Circular Ensembles. Thermodynamics 243
12.1. The Partition Function 243
12.2. Thermodynamic Quantities 246
12.3. Statistical Interpretation of U and C 248
12.4. Continuum Model for the Spacing Distribution 250
Summary of Chapter 12 255
Chapter 13. Gaussian Ensemble of Anti-Symmetric Hermitian Matrices 256
13.1. Level Density. Correlation Functions 256
13.2. Level Spacings 259
Summary of Chapter 13 262
Chapter 14. A Gaussian Ensemble of Hermitian Matrices With Unequal Real and Imaginary Parts 263
14.1. Summary of Results. Matrix Ensembles From GOE to GUE and Beyond 264
14.2. Matrix Ensembles From GSE to GUE and Beyond 269
14.3. Joint Probability Density for the Eigenvalues 273
14.4. Correlation and Cluster Functions 282
Summary of Chapter 14 283
Chapter 15. Matrices With Gaussian Element Densities But With No Unitary or Hermitian Conditions Imposed 285
15.1. Complex Matrices 285
15.2. Quaternion Matrices 292
15.3. Real Matrices 298
15.4. Determinants: Probability Densities 300
Summary of Chapter 15 305
Chapter 16. Statistical Analysis of a Level-Sequence 306
16.1. Linear Statistic or the Number Variance 309
16.2. Least Square Statistic 313
16.3. Energy Statistic 317
16.4. Covariance of Two Consecutive Spacings 320
16.5. The F-Statistic 321
16.6. The A-Statistic 322
16.7. Statistics Involving Three and Four Level Correlations 322
16.8. Other Statistics 326
Summary of Chapter 16 327
Chapter 17. Selberg's Integral and Its Consequences 328
17.1. Selberg's Integral 328
17.2. Selberg's Proof of Eq. (17.1.3) 330
17.3. Aomoto’s Proof of Eqs. (17.1.4) and (17.1.3) 334
17.4. Other Averages 337
17.5. Other Forms of Selberg’s Integral 337
17.6. Some Consequences of Selberg’s Integral 339
17.7. Normalization Constant for the Circular Ensembles 342
17.8. Averages With Laguerre or Hermite Weights 342
17.9. Connection With Finite Reflection Groups 344
17.10. A Second Generalization of the Beta Integral 346
17.11. Some Related Difficult Integrals 348
Summary to Chapter 17 353
Chapter 18. Asymptotic Behaviour of E_β(0, s) by Inverse Scattering 354
18.1. Asymptotics of λ_n(t) 355
18.2. Asymptotics of Toeplitz Determinants 358
18.3. Fredholm Determinants and the Inverse Scattering Theory 359
18.4. Application of the Gel’fand–Levitan Method 361
18.5. Application of the Marchenko Method 366
18.6. Asymptotic Expansions 369
Summary of Chapter 18 372
Chapter 19. Matrix Ensembles and Classical Orthogonal Polynomials 373
19.1. Unitary Ensemble 374
19.2. Orthogonal Ensemble 376
19.3. Symplectic Ensemble 380
19.4. Ensembles With Other Weights 382
19.5. Conclusion 382
Summary of Chapter 19 383
Chapter 20. Level Spacing Functions E_β(r, s); Inter-relations and Power Series Expansions
20.1. Three Sets of Spacing Functions; Their Inter-Relations
20.2. Relation Between Odd and Even Solutions of Eq. (20.1.13) 387
20.3. Relation Between F_1(z, s) and F_±(z, s) 390
20.4. Relation Between F_4(z, s) and F_±(z, s) 394
20.5. Power Series Expansions of E_β(r, s) 395
Summary of Chapter 20 400
Chapter 21. Fredholm Determinants and Painlevé Equations 401
21.1. Introduction 401
21.2. Proof of Eqs. (21.1.11)–(21.1.17) 404
21.3. Differential Equations for the Functions A, B and S 413
21.4. Asymptotic Expansions for Large Positive t 415
21.5. Fifth and Third Painlevé Transcendents 419
21.6. Solution of Eq. (21.3.6) for Large t 425
Summary of Chapter 21 427
Chapter 22. Moments of the Characteristic Polynomial in the Three Ensembles of Random Matrices 428
22.1. Introduction 428
22.2. Calculation of I_β(n, m; x)
22.3. Special Case of the Gaussian Weight 438
22.4. Average Value of ∏_{i=1}^m det(x_i I − A) ∏_{j=1}^l det(z_j I − A)^{−1} for the Unitary Ensemble 440
Summary of Chapter 22 443
Chapter 23. Hermitian Matrices Coupled in a Chain 445
23.1. General Correlation Function 447
23.2. Proof of Theorem 23.1.1 449
23.3. Spacing Functions 454
23.4. The Generating Function R(z_1, I_1; ...
23.5. The Zeros of the Bi-Orthogonal Polynomials 460
Summary of Chapter 23 467
Chapter 24. Gaussian Ensembles. Edge of the Spectrum 468
24.1. Level Density Near the Inflection Point 469
24.2. Spacing Functions 471
24.3. Differential Equations; Painlevé
Summary to Chapter 24 477
Chapter 25. Random Permutations, Circular Unitary Ensemble (CUE) and Gaussian Unitary Ensemble (GUE) 479
25.1. Longest Increasing Subsequences in Random Permutations 479
25.2. Random Permutations and the Circular Unitary Ensemble 480
25.3. Robinson–Schensted Correspondence 482
25.4. Random Permutations and GUE 487
Summary of Chapter 25 487
Chapter 26. Probability Densities of the Determinants; Gaussian Ensembles
26.1. Introduction 488
26.2. Gaussian Unitary Ensemble 492
26.3. Gaussian Symplectic Ensemble 496
26.4. Gaussian Orthogonal Ensemble 499
26.5. Gaussian Orthogonal Ensemble. Case n = 2m + 1 Odd 501
26.6. Gaussian Orthogonal Ensemble. Case n = 2m Even 502
Summary of Chapter 26 505
Chapter 27. Restricted Trace Ensembles 506
27.1. Fixed Trace Ensemble; Equivalence of Moments
27.2. Probability Density of the Determinant 509
27.3. Bounded Trace Ensembles 511
Summary of Chapter 27 512
Appendices 513
A.1. Numerical Evidence in Favor of Conjectures 1.2.1 and 1.2.2 513
A.2. The Probability Density of the Spacings Resulting from a Random Superposition of n Unrelated Sequences of Energy Levels 514
A.3. Some Properties of Hermitian, Unitary, Symmetric or Self-Dual Matrices 517
A.4. Counting the Dimensions of T_βG and T'_βG 518
A.5. An Integral Over the Unitary Group 519
A.6. The Minimum Value of W 523
A.7. Relation Between R_n, T_n and E(n; 2θ)
A.8. Relation Between E(n; s), F(n; s) and p(n; s)
A.9. The Limit of Σ_{j=0}^{N−1} f_j(x)² 529
A.10. The Limits of Σ_{j=0}^{N−1} f_j(x) f_j(y) 530
A.11. The Fourier Transforms of the Two-Point Cluster Functions 533
A.12. Some Applications of Gram’s Formula 535
A.13. Power Series Expansions of Eigenvalues, of Spheroidal Functions and of Various Probabilities 536
A.14. Numerical Tables of λ_j(s), b_j(s) and E_β(n; s) for β = 1, 2 and 4
A.15. Numerical Values of E_β(0; s), ψ_β(s) and p_β(0; s)
A.16. Proof of Eqs. (21.1.11)–(21.1.16), (24.3.11), (24.3.15) and (24.3.20) Using a Compact Notation 544
A.17. Use of Pfaffians in Some Multiple Integrals 555
A.18. Calculation of Certain Determinants 557
A.19. Power-Series Expansion of I_m(·), Eq. (7.6.12) 569
A.20. Proof of the Inequalities (7.6.15) 570
A.21. Proof of Eqs. (10.1.11) and (10.2.11) 571
A.22. Proof of the Inequality (12.1.5) 572
A.23. Good’s Proof of Eq. (12.1.16) 573
A.24. Some Recurrence Relations and Integrals Used in Chapter 14 574
A.25. Normalization Integral, Eq. (14.1.11) 579
A.26. Another Normalization Integral, Eq. (14.2.9) 583
A.27. Joint Probability Density as a Determinant of a Self-Dual Quaternion Matrix. Section 14.4, Eqs. (14.4.2) and (14.4.5) 584
A.28. Verification of Eq. (14.4.3) 588
A.29. The Limits of J_N(x, y) and D_N(x, y) as N → ∞
A.30. Evaluation of the Integral (15.1.9) for Complex Matrices 592
A.31. A Few Remarks About the Eigenvalues of a Quaternion Real Matrix 596
A.32. Evaluation of the Integral Corresponding to (15.2.9) 598
A.33. Another Proof of Eqs. (15.1.10) and (15.2.10) 601
A.34. Proof of Eq. (15.2.38) 603
A.35. Partial Triangulation of a Matrix 604
A.36. Average Number of Real Eigenvalues of a Real Gaussian Random Matrix 606
A.37. Probability Density of the Eigenvalues of a Real Random Matrix When k of Its Eigenvalues Are Real 607
A.38. Variance of the Number Statistic. Section 16.1 613
A.39. Optimum Linear Statistic. Section 16.1 620
A.40. Mean Value of Δ. Section 16.2 622
A.41. Tables of Functions B_β(x_1, x_2) and P_β(x_1, x_2) for β = 1 and 2 626
A.42. Sums a_{jn}^{(±)} and a_{jn}^{(2)} for n = 1, 2 and 3, Section 20.5 630
A.43. Values of a_{jn}^{(+)}, a_{jn}^{(−)} and a_{jn}^{(2)} for Low Values of j and n 632
A.44. A Personal Recollection 638
A.45. About Painlevé Transcendents 639
A.46. Inverse Power Series Expansions of Sn(t), An(t), Bn(t), etc 648
A.47. Table of Values of an in Eq. (21.4.6) for Small Values of n 651
A.48. Some Remarks About the Numerical Computations 652
A.49. Convolution of Two Gaussian Kernels 653
A.50. Method of the Change of Variables. Wick's Theorem 655
A.51. Some Remarks About the Integral I (k, n), Eq. (25.2.12) 657
A.52. Meijer G-functions for Small and Large Values of the Variable 659
A.53. About Binary Quadratic Forms 661
Notes 664
References 674
Author Index 699
Subject Index 703

1

Introduction


Random matrices first appeared in mathematical statistics in the 1930s but did not attract much attention at the time. In the theory of random matrices one is concerned with the following question. Consider a large matrix whose elements are random variables with given probability laws. Then what can one say about the probabilities of a few of its eigenvalues or of a few of its eigenvectors? This question is pertinent to the understanding of the statistical behaviour of slow neutron resonances in nuclear physics, where it was proposed in the 1950s and intensively studied by physicists. Later the question gained importance in other areas of physics and mathematics, such as the characterization of chaotic systems, elastodynamic properties of structural materials, conductivity in disordered metals, the distribution of the values of the Riemann zeta function on the critical line, enumeration of permutations having certain particularities, counting of certain knots and links, quantum gravity, quantum chromodynamics, string theory, and others (cf. J. Phys. A 36 (2003), special issue: random matrices). The reasons for this pertinence are not yet clear. The impression is that some sort of law of large numbers is in the background. In this chapter we will try to give reasons why one should study random matrices.
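The basic question posed above can be made concrete with a small numerical sketch (illustrative only; the matrix size, random seed, and scaling are choices made here, not taken from the book):

```python
# Fill a large symmetric matrix with Gaussian random entries and ask how
# its eigenvalues are distributed -- the basic question of random matrix theory.
import numpy as np

rng = np.random.default_rng(0)
n = 200
a = rng.standard_normal((n, n))
h = (a + a.T) / 2                # symmetrize: a real symmetric "random Hamiltonian"
eigs = np.linalg.eigvalsh(h)     # real eigenvalues, in ascending order

# After scaling by sqrt(n), the empirical eigenvalue density approaches a
# deterministic limit (Wigner's semicircle law), even though every entry is random.
scaled = eigs / np.sqrt(n)
print(scaled.min(), scaled.max())
```

Although each matrix is random, the scaled spectrum is confined to a fixed interval of order one, a first hint of the "law of large numbers" the text alludes to.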

1.1 Random Matrices in Nuclear Physics


Figure 1.1 shows a typical graph of slow neutron resonances. One sees various peaks of different widths and heights located at various places. Do they have any definite statistical pattern? The locations of the peaks are called nuclear energy levels, their widths the neutron widths, and their heights the transition strengths.

Figure 1.1 Slow neutron resonance cross-sections on thorium 232 and uranium 238 nuclei. Reprinted with permission from The American Physical Society, Rahn et al., Neutron resonance spectroscopy, X, Phys. Rev. C 6, 1854-1869 (1972).

The experimental nuclear physicists have collected vast amounts of data concerning the excitation spectra of various nuclei, such as that shown in Figure 1.1 (Garg et al., 1964, where a detailed description of the experimental work on thorium and uranium energy levels is given; Rosen et al., 1960; Camarda et al., 1973; Liou et al., 1972b). The ground state and low-lying excited states have been impressively explained in terms of an independent particle model in which the nucleons are supposed to move freely in an average potential well (Mayer and Jensen, 1955; Kisslinger and Sorenson, 1960). As the excitation energy increases, more and more nucleons are thrown out of the main body of the nucleus, and the approximation of replacing their complicated interactions with an average potential becomes more and more inaccurate. At still higher excitations the nuclear states are so dense and the intermixing is so strong that it is a hopeless task to try to explain the individual states; but when the complications increase beyond a certain point the situation becomes hopeful again, for we are no longer required to explain the characteristics of every individual state but only their average properties, which is much simpler.

The statistical behaviour of the various energy levels is of prime importance in the study of nuclear reactions. In fact, nuclear reactions may be put into two major classes—fast and slow. In the first case a typical reaction time is of the order of the time taken by the incident nucleon to pass through the nucleus. The wavelength of the incident nucleon is much smaller than the nuclear dimensions, and the time it spends inside the nucleus is so short that it interacts with only a few nucleons inside the nucleus. A typical example is the head-on collision with one nucleon in which the incident nucleon hits and ejects a nucleon, thus giving it almost all its momentum and energy. Consequently in such cases the coherence and the interference effects between incoming and outgoing nucleons are strong.

Another extreme is provided by the slow reactions in which the typical reaction times are two or three orders of magnitude larger. The incident nucleon is trapped and all its energy and momentum are quickly distributed among the various constituents of the target nucleus. It takes a long time before enough energy is again concentrated on a single nucleon to eject it. The compound nucleus lives long enough to forget the manner of its formation, and the subsequent decay is therefore independent of the way in which it was formed.

In the slow reactions, unless the energy of the incident neutron is very sharply defined, a large number of neighboring energy levels of the compound nucleus are involved, hence the importance of an investigation of their average properties, such as the distribution of neutron and radiation widths, level spacings, and fission widths. It is natural that such phenomena, which result from complicated many body interactions, will give rise to statistical theories. We shall concentrate mainly on the average properties of nuclear levels such as level spacings.

According to quantum mechanics, the energy levels of a system are supposed to be described by the eigenvalues of a Hermitian operator H, called the Hamiltonian. The energy level scheme of a system consists in general of a continuum and a certain, perhaps large, number of discrete levels. The Hamiltonian of the system should have the same eigenvalue structure and therefore must operate in an infinite-dimensional Hilbert space. To avoid the difficulty of working with an infinite-dimensional Hilbert space, we make approximations amounting to a truncation: we keep only the part of the Hilbert space that is relevant to the problem at hand, and either forget about the rest or take its effect on the retained part in an approximate manner. Because we are interested in the discrete part of the energy level schemes of various quantum systems, we approximate the true Hilbert space by one having a finite, though large, number of dimensions. Choosing a basis in this space, we represent our Hamiltonians by finite-dimensional matrices. If we can solve the eigenvalue equation
H ψ_i = E_i ψ_i,    i = 1, 2, ..., N,

we shall get all the eigenvalues and eigenfunctions of the system, and any physical information can then be deduced, in principle, from this knowledge. In the case of the nucleus, however, there are two difficulties. First, we do not know the Hamiltonian and, second, even if we did, it would be far too complicated to attempt to solve the corresponding equation.

Therefore from the very beginning we shall be making statistical hypotheses on H, compatible with the general symmetry properties. Choosing a complete set of functions as basis, we represent the Hamiltonian operators H as matrices. The elements of these matrices are random variables whose distributions are restricted only by the general symmetry properties we might impose on the ensemble of operators. And the problem is to get information on the behaviour of its eigenvalues. “The statistical theory will not predict the detailed level sequence of any one nucleus, but it will describe the general appearance and the degree of irregularity of the level structure that is expected to occur in any nucleus which is too complicated to be understood in detail” (Dyson, 1962a).
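The statistical hypothesis described above can be sketched numerically. For time-reversal-invariant systems the relevant ensemble is the Gaussian orthogonal ensemble of Chapter 2: H real symmetric with independent Gaussian entries, so that the ensemble is invariant under any orthogonal change of basis H → O H Oᵀ and no basis is preferred. The variances and the way the orthogonal matrix is generated below are illustrative choices:

```python
# Sketch of a GOE sample: real symmetric, diagonal ~ N(0, 1),
# off-diagonal ~ N(0, 1/2), with no other constraint imposed.
import numpy as np

rng = np.random.default_rng(1)
n = 4
upper = np.triu(rng.standard_normal((n, n)), 1) / np.sqrt(2)
h = np.diag(rng.standard_normal(n)) + upper + upper.T

# The eigenvalues (the "levels") are unchanged under an orthogonal
# change of basis, reflecting the basis-independence of the ensemble.
o, _ = np.linalg.qr(rng.standard_normal((n, n)))   # a random orthogonal matrix
h2 = o @ h @ o.T
print(np.linalg.eigvalsh(h))
print(np.linalg.eigvalsh(h2))
```

The two printed spectra agree to machine precision: the statistical theory constrains only eigenvalue behaviour, not any particular matrix representation.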

In classical statistical mechanics a system may be in any of the many possible states, but one does not ask in which particular state a system is. Here we shall renounce knowledge of the nature of the system itself. “We picture a complex nucleus as a black box in which a large number of particles are interacting according to unknown laws. As in orthodox statistical mechanics we shall consider an ensemble of Hamiltonians, each of which could describe a different nucleus. There is a strong logical expectation, though no rigorous mathematical proof, that an ensemble average will correctly describe the behaviour of one particular system which is under observation. The expectation is strong, because the system might be one of a huge variety of systems, and very few of them will deviate much from a properly chosen ensemble average. On the other hand, our assumption that the ensemble average correctly describes a particular system, say the U239 nucleus, is not compelling. In fact, if this particular nucleus turns out to be far removed from the ensemble average, it will show that the U239 Hamiltonian possesses specific properties of which we are not aware. This, then will prompt one to try to discover the nature and the origin of these properties” (Dyson, 1962b).

Wigner was the first to propose in this connection the hypothesis alluded to, namely that the local statistical behaviour of levels in a simple sequence is identical with the eigenvalues of a random matrix. A simple sequence is one whose levels all have the same spin, parity, and other strictly conserved quantities, if any, which result from the symmetry of the system. The corresponding symmetry requirements are to be imposed on the random matrix. There being no other restriction on the matrix, its elements are taken to be random with, say, a Gaussian distribution. Porter and Rosenzweig (1960a) were the early workers in the field who analyzed the nuclear experimental data made available by Harvey and Hughes (1958), Rosen et al., (1960) and the atomic data compiled by Moore (1949, 1958). They found that the occurrence of two levels close to each other in a...
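Wigner's hypothesis above can be probed numerically: the nearest-neighbour spacings of the eigenvalues of a random real symmetric matrix, rescaled to unit mean, should be close to the Wigner surmise p(s) = (πs/2) exp(−πs²/4) of Section 1.5. A hedged sketch (the matrix size, the restriction to the central bulk, and the crude unfolding by the mean spacing are simplifications made here):

```python
# Nearest-neighbour level spacings of a random symmetric matrix,
# unfolded to unit mean, to be compared with the Wigner surmise.
import numpy as np

rng = np.random.default_rng(2)
n = 1000
a = rng.standard_normal((n, n))
eigs = np.linalg.eigvalsh((a + a.T) / 2)

bulk = eigs[n // 4 : 3 * n // 4]   # central part, where the density is roughly constant
s = np.diff(bulk)
s = s / s.mean()                   # unfold to unit mean spacing

# Under the Wigner surmise the second moment of s is 4/pi ~ 1.273;
# the empirical value should land near it, and small spacings are rare
# (level repulsion), unlike for independent (Poissonian) levels.
print(s.mean(), (s ** 2).mean())
```

The scarcity of near-zero spacings in such samples is exactly the level repulsion that distinguishes eigenvalues of a random matrix from a sequence of independent random levels.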

Published (per publisher): 6 October 2004
Language: English
ISBN-10: 0-08-047411-X / 008047411X
ISBN-13: 978-0-08-047411-3 / 9780080474113