Advances in Geophysics (eBook)

eBook Download: PDF | EPUB
2014 | 1st edition
136 pages
Elsevier Science (publisher)
978-0-12-800371-8 (ISBN)
€170.00 incl. VAT (CHF 165.95)
eBooks are sold by Lehmanns Media GmbH (Berlin) at the price in euros incl. VAT.
  • Download available immediately
A critically acclaimed serialized review journal for over 50 years, Advances in Geophysics is a highly respected publication in the field of geophysics. Since 1952, each volume has been eagerly awaited, frequently consulted, and praised by researchers and reviewers alike. Now in its 55th volume, it contains much material that is still relevant today, making it an essential publication for researchers in all fields of geophysics.
  • Contributions from leading authorities
  • Informs and updates on all the latest developments in the field

Front Cover
Advances in Geophysics
Copyright
Contents
Contributors
Seismic Tomography and the Assessment of Uncertainty
1. Introduction
1.1 Motivation
1.2 Historical Perspective
1.3 Uncertainty in the Age of Big Data
2. Nonuniqueness in Seismic Tomography
2.1 Data Coverage
2.2 Data Noise
2.3 The Parameterization Problem
2.4 The Data Prediction Problem
2.5 The Inverse Problem
3. Practical Assessment Methods
3.1 Covariance and Resolution
3.2 Jackknife and Bootstrap
3.3 Synthetic Reconstruction Tests
3.4 Linear and Iterative Nonlinear Sampling
3.5 Fully Nonlinear Sampling
4. Case Studies
4.1 Synthetic Reconstruction Test: Teleseismic Tomography Example
4.2 Iterative Nonlinear Sampling: Surface Wave Tomography Example
4.3 Transdimensional Inversion: Surface Wave Tomography Example
4.4 Full Waveform Inversion: Resolution Analysis Based on Second-Order Adjoints
5. Concluding Remarks
Acknowledgments
References
Two. El Niño/Southern Oscillation and Selected Environmental Consequences
1 Introduction
2 Fundamentals of El Niño/Southern Oscillation
3 What Triggers El Niño/Southern Oscillation?
4 El Niño/Southern Oscillation in the Past
5 El Niño/Southern Oscillation versus Selected Geophysical Processes and Their Predictions
5.1 Earth Orientation and ENSO
5.2 Climatological and Hydrological ENSO Teleconnections
5.3 Sea Level Change and ENSO
5.3.1 Global and Local Mean Sea Level
5.3.2 Site-Specific Sea Level
6 Concluding Remarks
Acknowledgments
References
Index

2. Nonuniqueness in Seismic Tomography


Nonuniqueness in seismic tomography refers to the situation in which more than one model satisfies the observations; it is a consequence of the ill-posed nature of the problem. The reason this arises is succinctly explained by Snieder (1991): “The inverse problem where one wants to estimate a continuous model with infinitely many degrees of freedom from a finite data set is necessarily ill-posed.” Although this appears to be indisputable, it is nonetheless at odds with a statement made by Aki and Lee (1976) in one of the first papers on seismic tomography, when evaluating the results of their inversion of local earthquake data: “Thus, we confirm our earlier contention that we can obtain a unique solution of our problem when sufficient data are available.” Ostensibly, this might seem contradictory, but in reality it is merely a case of viewing the problem at different stages of the solution process. From the outset, all seismic data sets are finite, so it follows that any number of models, with no restrictions on their flexibility, could be conceived that satisfy the observations to the same level. However, if we impose a limit on the minimum scale length of the model, for example based on the dominant wavelength of the data being exploited (with the argument that the observables are insensitive to variations of smaller scale length), then the range of data-satisfying models is dramatically reduced. Taking this a step further, if we now define an objective function to which the inverse problem is tied, then a unique solution may be possible, particularly if the assumption of linearization is imposed. The statement made by Aki and Lee (1976) above is essentially tied to an inverse problem that has been reduced to this state. However, in most seismic tomography problems, the presence of (often poorly constrained) data noise means that solution uniqueness is difficult to achieve, even if a variety of limiting assumptions are imposed on the permissible variations in structure.
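To make the final point concrete, consider a minimal sketch of such an objective function (the notation below is illustrative and not taken from this chapter). Once the model is reduced to a finite parameter vector $\mathbf{m}$ and the forward problem is linearized, the inversion is commonly posed as a damped least-squares problem:

$$\Phi(\mathbf{m}) = \left(\mathbf{d} - \mathbf{G}\mathbf{m}\right)^{\mathsf{T}} \mathbf{C}_d^{-1} \left(\mathbf{d} - \mathbf{G}\mathbf{m}\right) + \epsilon \left(\mathbf{m} - \mathbf{m}_0\right)^{\mathsf{T}} \mathbf{C}_m^{-1} \left(\mathbf{m} - \mathbf{m}_0\right),$$

where $\mathbf{d}$ is the vector of observed travel times, $\mathbf{G}$ the matrix of Fréchet derivatives, $\mathbf{C}_d$ and $\mathbf{C}_m$ a priori data and model covariance matrices, $\mathbf{m}_0$ a reference model, and $\epsilon$ a damping parameter. For $\epsilon > 0$ the Hessian $\mathbf{G}^{\mathsf{T}}\mathbf{C}_d^{-1}\mathbf{G} + \epsilon\,\mathbf{C}_m^{-1}$ is positive definite, so $\Phi$ has a single minimum; it is in this restricted, regularized sense that a claim of a unique solution, such as that of Aki and Lee (1976), can hold.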
Below, a brief description is provided of the various factors that play a role in constraining the solution of an inverse problem in seismic tomography.

2.1. Data Coverage


Increasing the volume of available data by adding contributions from additional sources or receivers will in many cases produce a better outcome in seismic tomography. However, it is well known that adding more data does not necessarily result in a better constrained inverse problem. This is illustrated by the simple case where a ray path traverses two blocks with different velocities. If a second ray passes through the same blocks with the same ratio of path lengths, the two linear equations relating travel time and slowness are linearly dependent, and so the new ray adds no new information to the inverse problem. Although there are a variety of tomography problems where this issue arises, it is particularly notable when earthquake sources are used (Fishwick & Rawlinson, 2012; Rawlinson, Kennett, Vanacore, Glen, & Fishwick, 2011). In such cases, earthquakes tend to cluster around seismogenic regions (e.g., subduction zones, active faults), and after a period of time most subsequent earthquakes occur within the neighborhood of previous earthquakes, so that they contribute little new structural information to the recorded seismograms.
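The two-block example can be verified numerically with a minimal sketch (the path lengths below are hypothetical, chosen only for illustration): two rays whose path-length ratios through the blocks are identical produce a rank-deficient system of travel time equations.

```python
import numpy as np

# Rows: rays; columns: blocks. Entries are path lengths (km), so that
# travel time t = G @ s for the block slownesses s (s/km).
# Ray 2 crosses both blocks with the same 2:1 ratio as ray 1.
G = np.array([[10.0,  5.0],
              [20.0, 10.0]])
print(np.linalg.matrix_rank(G))   # 1: the second ray adds no new information

# A third ray with a different path-length ratio restores full rank.
G3 = np.vstack([G, [5.0, 15.0]])
print(np.linalg.matrix_rank(G3))  # 2: both block slownesses are now constrained
```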
Figure 3 shows a simple synthetic example, based on Figure 1, which demonstrates this concept. Figure 3(a) is a reconstruction of Figure 1(a), based on a constant velocity starting model, which uses the source–receiver travel times of the paths shown in Figure 1(b) (see Rawlinson et al., 2008, for an explanation of the iterative nonlinear inversion scheme). Figure 3(b) is a repeat of this experiment but with a travel time data set that is twice the size; this is simply accomplished by repeating each source location with a 0.3° perturbation in latitude and longitude. Despite the significant increase in the number of data, the recovered model is virtually identical. Given that the new source locations are perturbed by at least 2% of the model width, one might have expected the reconstruction in Figure 3(b) to be slightly better. However, given the minimum scale length of the anomalies, which is around 5% of the model width, and the nonlinearity of the problem, the lack of improvement is hardly surprising. Regarding the latter, since first-arrival rays are attracted to higher velocity regions, rays from nearby sources tend to bunch together and do little to help constrain structure. In applications involving real observations, the presence of noise, provided that it is uncorrelated, should mean that adding more data, even if they sample identical along-path structure, will improve the result due to an “averaging out” of the noise. The same philosophy lies behind data binning, which is often carried out prior to teleseismic, regional, or global tomography (e.g., Rawlinson & Fishwick, 2012).
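The “averaging out” effect is easy to demonstrate with a short synthetic sketch (the numbers below are hypothetical): for uncorrelated noise, binning N redundant travel times that sample the same path reduces the noise standard deviation by roughly a factor of √N.

```python
import numpy as np

rng = np.random.default_rng(0)
t_true, sigma = 25.0, 0.5             # true travel time (s) and picking noise (s)

for n in (1, 4, 16, 64):              # number of redundant picks per bin
    picks = t_true + sigma * rng.normal(size=(100_000, n))
    binned = picks.mean(axis=1)       # one binned datum per group of n picks
    print(n, round(binned.std(), 3))  # empirical std ~ sigma / sqrt(n)
```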

Figure 3 Two synthetic reconstruction tests based on the structure and path coverage shown in Figure 1. The initial model has a uniform velocity of 3.0 km/s. (a) Data set consists of 280 travel times between 20 sources and 14 receivers; (b) Data set consists of 560 travel times between 40 sources and 14 receivers. For both (a) and (b), the top plot shows the reconstructed model and the bottom plot shows the ray paths superimposed on the input model.
Ray coverage or density maps are often used in seismic tomography to provide insight into the resolving power of seismic data and the quality of a reconstruction (Bijwaard & Spakman, 2000; Nielsen, Thybo, & Solodilov, 1999; Ramachandran, Hyndman, & Brocher, 2006; Walck & Clayton, 1987). However, at best they are an indirect tool, and they have the potential to be misleading. A variant of the ray coverage or density map is to instead plot some measure of the sensitivity of the observables with respect to the model parameters (Chen & Jordan, 2007; Tape et al., 2010). Figure 4 shows the normalized cumulative sensitivity (obtained by summing the Fréchet derivatives at each control node and dividing by the largest value) of the travel time data set in Figure 1; here the underlying grid is interpolated using a smooth mosaic of cubic B-spline functions, which is why the sensitivity plot is smooth. However, the fact that the data are sensitive to a change in the value of a parameter does not automatically mean that the parameter is well resolved. For example, if a unidirectional bundle of rays traverses a pair of cells, each ray travel time will vary if the velocity of either of the cells is changed, but the data cannot discriminate between a change made to the velocity of one cell or the other.

Figure 4 Normalized cumulative sensitivity of the travel time data set with respect to the model parameterization for the synthetic travel time data set shown in Figure 1.
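As a sketch of how a cumulative sensitivity map of this kind can be assembled (the function and array names below are ours; the chapter gives no code): sum the absolute Fréchet derivatives of every datum with respect to each control node, then divide by the largest value. Note that, per the two-cell caveat above, a high score does not guarantee that a node is well resolved.

```python
import numpy as np

def cumulative_sensitivity(frechet):
    # frechet: (n_data, n_nodes) matrix of derivatives dt_i/dm_j
    s = np.abs(frechet).sum(axis=0)   # accumulate |dt/dm| over all data
    return s / s.max()                # normalize by the largest value

# Toy example: 3 travel times, 4 control nodes (hypothetical values).
F = np.array([[0.2, 0.0, 1.1, 0.00],
              [0.4, 0.3, 0.9, 0.00],
              [0.1, 0.2, 0.0, 0.05]])
print(cumulative_sensitivity(F))      # node 3 dominates; node 4 is barely sampled
```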

2.2. Data Noise


Noise is ubiquitous in seismic data and is often very difficult to quantify accurately. For example, with manual picking of phases, it is common for even experienced analysts to disagree on onset times (Leonard, 2000), let alone on a measure of picking uncertainty. Automated picking algorithms (Allen, 1982; Di Stefano et al., 2006; Vassallo, Satriano, & Lomax, 2012; Wang & Houseman, 1997) have the potential to offer more rigorous and consistent estimates of uncertainty; for example, Di Stefano et al. (2006) automatically compute arrival time uncertainty using a quality-weighting algorithm that takes into account waveform sampling rate, spectral density analysis, and signal-to-noise ratio. However, these estimates are calibrated using a series of reference picks and error estimates provided by the user. Picking methods that use some measure of waveform coherence (Chevrot, 2002; Rawlinson & Kennett, 2004; VanDecar & Crosson, 1990) have the potential to produce accurate estimates of relative onset times, and can yield estimates of uncertainty. In the case of Chevrot (2002), picking error is determined by computing the correlation coefficient between each seismic waveform and the optimal waveform (determined by simulated annealing), and comparing the result to the autocorrelation of the optimal waveform; the point where the two correlation functions intersect gives the time delay error. While this may produce good relative estimates of picking uncertainty, it is unclear whether the absolute values are very meaningful. In full waveform tomography, it is the seismogram itself that represents the data, so no explicit picking of phases is usually required. However, particular care is required as to how waveform misfit is defined if imaging artifacts caused by the presence of data noise are to be minimized (Bozdağ, Trampert, & Tromp, 2011). Since noise-induced measurement uncertainties are almost impossible to assess quantitatively for complete waveforms, full waveform inversion mostly operates with data characterized by high signal-to-noise ratios (e.g., Chen et al., 2007; Fichtner et al., 2009; Tape et al., 2010).
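The correlation-intersection idea attributed to Chevrot (2002) above can be illustrated with a simplified sketch (our own toy implementation, not the published algorithm, which obtains the optimal waveform by simulated annealing): pick the delay from the peak of the normalized cross-correlation, then take as the error the lag at which the reference autocorrelation has fallen to that peak level.

```python
import numpy as np

def delay_and_error(w, r, dt):
    """Relative delay of waveform w against reference r (equal-length 1-D
    arrays, sample interval dt), with an error estimate in the spirit of
    Chevrot (2002): the lag where the reference autocorrelation intersects
    the peak cross-correlation level."""
    def ncc(a, b):                    # normalized correlation function vs lag
        a = (a - a.mean()) / a.std()
        b = (b - b.mean()) / b.std()
        return np.correlate(a, b, mode="full") / len(a)

    lags = (np.arange(2 * len(r) - 1) - (len(r) - 1)) * dt
    cxy, crr = ncc(w, r), ncc(r, r)   # cross- and autocorrelation

    delay = lags[np.argmax(cxy)]      # best-fitting time shift
    pos = lags > 0                    # autocorrelation is symmetric; use +lags
    hits = lags[pos][crr[pos] <= cxy.max()]
    error = hits[0] if hits.size else np.nan
    return delay, error

# Toy usage: a Ricker-like pulse delayed by 0.2 s plus weak noise (synthetic).
t = np.arange(0.0, 4.0, 0.01)
r = (1 - 200 * (t - 1.5) ** 2) * np.exp(-100 * (t - 1.5) ** 2)
w = np.interp(t - 0.2, t, r) + 0.02 * np.random.default_rng(1).normal(size=t.size)
print(delay_and_error(w, r, 0.01))    # delay close to 0.2 s, plus an error estimate
```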
It is clear that the presence of data noise is unavoidable in seismic tomography, so it remains to be seen how it influences the analysis of uncertainty in seismic tomography. In general, as the level of data noise increases, the range of data-fitting models increases. Thus, in...

Publication date (per publisher): 25.11.2014
Series editor: Renata Dmowska
Language: English
Subject areas: Natural Sciences / Geosciences / Geology
Natural Sciences / Geosciences / Geophysics
Natural Sciences / Physics & Astronomy
Technology
ISBN-10: 0-12-800371-5 / 0128003715
ISBN-13: 978-0-12-800371-8 / 9780128003718
PDF (Adobe DRM)
Size: 25.3 MB

Copy protection: Adobe DRM
Adobe DRM is a copy-protection scheme intended to prevent misuse of the eBook. On download, the eBook is authorized to your personal Adobe ID; you can then read it only on devices that are also registered to that Adobe ID.

File format: PDF (Portable Document Format)
With its fixed page layout, PDF is particularly well suited to reference works with columns, tables, and figures. A PDF can be displayed on almost all devices, but is only of limited use on small screens (smartphone, eReader).

System requirements:
PC/Mac: You can read this eBook on a PC or Mac. You need an Adobe ID and the free Adobe Digital Editions software. We advise against using the OverDrive Media Console, as it frequently causes problems with Adobe DRM.
eReader: This eBook can be read on (almost) all eBook readers; it is, however, not compatible with the Amazon Kindle.
Smartphone/tablet: Whether Apple or Android, you can read this eBook. You need an Adobe ID and a free app.

Buying eBooks from abroad
For tax law reasons we can only sell eBooks within Germany and Switzerland. Regrettably, we cannot fulfill eBook orders from other countries.

EPUB (Adobe DRM)
Size: 7.9 MB

Copy protection: Adobe DRM (as described for the PDF edition above)

File format: EPUB (Electronic Publication)
EPUB is an open standard for eBooks and is particularly well suited to fiction and general non-fiction. The text reflows dynamically to fit the display and font size, which also makes EPUB a good choice for mobile reading devices.

System requirements and availability abroad: as for the PDF edition above.
