
Entropy and Information Theory (eBook)

Robert M. Gray (Author)

eBook Download: PDF
2011 | 2nd ed. 2011
XXVII, 409 pages
Springer US (Publisher)
978-1-4419-7970-4 (ISBN)
€171.19 incl. VAT
(CHF 167.25)
eBook sales are handled by Lehmanns Media GmbH (Berlin) at the price in euros incl. VAT.
  • Download available immediately

This book is an updated version of the information theory classic, first published in 1990. About one-third of the book is devoted to Shannon source and channel coding theorems; the remainder addresses sources, channels, and codes, and information and distortion measures and their properties.
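
For orientation, the two central objects throughout are the entropy of a source and the per-letter average distortion of a reproduction; in standard notation (mine, not quoted from the book):

H(X) = -\sum_{a \in A} p_X(a) \log p_X(a),
\qquad
d_n(x^n, \hat{x}^n) = \frac{1}{n} \sum_{i=1}^{n} d(x_i, \hat{x}_i),

where p_X is the marginal distribution of the source on its alphabet A and d is a single-letter distortion measure.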

New in this edition:

  • Expanded treatment of stationary or sliding-block codes and their relations to traditional block codes
  • Expanded discussion of results from ergodic theory relevant to information theory
  • Expanded treatment of B-processes -- processes formed by stationary coding of memoryless sources (see the sketch after this list)
  • New material on trading off information and distortion, including the Marton inequality
  • New material on the properties of optimal and asymptotically optimal source codes
  • New material on the relationships of source coding and rate-constrained simulation or modeling of random processes

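To make the sliding-block items above concrete: a sliding-block (stationary) code produces each output symbol by applying one fixed map to a window of the input, and running such a code over a memoryless source yields a B-process. A minimal Python sketch, in which the function name sliding_block_code and the majority-vote map are hypothetical stand-ins rather than anything from the book:

import random

def sliding_block_code(x, f, w):
    # Each output symbol is the fixed map f applied to the
    # (2*w + 1)-symbol window of the input centered at position i;
    # edge positions without a full window are dropped.
    return [f(tuple(x[i - w:i + w + 1])) for i in range(w, len(x) - w)]

# Hypothetical example: an IID (memoryless) binary source.
random.seed(0)
x = [random.randint(0, 1) for _ in range(20)]

# A fixed map: majority vote over a 3-symbol window.
f = lambda win: int(sum(win) >= 2)

y = sliding_block_code(x, f, w=1)
print(x)  # sample path of the memoryless source
print(y)  # sample path of the induced B-process

Unlike a block code, which segments the input into disjoint blocks, the window here advances one symbol at a time, so the output process inherits the stationarity of the input.
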
Significant material not covered in other information theory texts includes stationary/sliding-block codes, a geometric view of information theory provided by process distance measures, and general Shannon coding theorems for asymptotically mean stationary (AMS) sources, which may be neither ergodic nor stationary, and for d-bar continuous channels.
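
As a pointer to the "process distance measures" just mentioned, the prototypical example is Ornstein's d-bar distance; a standard formulation (a sketch in my notation, not quoted from the book) for stationary processes with distributions \mu and \nu and per-letter distortion d is

\bar{d}_n(\mu, \nu) = \min_{\pi \in \Pi(\mu^n, \nu^n)}
  \frac{1}{n} \, E_\pi\left[ \sum_{i=1}^{n} d(X_i, Y_i) \right],
\qquad
\bar{d}(\mu, \nu) = \lim_{n \to \infty} \bar{d}_n(\mu, \nu),

where \Pi(\mu^n, \nu^n) is the set of couplings of the n-dimensional marginals; a d-bar continuous channel is, roughly, one whose output distributions depend continuously on the input in this metric.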



Robert M. Gray is the Alcatel-Lucent Technologies Professor of Communications and Networking in the School of Engineering and Professor of Electrical Engineering at Stanford University. For over four decades he has done research, taught, and published in the areas of information theory and statistical signal processing. He is a Fellow of the IEEE and the Institute for Mathematical Statistics. He has won several professional awards, including a Guggenheim Fellowship, the Society Award and Education Award of the IEEE Signal Processing Society, the Claude E. Shannon Award from the IEEE Information Theory Society, the Jack S. Kilby Signal Processing Medal, Centennial Medal, and Third Millennium Medal from the IEEE, and a Presidential Award for Excellence in Science, Mathematics and Engineering Mentoring (PAESMEM). He is a member of the National Academy of Engineering.

Entropy and Information Theory 3
Preface 7
Contents 13
Introduction 17
Chapter 1 Information Sources 29
1.1 Probability Spaces and Random Variables 29
1.2 Random Processes and Dynamical Systems 33
1.3 Distributions 35
1.4 Standard Alphabets 40
1.5 Expectation 41
1.6 Asymptotic Mean Stationarity 44
1.7 Ergodic Properties 45
Chapter 2 Pair Processes: Channels, Codes, and Couplings 48
2.1 Pair Processes 48
2.2 Channels 49
2.3 Stationarity Properties of Channels 52
2.4 Extremes: Noiseless and Completely Random Channels 56
Noiseless Channels 56
Completely Random Channels 56
2.5 Deterministic Channels and Sequence Coders 57
2.6 Stationary and Sliding-Block Codes 58
Finite-length Sliding-Block Codes 60
Sliding-Block Codes and Partitions 62
B-Processes 62
2.7 Block Codes 64
Block Independent Processes 64
Sliding-Block vs. Block Codes 64
2.8 Random Punctuation Sequences 65
2.9 Memoryless Channels 69
2.10 Finite-Memory Channels 69
2.11 Output Mixing Channels 70
2.12 Block Independent Channels 72
2.13 Conditionally Block Independent Channels 73
2.14 Stationarizing Block Independent Channels 73
2.15 Primitive Channels 75
2.16 Additive Noise Channels 76
2.17 Markov Channels 76
2.18 Finite-State Channels and Codes 77
2.19 Cascade Channels 78
2.20 Communication Systems 79
2.21 Couplings 79
2.22 Block to Sliding-Block: The Rohlin-Kakutani Theorem 80
Partitions 82
Gadgets 83
Strengthened Rohlin-Kakutani Theorem 84
Chapter 3 Entropy 88
3.1 Entropy and Entropy Rate 88
3.2 Divergence Inequality and Relative Entropy 92
3.3 Basic Properties of Entropy 96
Concavity of Entropy 99
Convexity of Divergence 101
Entropy and Binomial Sums 101
Variational Description of Divergence 103
3.4 Entropy Rate 105
3.5 Relative Entropy Rate 108
3.6 Conditional Entropy and Mutual Information 109
3.7 Entropy Rate Revisited 117
3.8 Markov Approximations 118
3.9 Relative Entropy Densities 120
Chapter 4 The Entropy Ergodic Theorem 123
4.1 History 123
4.2 Stationary Ergodic Sources 126
4.3 Stationary Nonergodic Sources 132
4.4 AMS Sources 136
4.5 The Asymptotic Equipartition Property 140
Chapter 5 Distortion and Approximation 142
5.1 Distortion Measures 142
5.2 Fidelity Criteria 145
5.3 Average Limiting Distortion 146
5.4 Communications Systems Performance 148
5.5 Optimal Performance 149
5.6 Code Approximation 149
5.7 Approximating Random Vectors and Processes 154
5.8 The Monge/Kantorovich/Vasershtein Distance 157
5.9 Variation and Distribution Distance 157
5.10 Coupling Discrete Spaces with the Hamming Distance 159
5.11 Process Distance and Approximation 160
The d_p-distance 162
Evaluating Process Distortion 166
5.12 Source Approximation and Codes 166
5.13 d-bar Continuous Channels 167
Chapter 6 Distortion and Entropy 172
6.1 The Fano Inequality 172
6.2 Code Approximation and Entropy Rate 175
Dynamical Systems and Random Processes 176
6.3 Pinsker’s and Marton’s Inequalities 177
6.4 Entropy and Isomorphism 181
Isomorphic Measurable Spaces 182
Isomorphic Probability Spaces 182
Isomorphism Mod 0 183
Isomorphic Dynamical Systems 183
Isomorphic Random Processes 183
6.5 Almost Lossless Source Coding 185
Almost-Lossless Block Codes 186
Asynchronous Block Code 188
Sliding-Block Code 190
6.6 Asymptotically Optimal Almost Lossless Codes 193
6.7 Modeling and Simulation 194
Chapter 7 Relative Entropy 197
7.1 Divergence 197
Variational Description of Divergence 211
7.2 Conditional Relative Entropy 213
Generalized Conditional Relative Entropy 224
7.3 Limiting Entropy Densities 226
7.4 Information for General Alphabets 228
7.5 Convergence Results 240
Chapter 8 Information Rates 243
8.1 Information Rates for Finite Alphabets 243
8.2 Information Rates for General Alphabets 245
8.3 A Mean Ergodic Theorem for Densities 249
8.4 Information Rates of Stationary Processes 251
8.5 The Data Processing Theorem 258
8.6 Memoryless Channels and Sources 259
Chapter 9 Distortion and Information 261
9.1 The Shannon Distortion-Rate Function 261
9.2 Basic Properties 263
IID Sources 265
9.3 Process Definitions of the Distortion-Rate Function 266
9.4 The Distortion-Rate Function as a Lower Bound 274
9.5 Evaluating the Rate-Distortion Function 276
Support of Shannon Optimal Distributions 286
Chapter 10 Relative Entropy Rates 288
10.1 Relative Entropy Densities and Rates 288
10.2 Markov Dominating Measures 291
10.3 Stationary Processes 295
10.4 Mean Ergodic Theorems 298
Finite Alphabets 298
Standard Alphabets 301
Chapter 11 Ergodic Theorems for Densities 303
11.1 Stationary Ergodic Sources 303
11.2 Stationary Nonergodic Sources 308
11.3 AMS Sources 312
11.4 Ergodic Theorems for Information Densities 315
Chapter 12 Source Coding Theorems 317
12.1 Source Coding and Channel Coding 317
12.2 Block Source Codes for AMS Sources 318
Reference Letters 321
Performance and Distortion-Rate Functions 322
12.3 Block Source Code Mismatch 329
12.4 Block Coding Stationary Sources 332
12.5 Block Coding AMS Ergodic Sources 334
12.6 Subadditive Fidelity Criteria 341
12.7 Asynchronous Block Codes 343
12.8 Sliding-Block Source Codes 345
12.9 A Geometric Interpretation 355
Chapter 13 Properties of Good Source Codes 357
13.1 Optimal and Asymptotically Optimal Codes 357
13.2 Block Codes 359
Moment Properties 364
13.3 Sliding-Block Codes 365
Asymptotically Optimal Sliding-Block Codes 372
Process Approximation 372
Moment Conditions 374
Finite-Order Distribution Shannon Conditions for IID Processes 376
Asymptotic Uncorrelation 378
Chapter 14 Coding for Noisy Channels 380
14.1 Noisy Channels 380
14.2 Feinstein’s Lemma 382
14.3 Feinstein’s Theorem 385
14.4 Channel Capacity 388
14.5 Robust Block Codes 393
14.6 Block Coding Theorems for Noisy Channels 396
14.7 Joint Source and Channel Block Codes 398
14.8 Synchronizing Block Channel Codes 401
14.9 Sliding-Block Source and Channel Coding 405
Totally Ergodic Sources 405
Ergodic Sources 412
References 416
Index 425

Published (per publisher) 27.1.2011
Additional information XXVII, 409 p.
Place of publication New York
Language English
Subject areas Mathematics / Computer Science Computer Science Theory / Studies
Mathematics / Computer Science Mathematics Statistics
Natural Sciences
Technology Electrical Engineering / Power Engineering
Technology Communications Engineering
Keywords compression • Entropy • ergodic theory • Quantization • rate-distortion theory • Shannon information theory • sliding-block coding theory • source coding
ISBN-10 1-4419-7970-0 / 1441979700
ISBN-13 978-1-4419-7970-4 / 9781441979704
PDF (watermarked)
Size: 4.0 MB

DRM: digital watermark
This eBook contains a digital watermark and is therefore personalized for you. If the eBook is improperly passed on to third parties, it can be traced back to its source.

File format: PDF (Portable Document Format)
With its fixed page layout, PDF is particularly well suited to technical books with columns, tables, and figures. A PDF can be displayed on almost any device, but it is only of limited use on small displays (smartphone, eReader).

System requirements:
PC/Mac: You can read this eBook on a PC or Mac. You will need a PDF viewer such as Adobe Reader or Adobe Digital Editions.
eReader: This eBook can be read on (almost) all eBook readers. It is not, however, compatible with the Amazon Kindle.
Smartphone/Tablet: Whether Apple or Android, you can read this eBook. You will need a PDF viewer such as the free Adobe Digital Editions app.

Buying eBooks from abroad
For tax law reasons, we can only sell eBooks within Germany and Switzerland. Regrettably, we cannot fulfill eBook orders from other countries.
