Channel Coding in Communication Networks
ISTE Ltd and John Wiley & Sons Inc (publisher)
978-1-905209-24-8 (ISBN)
Anyone involved in the areas of channel coding and error correcting coding will find this book to be of invaluable assistance.
Alain Glavieux was an Associate Professor at the Ecole Nationale Supérieure des Télécommunications de Bretagne (ENST Bretagne), where he lectured in signal processing and communication theory and set up a digital communications research group. He was appointed Professor in 1989, and became Head of the Signals and Communications Department in 1994. In 1998, he took charge of ENST Bretagne's Corporate Relations with Industry and, in 2002, became Head of the CNRS laboratory Algorithmic and Hardware Processing of Information, Communications and Knowledge. In 2003, he was appointed assistant director of ENST Bretagne. Together with Professor Claude Berrou, Alain Glavieux developed a new family of error correction codes called turbo codes. He also pioneered the principle of turbo equalization. He received an IEEE Golden Jubilee Award for Technological Innovation in 1998, the IEEE Richard W. Hamming Medal in 2003, and the French Academy of Sciences Grand Prix France Telecom in 2003. He passed away after a long illness on September 25th, 2004 at the age of 55.
Homage to Alain Glavieux xv
Chapter 1. Information Theory 1
Gérard BATTAIL
1.1. Introduction: the Shannon paradigm 1
1.2. Principal coding functions 5
1.2.1. Source coding 5
1.2.2. Channel coding 6
1.2.3. Cryptography 7
1.2.4. Standardization of the Shannon diagram blocks 8
1.2.5. Fundamental theorems 9
1.3. Quantitative measurement of information 9
1.3.1. Principle 9
1.3.2. Measurement of self-information 10
1.3.3. Entropy of a source 11
1.3.4. Mutual information measure 12
1.3.5. Channel capacity 14
1.3.6. Comments on the measurement of information 15
1.4. Source coding 15
1.4.1. Introduction 15
1.4.2. Decodability, Kraft-McMillan inequality 16
1.4.3. Demonstration of the fundamental theorem 17
1.4.4. Outline of optimal algorithms of source coding 18
1.5. Channel coding 19
1.5.1. Introduction and statement of the fundamental theorem 19
1.5.2. General comments 20
1.5.3. Need for redundancy 20
1.5.4. Example of the binary symmetric channel 21
1.5.5. A geometrical interpretation 25
1.5.6. Fundamental theorem: Gallager’s proof 26
1.6. Channels with continuous noise 32
1.6.1. Introduction 32
1.6.2. A reference model in physical reality: the channel with Gaussian additive noise 32
1.6.3. Communication via a channel with additive white Gaussian noise 35
1.6.4. Channel with fading 37
1.7. Information theory and channel coding 38
1.8. Bibliography 40
Chapter 2. Block Codes 41
Alain POLI
2.1. Unstructured codes 41
2.1.1. The fundamental question of message redundancy 41
2.1.2. Unstructured codes 42
2.2. Linear codes 44
2.2.1. Introduction 44
2.2.2. Properties of linear codes 44
2.2.3. Dual code 46
2.2.4. Some linear codes 50
2.2.5. Decoding of linear codes 51
2.3. Finite fields 53
2.3.1. Basic concepts 53
2.3.2. Polynomial modulo calculations: quotient ring 53
2.3.3. Irreducible polynomial modulo calculations: finite field 54
2.3.4. Order and the opposite of an element of F2[X]/(p(X)) 54
2.3.5. Minimum polynomials 59
2.3.6. The field of nth roots of unity 60
2.3.7. Projective geometry in a finite field 61
2.4. Cyclic codes 62
2.4.1. Introduction 62
2.4.2. Base, coding, dual code and code annihilator 63
2.4.3. Certain cyclic codes 68
2.4.4. Existence and construction of cyclic codes 74
2.4.5. Applications of cyclic codes 82
2.5. Electronic circuits 82
2.5.1. Basic gates for error correcting codes 82
2.5.2. Shift registers 83
2.5.3. Circuits for error correcting codes 83
2.5.4. Polynomial representation and representation as powers of a primitive element of a field 87
2.6. Decoding of cyclic codes 88
2.6.1. Meggitt decoding (trapping of bursts) 88
2.6.2. Decoding by the DFT 89
2.6.3. FG-decoding 94
2.6.4. Berlekamp-Massey decoding 99
2.6.5. Majority decoding 105
2.6.6. Hard decoding, soft decoding and Chase decoding 110
2.7. 2D codes 111
2.7.1. Introduction 111
2.7.2. Product codes 112
2.7.3. Minimum distance of 2D codes 112
2.7.4. Practical examples of the use of 2D codes 112
2.7.5. Coding 112
2.7.6. Decoding 113
2.8. Exercises on block codes 113
2.8.1. Unstructured codes 113
2.8.2. Linear codes 114
2.8.3. Finite fields 117
2.8.4. Cyclic codes 119
2.8.5. Exercises on circuits 123
Chapter 3. Convolutional Codes 129
Alain GLAVIEUX and Sandrine VATON
3.1. Introduction 129
3.2. State transition diagram, trellis, tree 135
3.3. Transfer function and distance spectrum 137
3.4. Punctured convolutional codes 140
3.5. Catastrophic codes 142
3.6. The decoding of convolutional codes 142
3.6.1. Viterbi algorithm 143
3.6.2. MAP criterion or BCJR algorithm 156
3.6.3. SubMAP algorithm 169
3.7. Performance of convolutional codes 172
3.7.1. Channel with binary input and continuous output 173
3.7.2. Channel with binary input and output 180
3.8. Distance spectrum of convolutional codes 182
3.9. Recursive convolutional codes 184
Chapter 4. Coded Modulations 197
Ezio BIGLIERI
4.1. Hamming distance and Euclidean distance 197
4.2. Trellis code 200
4.3. Decoding 201
4.4. Some examples of TCM 201
4.5. Choice of a TCM diagram 205
4.6. TCM representations 207
4.7. TCM transparent to rotations 209
4.7.1. Partitions transparent to rotations 211
4.7.2. Transparent trellis with rotations 212
4.7.3. Transparent encoder 213
4.7.4. General considerations 215
4.8. TCM error probability 215
4.8.1. Upper limit of the probability of an error event 215
4.8.2. Examples 226
4.8.3. Calculation of δfree 228
4.9. Power spectral density 232
4.10. Multi-level coding 234
4.10.1. Block coded modulation 235
4.10.2. Decoding of multilevel codes by stages 237
4.11. Probability of error for the BCM 238
4.11.1. Additive Gaussian channel 239
4.11.2. Calculation of the transfer function 240
4.12. Coded modulations for channels with fading 241
4.12.1. Modeling of channels with fading 241
4.12.2. Rayleigh fading channel: Euclidean distance and Hamming distance 247
4.13. Bit interleaved coded modulation (BICM) 251
4.14. Bibliography 253
Chapter 5. Turbocodes 255
Claude BERROU, Catherine DOUILLARD, Michel JÉZÉQUEL and Annie PICART
5.1. History of turbocodes 255
5.1.1. Concatenation 256
5.1.2. Negative feedback in the decoder 256
5.1.3. Recursive systematic codes 258
5.1.4. Extrinsic information 258
5.1.5. Parallel concatenation 259
5.1.6. Irregular interleaving 260
5.2. A simple and convincing illustration of the turbo effect 260
5.3. Turbocodes 265
5.3.1. Coding 265
5.3.2. The termination of constituent codes 272
5.3.3. Decoding 275
5.3.4. SISO decoding and extrinsic information 280
5.4. The permutation function 287
5.4.1. The regular permutation 288
5.4.2. Statistical approach 290
5.4.3. Real permutations 291
5.5. m-binary turbocodes 297
5.5.1. m-binary RSC encoders 298
5.5.2. m-binary turbocodes 300
5.5.3. Double-binary turbocodes with 8 states 302
5.5.4. Double-binary turbocodes with 16 states 303
5.6. Bibliography 304
Chapter 6. Block Turbocodes 307
Ramesh PYNDIAH and Patrick ADDE
6.1. Introduction 307
6.2. Concatenation of block codes 308
6.2.1. Parallel concatenation of block codes 309
6.2.2. Serial concatenation of block codes 313
6.2.3. Properties of product codes and theoretical performances 318
6.3. Soft decoding of block codes 323
6.3.1. Soft decoding of block codes 324
6.3.2. Soft decoding of block codes (Chase algorithm) 326
6.3.3. Decoding of block codes by the Viterbi algorithm 334
6.3.4. Decoding of block codes by the Hartmann and Rudolph algorithm 338
6.4. Iterative decoding of product codes 340
6.4.1. SISO decoding of a block code 341
6.4.2. Implementation of the weighting algorithm 345
6.4.3. Iterative decoding of product codes 347
6.4.4. Comparison of the performances of BTC 349
6.5. Conclusion 367
6.6. Bibliography 367
Chapter 7. Block Turbocodes in a Practical Setting 373
Patrick ADDE and Ramesh PYNDIAH
7.1. Introduction 373
7.2. Implementation of BTC: structure and complexity 373
7.2.1. Influence of integration constraints 373
7.2.2. General architecture and organization of the circuit 376
7.2.3. Storage of data and results 380
7.2.4. Elementary decoder 384
7.2.5. High flow structure 392
7.3. Flexibility of turbo block codes 397
7.4. Hybrid turbocodes 404
7.4.1. Construction of the code 404
7.4.2. Binary error rate (BER) as a function of the signal-to-noise ratio in a Gaussian channel 406
7.4.3. Variation of the size of the blocks 408
7.4.4. Variation of the total rate 409
7.5. Multidimensional turbocodes 409
7.6. Bibliography 412
List of Authors 415
Index 417
Publication date (per publisher) | 14 January 2007
Place of publication | London
Language | English
Dimensions | 158 x 234 mm
Weight | 771 g
Subject area | Mathematics / Computer Science ► Computer Science ► Networks
 | Computer Science ► Theory / Studies ► Cryptology
 | Engineering ► Communications Engineering
ISBN-10 | 1-905209-24-X / 190520924X
ISBN-13 | 978-1-905209-24-8 / 9781905209248