Computational Complexity: A Quantitative Perspective
Pages
2004
Elsevier Science Ltd (publisher)
978-0-444-82841-5 (ISBN)
Title unfortunately out of print; no new edition planned.
There has been a common perception that computational complexity is a theory of "bad news" because its most typical results assert that various real-world and innocent-looking tasks are infeasible. In fact, "bad news" is a relative term, and, indeed, in some situations (e.g., in cryptography), we want an adversary not to be able to perform a certain task. However, a "bad news" result does not automatically become useful in such a scenario. For this to happen, its hardness features have to be quantitatively evaluated and shown to manifest extensively.

The book undertakes a quantitative analysis of some of the major results in complexity that regard either classes of problems or individual concrete problems. The sizes of some important classes are studied using resource-bounded topological and measure-theoretical tools. In the case of individual problems, the book studies relevant quantitative attributes such as approximation properties or the number of hard inputs at each length.

One chapter is dedicated to abstract complexity theory, an older field that nevertheless deserves attention because it lays out the foundations of complexity. The remaining chapters focus on recent and important developments in complexity. The book presents in a fairly detailed manner concepts that have been at the centre of the main research lines in complexity in the last decade or so, such as average-case complexity, quantum computation, hardness amplification, resource-bounded measure, the relation between one-way functions and pseudo-random generators, the relation between hard predicates and pseudo-random generators, extractors, derandomization of bounded-error probabilistic algorithms, probabilistically checkable proofs, non-approximability of optimization problems, and others.

The book should appeal to graduate computer science students, and to researchers who have an interest in computer science theory and need a good understanding of computational complexity, e.g., researchers in algorithms, AI, logic, and other disciplines.
Contents
Preface
1. Preliminaries
2. Abstract complexity theory
3. P, NP, and E
4. Quantum computation
5. One-way functions, pseudo-random generators
6. Optimization problems
A. Tail bounds
Bibliography
Index
| Published (per publisher) | 7 July 2004 |
|---|---|
| Series | North-Holland Mathematics Studies |
| Place of publication | Oxford |
| Language | English |
| Dimensions | 165 x 240 mm |
| Weight | 680 g |
| Subject area | Mathematics / Computer Science ► Computer Science ► Theory / Studies |
| ISBN-10 | 0-444-82841-9 / 0444828419 |
| ISBN-13 | 978-0-444-82841-5 / 9780444828415 |
| Condition | New |