Deep Learning for Natural Language Processing
Cambridge University Press (publisher)
978-1-009-01265-2 (ISBN)
Deep learning is becoming increasingly important in a technology-dominated world. However, building computational models that accurately represent linguistic structures is complex, as it requires in-depth knowledge of neural networks and an understanding of advanced mathematical concepts such as calculus and statistics. This book makes these complexities accessible to those from a humanities and social sciences background by providing a clear introduction to deep learning for natural language processing. It covers both theoretical and practical aspects, assumes minimal knowledge of machine learning, and explains the theory behind natural language processing in an easy-to-read way. It includes pseudocode for the simpler algorithms discussed, and actual Python code for the more complicated architectures, using modern deep learning libraries such as PyTorch and Hugging Face. Providing the necessary theoretical foundation and practical tools, this book will enable readers to immediately begin building real-world, practical natural language processing systems.
Mihai Surdeanu is Associate Professor in the Computer Science Department at the University of Arizona. He works in both academia and industry on NLP systems that process and extract meaning from natural language. Marco Antonio Valenzuela-Escárcega is a Research Scientist in the Computer Science Department at the University of Arizona. He has worked on natural language processing projects in both industry and academia.
Preface; 1. Introduction; 2. The perceptron; 3. Logistic regression; 4. Implementing text classification using perceptron and LR; 5. Feed-forward neural networks; 6. Best practices in deep learning; 7. Implementing text classification with feed-forward networks; 8. Distributional hypothesis and representation learning; 9. Implementing text classification using word embeddings; 10. Recurrent neural networks; 11. Implementing POS tagging using RNNs; 12. Contextualized embeddings and transformer networks; 13. Using transformers with the Hugging Face library; 14. Encoder-decoder methods; 15. Implementing encoder-decoder methods; 16. Neural architectures for NLP applications; Appendix A: Overview of the Python language and the key libraries; Appendix B: Character encodings: ASCII and Unicode.
Publication date | 22.09.2021 |
---|---|
Additional info | Worked examples or exercises |
Place of publication | Cambridge |
Language | English |
Weight | 498 g |
Subject area | Humanities ► Language / Literary Studies ► Linguistics |
ISBN-10 | 1-009-01265-7 / 1009012657 |
ISBN-13 | 978-1-009-01265-2 / 9781009012652 |
Condition | New |