Tree-Based Convolutional Neural Networks - Lili Mou, Zhi Jin

Tree-Based Convolutional Neural Networks (eBook)

Principles and Applications

Lili Mou, Zhi Jin (Authors)

eBook Download: PDF
2018 | 1st ed. 2018
XV, 96 pages
Springer Singapore (publisher)
978-981-13-1870-2 (ISBN)
System requirements
€58.84 incl. VAT
(CHF 57.45)
eBooks are sold by Lehmanns Media GmbH (Berlin) at the price in euros incl. VAT.
  • Available for immediate download

This book proposes a novel neural architecture, tree-based convolutional neural networks (TBCNNs), for processing tree-structured data. TBCNNs are related to existing convolutional neural networks (CNNs) and recursive neural networks (RNNs), but they combine the merits of both: thanks to their short propagation path, they are as efficient in learning as CNNs; yet they are also as structure-sensitive as RNNs.
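To make the idea above concrete, the following is a minimal sketch, in Python/NumPy, of a tree-based convolution with dynamic max pooling. The Node class, the single shared child weight matrix W_child, the tanh activation, and the toy dimensions are illustrative assumptions for this listing, not the book's exact formulation; the actual model (including the continuous binary tree weighting and the AST, constituency, and dependency variants) is developed in Chapters 3 through 6.

import numpy as np

class Node:
    def __init__(self, vec, children=()):
        self.vec = np.asarray(vec, dtype=float)  # feature/embedding of the node (assumed given)
        self.children = list(children)

def tree_conv(root, W_parent, W_child, b):
    # Slide a depth-1 convolution window over every subtree: each filter
    # response combines a node with its children; dynamic max pooling then
    # reduces the variable-size tree to a single fixed-size vector.
    outputs, stack = [], [root]
    while stack:
        node = stack.pop()
        y = W_parent @ node.vec + b
        for child in node.children:
            y = y + W_child @ child.vec  # simplification: one shared child weight matrix
        outputs.append(np.tanh(y))
        stack.extend(node.children)
    return np.max(np.stack(outputs), axis=0)  # dynamic (max) pooling over all nodes

# Toy usage: 4-dimensional node features, 3 convolution filters.
rng = np.random.default_rng(0)
leaves = [Node(rng.normal(size=4)) for _ in range(2)]
root = Node(rng.normal(size=4), children=leaves)
tree_vector = tree_conv(root, rng.normal(size=(3, 4)), rng.normal(size=(3, 4)), np.zeros(3))
print(tree_vector.shape)  # (3,) regardless of tree size

Because the window only spans a node and its children, the propagation path stays short (as in a CNN), while the window itself follows the tree structure (as in a recursive network), which is the trade-off the blurb describes.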

In this book, readers will also find a comprehensive literature review of related work, detailed descriptions of TBCNNs and their variants, and experiments applying them to program analysis and natural language processing tasks. It is an enjoyable read for anyone with a general interest in deep learning.




Lili Mou is currently a research scientist at AdeptMind Research. He received his BS and PhD degrees from the School of EECS, Peking University, in 2012 and 2017, respectively. After that, Lili worked as a postdoctoral fellow at the University of Waterloo. His current research interests include deep learning applied to natural language processing and to programming language processing. His work has been published at leading conferences and in respected journals, such as AAAI, ACL, CIKM, COLING, EMNLP, ICML, IJCAI, INTERSPEECH, LREC, and TACL. He has been a primary reviewer/PC member for top venues including AAAI, ACL, COLING, IJCNLP, and NAACL-HLT. Lili received the 'Outstanding PhD Thesis Award' from Peking University and the 'Top-10 Student Scholars Prize' from the School of EECS, Peking University, for his research achievements.

Zhi Jin is a professor of Computer Science at Peking University. In addition, she is deputy director of the Key Laboratory of High Confidence Software Technologies (Ministry of Education) at Peking University and Director of the CCF Technical Committee of Software Engineering. Her research work is primarily concerned with knowledge engineering and requirements engineering, focusing on knowledge/requirements elicitation, conceptual modeling, and analysis. Recently, she has begun focusing more on modeling adaptive software systems. She is or was the principal investigator of over 10 national competitive grants, including serving as chief scientist of a national basic research project (973 project) for the Ministry of Science and Technology of China and as project leader of three key projects for the National Science Foundation of China. She was the General Chair of RE2016, Program Co-Chair of COMPSAC2011, and General Co-Chair and Program Co-Chair of KSEM2010 and KSEM2009. She is executive editor-in-chief of the Chinese Journal of Software, and serves on the Editorial Boards of REJ and IJSEKE. She was an Outstanding Youth Fund Winner of the National Science Foundation of China in 2006 and a Distinguished Young Scholar of the Chinese Academy of Sciences in 2001. She received the Zhong Chuang Software Talent Award in 1998 and the First Prize in Science and Technology Outstanding Achievement: Science and Technology Progress Award (Ministry of Education, China) in 2013. She is the co-author/author of three books and more than 120 journal and conference publications.

 



1         Introduction

1.1           Deep Learning Background

1.2           Structure-Sensitive Neural Networks

1.3           The Proposed Tree-Based Convolutional Neural Networks

1.4           Overview of the Book

2         Preliminaries and Related Work

2.1           General Neural Networks

2.1.1      Neurons and Multi-Layer Perceptrons

2.1.2      Training of Neural Networks: Backpropagation

2.1.3      Pros and Cons of Multi-Layer Perceptrons

2.1.4      Pretraining of Neural Networks

2.2           Neural Networks Applied in Natural Language Processing

2.2.1      The Characteristics of Natural Language

2.2.2      Language Models

2.2.3      Word Embeddings

2.3           Existing Structure-Sensitive Neural Networks

2.3.1      Convolutional Neural Networks

2.3.2      Recurrent Neural Networks

2.3.3      Recursive Neural Networks

2.4           Summary and Discussions

3         General Concepts of Tree-Based Convolutional Neural Networks (TBCNNs)

3.1           Idea and Formulation

3.2           Applications of TBCNNs

3.3           Issues in Designing TBCNNs

4         TBCNN for Programs’ Abstract Syntax Trees (ASTs)

4.1           Background of Program Analysis

4.2           Proposed Model

4.2.1      Overview

4.2.2      Representation Learning of AST Nodes

4.2.3      Encoding Layer

4.2.4      AST-Based Convolutional Layer

4.2.5      Dynamic Pooling

4.2.6      Continuous Binary Tree

4.3           Experiments

4.3.1      Unsupervised Representation Learning

4.3.2      Program Classification

4.3.3      Detecting Bubble Sort

4.3.4      Model Analysis

4.4           Summary and Discussions

5         TBCNN for Constituency Trees in Natural Language Processing

5.1           Background of Sentence Modeling and Constituency Trees

5.2           Proposed Model

5.2.1      Constituency Trees as Input

5.2.2      Recursively Representing Intermediate Layers

5.2.3      Constituency Tree-Based Convolutional Layer

5.2.4      Dynamic Pooling Layer

5.3           Experiments

5.3.1      Sentiment Analysis

5.3.2      Question Classification

5.4           Summary and Discussions

6         TBCNN for Dependency Trees in Natural Language Processing

6.1           Background of Dependency Trees

6.2           Proposed Model

6.2.1      Dependency Trees as Input

6.2.2      Dependency Tree-Based Convolutional Layer

6.2.3      Dynamic Pooling Layer

6.2.4      Dependency TBCNN Applied to Sentence Matching

6.3           Experiments

6.3.1      Sentence Classification

6.3.2      Sentence Matching

6.3.3      Model Analysis

6.3.4      Visualization

6.4           Summary and Discussions

7         Concluding Remarks

7.1           More Structure-Sensitive Neural Models

7.2           Conclusion

Publication date (per publisher) 1 October 2018
Series SpringerBriefs in Computer Science
Additional information XV, 96 p., 32 illus.
Place of publication Singapore
Language English
Subject areas Mathematics / Computer Science > Computer Science > Databases
Mathematics / Computer Science > Computer Science > Software Development
Computer Science > Theory / Studies > Artificial Intelligence / Robotics
Keywords Deep learning • Natural Language Processing • Neural networks • program analysis • Tree-Based Convolution
ISBN-10 981-13-1870-0 / 9811318700
ISBN-13 978-981-13-1870-2 / 9789811318702
PDF (watermarked)
Size: 2.8 MB

DRM: digital watermark
This eBook contains a digital watermark and is therefore personalized for you. If the eBook is improperly passed on to third parties, it can be traced back to its source.

File format: PDF (Portable Document Format)
With its fixed page layout, PDF is particularly well suited to technical books with columns, tables, and figures. A PDF can be displayed on almost all devices, but it is only of limited use on small displays (smartphone, eReader).

System requirements:
PC/Mac: You can read this eBook on a PC or Mac. You will need a PDF viewer, e.g. Adobe Reader or Adobe Digital Editions.
eReader: This eBook can be read with (almost) all eBook readers. However, it is not compatible with the Amazon Kindle.
Smartphone/Tablet: Whether Apple or Android, you can read this eBook. You will need a PDF viewer, e.g. the free Adobe Digital Editions app.

Buying eBooks from abroad
For tax law reasons we can only sell eBooks within Germany and Switzerland. Regrettably, we cannot fulfil eBook orders from other countries.
