


Mou Lili; Jin Zhi

Tree-Based Convolutional Neural Networks: Principles and Applications




Availability: Normally available within 15 days


PRICE
59,98 €
NICE PRICE
56,98 €
DISCOUNT
5%



This product qualifies for FREE SHIPPING when the Corriere Veloce (express courier) option is selected at checkout.


Also payable with the Carta della cultura giovani e del merito, the 18App Bonus Cultura, and the Carta del Docente.





Details

Genre: Book
Language: English
Publisher: Springer
Publication date: 10/2018
Edition: 1st ed. 2018





Description

This book proposes a novel neural architecture, tree-based convolutional neural networks (TBCNNs), for processing tree-structured data. TBCNNs are related to existing convolutional neural networks (CNNs) and recursive neural networks (RNNs), but they combine the merits of both: thanks to their short propagation path, they are as efficient in learning as CNNs, yet they are also as structure-sensitive as RNNs.

In this book, readers will also find a comprehensive literature review of related work, detailed descriptions of TBCNNs and their variants, and experiments applied to program analysis and natural language processing tasks. It is also an enjoyable read for all those with a general interest in deep learning.
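To make the idea concrete, below is a minimal NumPy sketch of a tree-based convolution followed by dynamic max pooling, in the spirit of the architecture described above: each convolution window covers a node and its direct children, and pooling collapses the variable-size tree into a fixed-size vector. The function names, the uniform averaging of children, and the toy tree are illustrative assumptions rather than the book's exact formulation (Chapter 3 presents the actual idea and formulation, and the continuous binary tree of Section 4.2.6 handles variable-arity windows differently).

# A minimal, illustrative sketch of tree-based convolution + dynamic pooling.
# Weight names (W_top, W_child) and uniform child weighting are assumptions,
# not the book's exact model.
import numpy as np

def tree_conv(features, children, W_top, W_child, b):
    """Convolve over (node, children) windows of a tree.

    features: (n_nodes, d_in) array of node embeddings
    children: list of lists; children[i] holds the child indices of node i
    Returns:  (n_nodes, d_out) array of convolved node features
    """
    out = np.empty((features.shape[0], W_top.shape[1]))
    for i, kids in enumerate(children):
        h = features[i] @ W_top + b           # contribution of the window's top node
        if kids:                              # average the children (uniform weighting)
            h += features[kids].mean(axis=0) @ W_child
        out[i] = np.tanh(h)
    return out

def dynamic_max_pool(conv_out):
    """Max-pool over all nodes, giving a fixed-size tree representation."""
    return conv_out.max(axis=0)

# Toy usage: a 4-node tree, root 0 with children 1 and 2, node 1 with child 3.
rng = np.random.default_rng(0)
d_in, d_out = 8, 16
feats = rng.normal(size=(4, d_in))
kids = [[1, 2], [3], [], []]
W_top, W_child = rng.normal(size=(d_in, d_out)), rng.normal(size=(d_in, d_out))
vec = dynamic_max_pool(tree_conv(feats, kids, W_top, W_child, np.zeros(d_out)))
print(vec.shape)  # (16,)

Because every window spans only a node and its direct children, the number of layers between any leaf and the pooled output stays small, which is the "short propagation path" the description refers to.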






Table of Contents

1         Introduction

1.1           Deep Learning Background

1.2           Structure-Sensitive Neural Networks

1.3           The Proposed Tree-Based Convolutional Neural Networks

1.4           Overview of the Book

2         Preliminaries and Related Work

2.1           General Neural Networks

2.1.1      Neurons and Multi-Layer Perceptrons

2.1.2      Training of Neural Networks: Backpropagation

2.1.3      Pros and Cons of Multi-Layer Perceptrons

2.1.4      Pretraining of Neural Networks

2.2           Neural Networks Applied in Natural Language Processing

2.2.1      The Characteristics of Natural Language

2.2.2      Language Models

2.2.3      Word Embeddings

2.3           Existing Structure-Sensitive Neural Networks

2.3.1      Convolutional Neural Networks

2.3.2      Recurrent Neural Networks

2.3.3      Recursive Neural Networks

2.4           Summary and Discussions

3         General Concepts of Tree-Based Convolutional Neural Networks (TBCNNs)

3.1           Idea and Formulation

3.2           Applications of TBCNNs

3.3           Issues in Designing TBCNNs

4         TBCNN for Programs’ Abstract Syntax Trees (ASTs)

4.1           Background of Program Analysis

4.2           Proposed Model

4.2.1      Overview

4.2.2      Representation Learning of AST Nodes

4.2.3      Encoding Layer

4.2.4      AST-Based Convolutional Layer

4.2.5      Dynamic Pooling

4.2.6      Continuous Binary Tree

4.3           Experiments

4.3.1      Unsupervised Representation Learning

4.3.2      Program Classification

4.3.3      Detecting Bubble Sort

4.3.4      Model Analysis

4.4           Summary and Discussions

5         TBCNN for Constituency Trees in Natural Language Processing

5.1           Background of Sentence Modeling and Constituency Trees

5.2           Proposed Model

5.2.1      Constituency Trees as Input

5.2.2      Recursively Representing Intermediate Layers

5.2.3      Constituency Tree-Based Convolutional Layer

5.2.4      Dynamic Pooling Layer

5.3           Experiments

5.3.1      Sentiment Analysis

5.3.2      Question Classification

5.4           Summary and Discussions

6         TBCNN for Dependency Trees in Natural Language Processing

6.1           Background of Dependency Trees

6.2           Proposed Model

6.2.1      Dependency Trees as Input

6.2.2      Dependency Tree-Based Convolutional Layer

6.2.3      Dynamic Pooling Layer

6.2.4      Dependency TBCNN Applied to Sentence Matching

6.3           Experiments

6.3.1      Sentence Classification

6.3.2      Sentence Matching

6.3.3      Model Analysis

6.3.4      Visualization

6.4           Summary and Discussions

7         Concluding Remarks

7.1           More Structure-Sensitive Neural Models

7.2           Conclusion





Authors

Lili Mou is currently a research scientist at AdeptMind Research. He received his BS and PhD degrees from the School of EECS, Peking University, in 2012 and 2017, respectively. After that, he worked as a postdoctoral fellow at the University of Waterloo. His current research interests include deep learning applied to natural language processing and to programming language processing. His work has been published at leading conferences and in respected journals, including AAAI, ACL, CIKM, COLING, EMNLP, ICML, IJCAI, INTERSPEECH, LREC, and TACL. He has been a primary reviewer/PC member for top venues including AAAI, ACL, COLING, IJCNLP, and NAACL-HLT. Lili received the “Outstanding PhD Thesis Award” from Peking University and the “Top-10 Student Scholars Prize” from the School of EECS, Peking University, for his research achievements.

Zhi Jin is a professor of Computer Science at Peking University. She is also deputy director of the Key Laboratory of High Confidence Software Technologies (Ministry of Education) at Peking University and Director of the CCF Technical Committee of Software Engineering. Her research is primarily concerned with knowledge engineering and requirements engineering, focusing on knowledge/requirements elicitation, conceptual modeling, and analysis. Recently, she has begun focusing more on modeling adaptive software systems. She is/was the principal investigator of over 10 national competitive grants, including serving as chief scientist of a national basic research project (973 project) for the Ministry of Science and Technology of China and as project leader of three key projects for the National Science Foundation of China. She was the General Chair of RE2016, Program Co-Chair of COMPSAC2011, and General Co-Chair and Program Co-Chair of KSEM2010 and KSEM2009. She is executive editor-in-chief of the Chinese Journal of Software and serves on the editorial boards of REJ and IJSEKE. She was an Outstanding Youth Fund winner of the National Science Foundation of China in 2006 and was named among the Distinguished Young Scholars of the Chinese Academy of Sciences in 2001. She received the Zhong Chuang Software Talent Award in 1998 and the First Prize in Science and Technology Outstanding Achievement: Science and Technology Progress Award (Ministry of Education, China) in 2013. She is the author or co-author of three books and more than 120 journal and conference publications.

 











Additional Information

ISBN: 9789811318696
Condition: New
Series: SpringerBriefs in Computer Science
Dimensions: 235 x 155 mm; weight: 454 g
Format: Paperback
Illustration notes: XV, 96 p., 32 illus.
Arabic-numbered pages: 96
Roman-numbered pages: xv

