


Martin Anthony; Peter L. Bartlett - Neural Network Learning

Neural Network Learning: Theoretical Foundations





Availability: normally available within 20 days
Delivery delays are possible due to Brexit-related supply issues.


PRICE
€148.98
NICEPRICE
€141.53
DISCOUNT
5%



This product qualifies for FREE SHIPPING
when the Corriere Veloce option is selected at checkout.


Can also be paid for with Carta della cultura giovani e del merito, 18App Bonus Cultura, and Carta del Docente





Details

Genre: Book
Language: English
Publication date: 11/1999





Synopsis

This book describes recent theoretical advances in the study of artificial neural networks.




Publisher's Note

This book describes theoretical advances in the study of artificial neural networks. It explores probabilistic models of supervised learning problems, and addresses the key statistical and computational questions. Research on pattern classification with binary-output networks is surveyed, including a discussion of the relevance of the Vapnik–Chervonenkis dimension and estimates of this dimension for several neural network models. A model of classification by real-output networks is developed, and the usefulness of classification with a 'large margin' is demonstrated. The authors explain the role of scale-sensitive versions of the Vapnik–Chervonenkis dimension in large margin classification, and in real prediction. They also discuss the computational complexity of neural network learning, describing a variety of hardness results, and outlining two efficient constructive learning algorithms. The book is self-contained and is intended to be accessible to researchers and graduate students in computer science, engineering, and mathematics.
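For orientation, the growth function and Vapnik–Chervonenkis dimension referred to above are standardly defined as follows; this is the usual textbook formulation, not a quotation from this book.

% H is assumed to be a class of {0,1}-valued functions on a domain X,
% matching the binary-output setting surveyed in Part I of the book.
\[
  \Pi_H(m) \;=\; \max_{x_1,\dots,x_m \in X} \bigl|\{(h(x_1),\dots,h(x_m)) : h \in H\}\bigr|,
  \qquad
  \mathrm{VCdim}(H) \;=\; \max\{\, m : \Pi_H(m) = 2^m \,\}.
\]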




Contents

1. Introduction
Part I. Pattern Recognition with Binary-output Neural Networks:
2. The pattern recognition problem
3. The growth function and VC-dimension
4. General upper bounds on sample complexity
5. General lower bounds
6. The VC-dimension of linear threshold networks
7. Bounding the VC-dimension using geometric techniques
8. VC-dimension bounds for neural networks
Part II. Pattern Recognition with Real-output Neural Networks:
9. Classification with real values
10. Covering numbers and uniform convergence
11. The pseudo-dimension and fat-shattering dimension
12. Bounding covering numbers with dimensions
13. The sample complexity of classification learning
14. The dimensions of neural networks
15. Model selection
Part III. Learning Real-Valued Functions:
16. Learning classes of real functions
17. Uniform convergence results for real function classes
18. Bounding covering numbers
19. The sample complexity of learning function classes
20. Convex classes
21. Other learning problems
Part IV. Algorithmics:
22. Efficient learning
23. Learning as optimisation
24. The Boolean perceptron
25. Hardness results for feed-forward networks
26. Constructive learning algorithms for two-layered networks




Preface

This book describes theoretical advances in the study of artificial neural networks. It explores probabilistic models of supervised learning problems, and addresses the key statistical and computational questions. It is intended to be accessible to researchers and graduate students in computer science, engineering, and mathematics.










Additional Information

ISBN: 9780521573535

Condition: New
Dimensions: 229 x 27 x 152 mm; weight: 760 g
Format: Hardcover
Illustration notes: 29 line figures
Pages (Arabic numerals): 404

