
[No longer publicly accessible] Learning Deep Architectures for AI

This monograph discusses the motivations and principles regarding learning algorithms for deep architectures, in particular those exploiting as building blocks unsupervised learning of single-layer models.
Tag(s): Machine Learning
Publication date: 31 Dec 2009
ISBN-10: n/a
ISBN-13: n/a
Paperback: 130 pages
Views: 10,787
Document Type: Paper
Publisher: n/a
License: n/a
Post time: 30 Nov 2016 09:00:00
From the Abstract:
Yoshua Bengio wrote: Theoretical results suggest that in order to learn the kind of complicated functions that can represent high-level abstractions (e.g., in vision, language, and other AI-level tasks), one may need deep architectures. Deep architectures are composed of multiple levels of non-linear operations, such as in neural nets with many hidden layers or in complicated propositional formulae re-using many sub-formulae. Searching the parameter space of deep architectures is a difficult task, but learning algorithms such as those for Deep Belief Networks have recently been proposed to tackle this problem with notable success, beating the state-of-the-art in certain areas. This monograph discusses the motivations and principles regarding learning algorithms for deep architectures, in particular those exploiting as building blocks unsupervised learning of single-layer models such as Restricted Boltzmann Machines, used to construct deeper models such as Deep Belief Networks.
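
The monograph itself ships no code, but the building block the abstract names can be sketched briefly. Below is a minimal, illustrative NumPy sketch (not the author's implementation) of a Restricted Boltzmann Machine trained with one-step contrastive divergence (CD-1) and stacked greedily, layer by layer, in the way Deep Belief Networks are pretrained. All names (RBM, pretrain_stack) and hyperparameters are assumptions chosen for illustration.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class RBM:
    """Illustrative single-layer Restricted Boltzmann Machine trained with
    one-step contrastive divergence (CD-1). Hyperparameters are assumptions,
    not values from the monograph."""

    def __init__(self, n_visible, n_hidden, lr=0.1, seed=0):
        rng = np.random.default_rng(seed)
        self.W = 0.01 * rng.standard_normal((n_visible, n_hidden))
        self.b_v = np.zeros(n_visible)  # visible-unit biases
        self.b_h = np.zeros(n_hidden)   # hidden-unit biases
        self.lr = lr
        self.rng = rng

    def hidden_probs(self, v):
        # P(h = 1 | v) for binary hidden units
        return sigmoid(v @ self.W + self.b_h)

    def visible_probs(self, h):
        # P(v = 1 | h) for binary visible units
        return sigmoid(h @ self.W.T + self.b_v)

    def cd1_update(self, v0):
        # Positive phase: hidden activations driven by the data.
        h0_prob = self.hidden_probs(v0)
        h0 = (self.rng.random(h0_prob.shape) < h0_prob).astype(float)
        # Negative phase: one Gibbs step back to a reconstruction.
        v1_prob = self.visible_probs(h0)
        h1_prob = self.hidden_probs(v1_prob)
        # CD-1 approximation to the log-likelihood gradient.
        batch = v0.shape[0]
        self.W += self.lr * (v0.T @ h0_prob - v1_prob.T @ h1_prob) / batch
        self.b_v += self.lr * (v0 - v1_prob).mean(axis=0)
        self.b_h += self.lr * (h0_prob - h1_prob).mean(axis=0)

def pretrain_stack(data, layer_sizes, epochs=5):
    """Greedy layer-wise pretraining: each RBM is trained on the mean-field
    hidden activations of the layer below, as in Deep Belief Network
    pretraining."""
    rbms, inputs = [], data
    n_visible = data.shape[1]
    for n_hidden in layer_sizes:
        rbm = RBM(n_visible, n_hidden)
        for _ in range(epochs):
            rbm.cd1_update(inputs)          # full-batch updates, for brevity
        inputs = rbm.hidden_probs(inputs)   # feed activations upward
        n_visible = n_hidden
        rbms.append(rbm)
    return rbms

# Toy usage on random binary data (illustration only).
X = (np.random.default_rng(1).random((64, 784)) > 0.5).astype(float)
stack = pretrain_stack(X, layer_sizes=[256, 64])
```

In the recipe the monograph analyzes, the stacked weights would then initialize a deep network that is fine-tuned with a supervised objective.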

Update: 
The book is no longer publicly accessible through the author's webpage or through the webpage of the Department of Computer Science and Operations Research at the Université de Montréal.




About The Author(s)



Yoshua Bengio

Yoshua Bengio is a Full Professor in the Department of Computer Science and Operations Research, head of the Machine Learning Laboratory (MILA), co-director of the CIFAR Neural Computation and Adaptive Perception program, and Canada Research Chair in Statistical Learning Algorithms; he also holds the NSERC-Ubisoft industrial chair. His main research ambition is to understand principles of learning that yield intelligence.

