
Curriculum learning

Association for Computing Machinery — Jun 14, 2009



Datasource
Association for Computing Machinery
Copyright
Copyright © 2009 by ACM Inc.
ISBN
978-1-60558-516-1
DOI
10.1145/1553374.1553380

Abstract

Curriculum Learning. Yoshua Bengio (1) Yoshua.Bengio@umontreal.ca; Jérôme Louradour (1,2) jeromelouradour@gmail.com; Ronan Collobert (3) ronan@collobert.com; Jason Weston (3) jasonw@nec-labs.com. (1) U. Montreal, P.O. Box 6128, Montreal, Canada; (2) A2iA SA, 40bis Fabert, Paris, France; (3) NEC Laboratories America, 4 Independence Way, Princeton, NJ, USA.

Humans and animals learn much better when the examples are not randomly presented but organized in a meaningful order which illustrates gradually more concepts, and gradually more complex ones. Here, we formalize such training strategies in the context of machine learning, and call them "curriculum learning". In the context of recent research studying the difficulty of training in the presence of non-convex training criteria (for deep deterministic and stochastic neural networks), we explore curriculum learning in various set-ups. The experiments show that significant improvements in generalization can be achieved. We hypothesize that curriculum learning has both an effect on the speed of convergence of the training process to a minimum and, in the case of non-convex criteria, on the quality of the local minima obtained: curriculum learning can be seen as a particular form of continuation method (a general strategy for global optimization of non-convex functions).
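To make the idea concrete, below is a minimal, illustrative sketch of a curriculum schedule for a plain logistic-regression learner trained with SGD. The difficulty score (distance to a synthetic decision boundary), the four-stage schedule, and the sgd_logistic helper are assumptions made for this example only; they are not the paper's experimental setup.

# Illustrative curriculum-learning sketch (not the paper's exact setup):
# train on progressively harder subsets, ordered by a hand-chosen
# difficulty score, instead of presenting examples in random order.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic binary-classification data; "difficulty" is taken to be the
# (negated) distance to the true decision boundary, an assumption made
# purely for illustration.
n, d = 2000, 20
w_true = rng.normal(size=d)
X = rng.normal(size=(n, d))
margin = X @ w_true
y = (margin > 0).astype(float)
order = np.argsort(-np.abs(margin))   # largest margin first = easiest first

def sgd_logistic(X, y, w, lr=0.1, epochs=5):
    # Plain stochastic gradient descent on the logistic loss.
    for _ in range(epochs):
        for i in rng.permutation(len(y)):
            p = 1.0 / (1.0 + np.exp(-X[i] @ w))
            w = w - lr * (p - y[i]) * X[i]
    return w

# Curriculum: grow the training set from the easiest 25% to the full set,
# continuing optimization from the previous stage's weights each time.
w = np.zeros(d)
for frac in (0.25, 0.5, 0.75, 1.0):
    subset = order[: int(frac * n)]
    w = sgd_logistic(X[subset], y[subset], w)

print("training accuracy:", ((X @ w > 0) == y).mean())

The staged loop mirrors the continuation-method view described in the abstract: each stage starts from the previous stage's solution, so the easier sub-problems guide the optimizer toward a better region before the hardest examples are introduced.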
