Speed training: Improving the rate of backpropagation learning through stochastic sample presentation

M. E. Rimer, T. L. Andersen, T. R. Martinez

Research output: Contribution to conference › Paper › peer-review

1 Scopus citation

Abstract

Artificial neural networks provide an effective empirical predictive model for pattern classification. However, training complex neural networks on very large training sets is often problematic, imposing prohibitive time constraints on the training process. We present four practical methods for dramatically decreasing training time through dynamic stochastic sample presentation, a technique we call speed training. These methods are shown to be robust in retaining generalization accuracy over a diverse collection of real-world data sets. In particular, the SET technique achieves a training speedup of 4278% on a large OCR database with no detectable loss in generalization.
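The abstract does not spell out the SET algorithm itself, but the core idea of dynamic stochastic sample presentation can be illustrated with a minimal sketch: during each epoch, samples the network already classifies within tolerance are stochastically skipped, so backpropagation effort concentrates on informative samples. Everything below (the network, the skip rule, the `err_tol` and `skip_prob` parameters) is an illustrative assumption, not the paper's exact method.

```python
# Hedged sketch of stochastic sample presentation (not the paper's SET
# algorithm): well-learned samples are skipped with high probability.
import numpy as np

rng = np.random.default_rng(0)

def init(n_in, n_hid, n_out):
    # Small one-hidden-layer tanh network with biases.
    return {
        "W1": rng.normal(0, 0.5, (n_in, n_hid)), "b1": np.zeros(n_hid),
        "W2": rng.normal(0, 0.5, (n_hid, n_out)), "b2": np.zeros(n_out),
    }

def forward(p, x):
    h = np.tanh(x @ p["W1"] + p["b1"])
    y = np.tanh(h @ p["W2"] + p["b2"])
    return h, y

def backprop_step(p, x, t, lr=0.3):
    # Plain online backpropagation on squared error for one sample.
    h, y = forward(p, x)
    dy = (y - t) * (1 - y**2)            # output-layer delta
    dh = (dy @ p["W2"].T) * (1 - h**2)   # hidden-layer delta
    p["W2"] -= lr * np.outer(h, dy); p["b2"] -= lr * dy
    p["W1"] -= lr * np.outer(x, dh); p["b1"] -= lr * dh

def speed_train(p, X, T, epochs=2000, err_tol=0.2, skip_prob=0.9):
    """Skip a sample with probability skip_prob once its max output
    error falls below err_tol; backpropagate the rest as usual."""
    for _ in range(epochs):
        for i in rng.permutation(len(X)):
            _, y = forward(p, X[i])
            if np.max(np.abs(y - T[i])) < err_tol and rng.random() < skip_prob:
                continue                  # sample skipped this pass
            backprop_step(p, X[i], T[i])
    return p

# Toy usage: XOR with targets in {-1, +1}.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], float)
T = np.array([[-1], [1], [1], [-1]], float)
p = speed_train(init(2, 4, 1), X, T)
print(np.round(forward(p, X)[1], 2))
```

On large training sets, the savings come from the skipped forward-only passes being far cheaper than full weight updates; the paper's reported 4278% speedup on OCR data suggests a far more aggressive presentation schedule than this toy skip rule.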

Original language: English
Pages: 2661-2666
Number of pages: 6
State: Published - 2001
Event: International Joint Conference on Neural Networks (IJCNN'01) - Washington, DC, United States
Duration: 15 Jul 2001 - 19 Jul 2001

Conference

Conference: International Joint Conference on Neural Networks (IJCNN'01)
Country/Territory: United States
City: Washington, DC
Period: 15/07/01 - 19/07/01
