Removing decision surface skew using complementary inputs

Timothy L. Andersen

Research output: Contribution to conference › Paper › peer-review

Abstract

This paper examines the tendency of backpropagation-based training algorithms to favor examples that have large input feature values, in terms of the ability of such examples to influence the weights of the network, and shows that this tendency can lead to sub-optimal decision surfaces. We propose a method for counteracting this tendency that modifies the original input feature vector through the addition of complementary inputs.
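The abstract's idea can be illustrated with a minimal sketch: for each input feature, append a complementary feature so that small-valued inputs no longer contribute proportionally small weight updates. Here the complement is assumed to be `1 - x` for features scaled to [0, 1]; the paper's exact construction may differ, and the function name `add_complementary_inputs` is illustrative only.

```python
import numpy as np

def add_complementary_inputs(X):
    """Append a complementary input (1 - x) for each original feature.

    Assumes features are scaled to [0, 1]. With complements added, a
    feature near 0 is paired with a complement near 1, so the example's
    total influence on backprop weight updates depends less on the raw
    magnitude of its feature values.
    """
    X = np.asarray(X, dtype=float)
    return np.hstack([X, 1.0 - X])

# Example: two samples with three features each; output has six inputs.
X = np.array([[0.9, 0.1, 0.5],
              [0.2, 0.8, 0.0]])
X_aug = add_complementary_inputs(X)  # shape (2, 6)
```

The augmented matrix would then be fed to the network in place of the original feature vectors, doubling the input dimension.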

Original language: English
Pages: 263-267
Number of pages: 5
State: Published - 2002
Event: 2002 International Joint Conference on Neural Networks (IJCNN '02) - Honolulu, HI, United States
Duration: 12 May 2002 - 17 May 2002

Conference

Conference: 2002 International Joint Conference on Neural Networks (IJCNN '02)
Country/Territory: United States
City: Honolulu, HI
Period: 12/05/02 - 17/05/02
