TY - GEN
T1 - Proving the efficacy of complementary inputs for multilayer neural networks
AU - Andersen, Timothy L.
PY - 2011
Y1 - 2011
N2 - This paper proposes and discusses a backpropagation-based training approach for multilayer networks that counteracts the tendency of typical backpropagation-based training algorithms to favor examples with large input feature values. This problem can occur in any real-valued input space and can create a surprising degree of skew in the learned decision surface, even with relatively simple training sets. The proposed method modifies the original input feature vectors in the training set by appending complementary inputs, which essentially doubles the number of inputs to the network. This paper proves that this modification does not increase the network complexity by showing that it is possible to map the network with complementary inputs back into the original feature space.
AB - This paper proposes and discusses a backpropagation-based training approach for multilayer networks that counteracts the tendency of typical backpropagation-based training algorithms to favor examples with large input feature values. This problem can occur in any real-valued input space and can create a surprising degree of skew in the learned decision surface, even with relatively simple training sets. The proposed method modifies the original input feature vectors in the training set by appending complementary inputs, which essentially doubles the number of inputs to the network. This paper proves that this modification does not increase the network complexity by showing that it is possible to map the network with complementary inputs back into the original feature space.
UR - http://www.scopus.com/inward/record.url?scp=80054746240&partnerID=8YFLogxK
U2 - 10.1109/IJCNN.2011.6033480
DO - 10.1109/IJCNN.2011.6033480
M3 - Conference contribution
AN - SCOPUS:80054746240
SN - 9781457710865
T3 - Proceedings of the International Joint Conference on Neural Networks
SP - 2062
EP - 2066
BT - 2011 International Joint Conference on Neural Networks, IJCNN 2011 - Final Program
T2 - 2011 International Joint Conference on Neural Networks, IJCNN 2011
Y2 - 31 July 2011 through 5 August 2011
ER -