TY - GEN
T1 - Continuous Learning in a Single-Incremental-Task Scenario with Spike Features
AU - Vaila, Ruthvik
AU - Chiasson, John
AU - Saxena, Vishal
N1 - Publisher Copyright:
© 2020 ACM.
PY - 2020/7/28
Y1 - 2020/7/28
N2 - Deep Neural Networks (DNNs) have two key deficiencies: their dependence on high-precision computing and their inability to perform sequential learning, that is, when a DNN is trained on a first task and then trained on a second task, it forgets the first task. This phenomenon of forgetting previous tasks is referred to as catastrophic forgetting. On the other hand, a mammalian brain outperforms DNNs in terms of energy efficiency and the ability to learn sequentially without catastrophic forgetting. Here, we use bio-inspired Spike Timing Dependent Plasticity (STDP) in the feature extraction layers of the network, with instantaneous neurons, to extract meaningful features. In the classification sections of the network, we use a modified synaptic intelligence metric, which we refer to as the cost-per-synapse metric, as a regularizer to immunize the network against catastrophic forgetting in a Single-Incremental-Task (SIT) scenario. In this study, we use the MNIST handwritten digits dataset, divided into five sub-tasks.
AB - Deep Neural Networks (DNNs) have two key deficiencies: their dependence on high-precision computing and their inability to perform sequential learning, that is, when a DNN is trained on a first task and then trained on a second task, it forgets the first task. This phenomenon of forgetting previous tasks is referred to as catastrophic forgetting. On the other hand, a mammalian brain outperforms DNNs in terms of energy efficiency and the ability to learn sequentially without catastrophic forgetting. Here, we use bio-inspired Spike Timing Dependent Plasticity (STDP) in the feature extraction layers of the network, with instantaneous neurons, to extract meaningful features. In the classification sections of the network, we use a modified synaptic intelligence metric, which we refer to as the cost-per-synapse metric, as a regularizer to immunize the network against catastrophic forgetting in a Single-Incremental-Task (SIT) scenario. In this study, we use the MNIST handwritten digits dataset, divided into five sub-tasks.
KW - catastrophic forgetting
KW - feature extraction
KW - fisher information
KW - neural networks
KW - single-incremental-task
KW - STDP
UR - http://www.scopus.com/inward/record.url?scp=85091503748&partnerID=8YFLogxK
U2 - 10.1145/3407197.3407213
DO - 10.1145/3407197.3407213
M3 - Conference contribution
AN - SCOPUS:85091503748
T3 - ACM International Conference Proceeding Series
BT - ICONS 2020 - Proceedings of International Conference on Neuromorphic Systems 2020
T2 - 2020 International Conference on Neuromorphic Systems, ICONS 2020
Y2 - 28 July 2020 through 30 July 2020
ER -