Hebbnet: A Simplified Hebbian Learning Framework To Do Biologically Plausible Learning
Manas Gupta, Arulmurugan Ambikapathi, Savitha Ramasamy
SPS
Backpropagation has revolutionized neural network training; however, its biological plausibility remains questionable. Hebbian learning, a completely unsupervised and feedback-free learning technique, is a strong contender for a biologically plausible alternative. So far, however, it has either not matched backpropagation's accuracy or has required a very complex training procedure. In this work, we introduce a new Hebbian-learning-based neural network, called HebbNet. At the heart of HebbNet is a new Hebbian learning rule that we build up from first principles by adding two novel algorithmic updates to the basic Hebbian learning rule. This new rule makes Hebbian learning substantially simpler while also improving performance. Compared to the state of the art, we improve training dynamics by reducing the number of training epochs from 1500 to 200 and making training a one-step process instead of a two-step process. We also reduce heuristics by cutting the number of hyper-parameters from 5 to 1 and the number of hyper-parameter search runs from 12,600 to 13. Notwithstanding this, HebbNet still achieves strong test performance on the MNIST and CIFAR-10 datasets vs. the state of the art.
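For context, the basic Hebbian rule that the abstract builds on updates each weight in proportion to the correlation between pre-synaptic input and post-synaptic activation ("neurons that fire together, wire together"). The sketch below shows only that classic rule, not the two HebbNet-specific modifications (which the abstract does not detail); the function name `hebbian_update` and learning rate `eta` are illustrative choices.

```python
import numpy as np

def hebbian_update(W, x, eta=0.01):
    """One step of the basic (unsupervised, feedback-free) Hebbian rule:
    Delta W = eta * y * x^T, where y = W x is the post-synaptic response.
    This is the textbook rule only, not HebbNet's modified version."""
    y = W @ x                   # post-synaptic activations
    W = W + eta * np.outer(y, x)  # strengthen co-active input/output pairs
    return W

# Tiny usage example with random weights and a random input pattern.
rng = np.random.default_rng(0)
W = rng.standard_normal((4, 3)) * 0.1  # 4 output neurons, 3 inputs
x = rng.standard_normal(3)
W = hebbian_update(W, x)
```

Note that no labels or error signal appear anywhere in the update, which is what makes the rule fully local and unsupervised, unlike backpropagation.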
Chairs:
Robert Jenssen