1. Field
Certain aspects of the present disclosure generally relate to neural system engineering and, more particularly, to a method and apparatus for training a computational network with a local training rule that creates sparse connectivity.
2. Background
The developing brain of humans and animals undergoes a synaptic growth spurt in early childhood, followed by massive synaptic pruning that removes about half of the synapses by adulthood. Synaptic rewiring (structural plasticity) continues in the mature brain, but at a slower rate. Synaptic pruning has been found to be activity-dependent and to preferentially remove weaker synapses. Accordingly, it may be explained by a form of synaptic plasticity in which synapses compete for finite resources, such as neurotrophic factors. Synaptic pruning increases the efficiency of the brain, where efficiency can generally be defined as achieving the same functionality with fewer synapses. Since the transmission of signals through synapses requires energy, higher efficiency also means lower energy consumption.
Existing unsupervised learning rules model the synaptic competition for limited resources either explicitly, by multiplicative or subtractive normalization, or implicitly. The explicit normalizations, however, may be nonlocal, i.e., they may require knowledge of all input weights of a neuron in order to update each one of them individually, which may not be biologically plausible. The Oja rule, on the other hand, uses only information locally available to a synapse to compute its weight update, but it asymptotically constrains the sum of the squared weights, which lacks a biological justification.
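The distinction above between nonlocal explicit normalization and the local Oja rule can be illustrated with a minimal sketch. The function names and parameter values below are illustrative assumptions, not part of the disclosure; the sketch contrasts a Hebbian update with explicit multiplicative normalization (which needs the neuron's full weight vector) against the Oja update (which uses only quantities local to each synapse, yet implicitly drives the sum of squared weights toward 1).

```python
import numpy as np

rng = np.random.default_rng(0)

def hebb_multiplicative_norm(w, x, eta=0.01):
    """Hebbian growth followed by explicit multiplicative normalization.

    Nonlocal: the normalization divides by the norm of the FULL weight
    vector, so updating any one synapse requires knowledge of all of
    the neuron's input weights.
    """
    y = w @ x                      # postsynaptic activity
    w = w + eta * y * x            # plain Hebbian growth (unbounded)
    return w / np.linalg.norm(w)   # requires all weights of the neuron

def oja_update(w, x, eta=0.01):
    """Oja rule: each synapse i uses only x_i, y, and w_i (local
    information), yet the squared weights asymptotically sum to 1.
    """
    y = w @ x
    return w + eta * y * (x - y * w)

# Drive a single neuron with random inputs; the weight norm converges
# toward 1 under the Oja rule even with no explicit normalization step.
w = rng.standard_normal(5)
for _ in range(20000):
    x = rng.standard_normal(5)
    w = oja_update(w, x, eta=0.005)

print(np.sum(w**2))  # hovers near 1: the implicit constraint sum(w_i^2) -> 1
```

The asymptotic constraint demonstrated here, sum of squared weights approaching 1, is exactly the behavior the passage notes as lacking a biological justification, despite the rule's attractive locality.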