Exploring Column Update Elimination Optimization for Spike-Timing-Dependent Plasticity Learning Rule

University essay from KTH / School of Electrical Engineering and Computer Science (EECS)

Abstract: Hardware implementations of Hebbian learning rules store their synaptic weights as a two-dimensional matrix. Storing these weights demands large memory bandwidth and capacity. While memory units are optimized for row-wise access only, Hebbian learning rules such as spike-timing-dependent plasticity (STDP) require both row- and column-wise access. This dual access pattern dominates both the latency and the energy cost of realizing large-scale spiking neural networks in hardware. To reduce this memory access cost, a Column Update Elimination (CUE) optimization has previously been applied, with great efficacy, to the Bayesian Confidence Propagation Neural Network (BCPNN), which faces a similar dual-access challenge. This thesis explores the possibility of extending the CUE optimization to STDP by simulating the learning rule on a two-layer network of leaky integrate-and-fire neurons performing an image classification task. The spike times of every neuron in the network are recorded to derive a suitable probability distribution of per-neuron spike rates, which is then used to derive an ideal postsynaptic spike history buffer size for the given algorithm. The resulting reductions in memory accesses are analysed on this data to assess the feasibility of applying the optimization to the learning rule.
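To make the idea concrete, the following is a minimal Python sketch of deferred, buffer-based STDP updates in the spirit of CUE. It is an illustrative assumption, not the thesis implementation: the buffer size, learning rates, nearest-neighbour pairing scheme, and all function names are hypothetical. The point it shows is that postsynaptic spikes only append to a small fixed-size history buffer, while every weight-matrix access happens row-wise on presynaptic spikes.

import numpy as np

# Illustrative parameters (assumptions, not values from the thesis)
N_PRE, N_POST = 64, 32         # presynaptic / postsynaptic population sizes
H = 8                          # postsynaptic spike history buffer size per neuron
A_PLUS, A_MINUS = 0.01, 0.012  # STDP learning rates
TAU = 20.0                     # STDP time constant (ms)

weights = np.zeros((N_PRE, N_POST))    # row i holds all outgoing weights of pre-neuron i
history = [[] for _ in range(N_POST)]  # bounded per-neuron spike-time buffers, newest last
last_pre = np.full(N_PRE, -np.inf)     # previous spike time of each presynaptic neuron

def on_post_spike(j, t):
    # A postsynaptic spike only appends to a small buffer; the weight
    # matrix is never touched column-wise.
    history[j].append(t)
    if len(history[j]) > H:
        history[j].pop(0)  # the oldest deferred update is dropped when the buffer is full

def on_pre_spike(i, t):
    # All weight accesses happen here, row-wise over weights[i, :].
    for j in range(N_POST):
        buf = history[j]
        if not buf:
            continue
        # Deferred potentiation: post spikes buffered since neuron i's
        # previous spike pair (nearest-neighbour) with that earlier pre spike.
        if np.isfinite(last_pre[i]):
            for t_post in buf:
                if t_post > last_pre[i]:
                    weights[i, j] += A_PLUS * np.exp(-(t_post - last_pre[i]) / TAU)
        # Depression: pair this pre spike with the most recent post spike.
        weights[i, j] -= A_MINUS * np.exp(-(t - buf[-1]) / TAU)
    last_pre[i] = t

In a sketch like this, the buffer size H trades accuracy for memory: a buffered post spike that is evicted before the next presynaptic spike loses its deferred update, which is why the thesis derives the buffer size from the measured distribution of per-neuron spike rates.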
