Effects of Network Size in a Recurrent Bayesian Confidence Propagating Neural Network With two Synaptic Traces

University essay from KTH/Skolan för elektroteknik och datavetenskap (EECS)

Abstract: A modular Recurrent Bayesian Confidence Propagating Neural Network (BCPNN) with two synaptic time traces is a computational neural network that can serve as a model of biological short-term memory. The units in the network are grouped into modules called hypercolumns, within which a competitive winner-takes-all mechanism operates. In this work, the network's capacity to store sequential memories is investigated while varying the size and number of hypercolumns in the network. The network is trained on sets of temporal sequences, where each sequence consists of a set of symbols represented as semi-stable attractor state patterns in the network, and is evaluated by its ability to later recall the sequences. For a given distribution of training sequences, the network's ability to store and recall sequences was seen to increase significantly with the size of the hypercolumns. As the number of hypercolumns was increased, the storage capacity increased up to a clear plateau in most cases; beyond this point it remained constant and did not improve with additional hypercolumns (for a given sequence distribution). The storage capacity was also seen to depend strongly on the distribution of the sequences.
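The abstract describes units grouped into hypercolumns with a competitive winner-takes-all mechanism. A minimal sketch of how such a per-hypercolumn winner-takes-all step could look, assuming a flat activity vector partitioned into equal-sized hypercolumns (the function name and layout are illustrative, not taken from the thesis):

```python
import numpy as np

def winner_take_all(activity, n_hypercolumns, units_per_hc):
    """Hard winner-takes-all within each hypercolumn: the most
    active unit in each group is set to 1, all others to 0."""
    a = activity.reshape(n_hypercolumns, units_per_hc)
    out = np.zeros_like(a)
    out[np.arange(n_hypercolumns), a.argmax(axis=1)] = 1.0
    return out.reshape(-1)

# Example: 2 hypercolumns with 3 units each
act = np.array([0.2, 0.9, 0.1, 0.5, 0.4, 0.8])
print(winner_take_all(act, 2, 3))  # → [0. 1. 0. 0. 0. 1.]
```

In practice BCPNN implementations often use a soft competition (e.g. a softmax per hypercolumn) rather than this hard argmax; the hard version is shown only to make the one-active-unit-per-hypercolumn idea concrete.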
