Evaluating the Effects of Neural Noise in the Multidigraph Learning Rule

University essay from KTH/Skolan för elektroteknik och datavetenskap (EECS)

Authors: Gustav Bressler; Sigvard Dackevall [2023]


Abstract: There is a knowledge gap in the field of computational neuroscience: many learning models for neural networks fail to take into account the influence of neural noise. The purpose of this thesis was to address this gap by investigating the robustness of the Multidigraph learning rule (MDGL) when exposed to two kinds of neural noise: external and internal. The external noise was introduced as a random Poisson process in the form of network input, while the internal noise was implemented as a Gaussian process applied within every recurrent neuron in the network. A recurrent spiking neural network model was used, and as a benchmark the results for MDGL were compared to backpropagation through time (BPTT). The results showed that BPTT was more robust against both internal and external noise than MDGL; the accuracy and loss of the MDGL-trained network were negatively affected by both kinds of noise. Limitations of the work include the small number of simulation iterations and the use of a single, relatively small fixed-size network rather than networks of varying sizes.
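
Purely as an illustration of the two noise models described in the abstract (the thesis itself contains the actual implementation), a minimal sketch in Python/NumPy might look like the following. All names, network dimensions, and parameter values here are assumptions for illustration, not taken from the thesis; the recurrent spiking dynamics are a generic leaky integrate-and-fire model, not the specific model used in the work.

```python
import numpy as np

rng = np.random.default_rng(0)

# Network dimensions and simulation length (illustrative values)
n_in, n_rec = 20, 100
T = 200                      # number of simulation steps

# Random synaptic weights (assumed initialisation)
W_in = rng.normal(0.0, 0.1, (n_rec, n_in))
W_rec = rng.normal(0.0, 0.1, (n_rec, n_rec))

# External noise: input spikes drawn from a Poisson process
rate = 0.02                  # expected spikes per step per input channel
x = rng.poisson(rate, (T, n_in)).astype(float)

# Internal noise: Gaussian perturbation injected inside every neuron
sigma_int = 0.05

v = np.zeros(n_rec)          # membrane potentials
z = np.zeros(n_rec)          # spikes emitted at the previous step
alpha, v_th = 0.9, 1.0       # leak factor and firing threshold

for t in range(T):
    # Leaky integration of external input and recurrent spikes,
    # with Gaussian internal noise added to each neuron's potential
    v = (alpha * v + W_in @ x[t] + W_rec @ z
         + rng.normal(0.0, sigma_int, n_rec))
    z = (v >= v_th).astype(float)   # spike where threshold is crossed
    v = np.where(z > 0, 0.0, v)     # reset potential after spiking
```

In this sketch, setting `rate` to zero removes the external (Poisson) noise and setting `sigma_int` to zero removes the internal (Gaussian) noise, which mirrors the two noise conditions the thesis compares.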
