A Novel Transfer Function for Continuous Interpolation between Summation and Multiplication in Neural Networks

University essay from KTH/Skolan för datavetenskap och kommunikation (CSC)

Abstract: In this work, we present the implementation and evaluation of a novel parameterizable transfer function for use in artificial neural networks. It allows a continuous transition between summation and multiplication as the operation performed by a neuron. The transfer function is based on continuously differentiable fractional iterates of the exponential function and introduces one additional parameter per neuron and layer. This parameter can be trained alongside the weights and biases during standard gradient-based training. We evaluate the proposed transfer function by comparing its performance to that of conventional transfer functions on various regression problems. Interpolating between summation and multiplication achieves comparable or slightly better results, and it outperforms conventional transfer functions on a task involving missing data and multiplicative interactions between inputs.
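To illustrate the idea behind such a transfer function (this is a sketch, not the thesis's construction): writing exp^(n) for the n-th iterate of the exponential, one natural neuron form is y = exp^(n)(w · exp^(-n)(x)), which reduces to an ordinary weighted sum at n = 0 and to a weighted product, via the exp/log identity, at n = 1. The Python sketch below implements only these integer endpoints; the continuously differentiable fractional iterate that makes intermediate, trainable values of n possible is assumed but not reproduced here, and the names exp_iter and neuron are illustrative.

    import numpy as np

    def exp_iter(x, n):
        # n-th iterate of exp for integer n: n = 0 is the identity,
        # n = 1 is exp, n = -1 is log. The thesis replaces this with a
        # continuously differentiable fractional iterate so that n can
        # vary smoothly between 0 and 1.
        if n == 0:
            return x
        if n > 0:
            return exp_iter(np.exp(x), n - 1)
        return exp_iter(np.log(x), n + 1)

    def neuron(x, w, n):
        # y = exp^(n)( sum_i w_i * exp^(-n)(x_i) )
        return exp_iter(np.dot(w, exp_iter(x, -n)), n)

    x = np.array([2.0, 3.0])
    w = np.array([1.0, 1.0])
    print(neuron(x, w, 0))  # 5.0 -> summation: 2 + 3
    print(neuron(x, w, 1))  # 6.0 -> multiplication: exp(log 2 + log 3)

With a fractional iterate in place of the integer one, n becomes an extra per-neuron (or per-layer) parameter that gradient-based training can adjust between the additive and multiplicative regimes.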
