Evaluating Brain-Inspired Machine Learning Models for Time Series Forecasting: A Comparative Study on Dynamical Memory in Reservoir Computing and Neural Networks

University essay from KTH/Skolan för elektroteknik och datavetenskap (EECS)

Authors: Eddie Nevander Hellström; Johan Slettengren (2023)


Abstract: Brain-inspired computing is a promising research field, with potential to encourage breakthroughs within machine learning and enable us to solve complex problems in a more efficient way. This study aims to compare the performance of brain-like machine learning algorithms for time series forecasting. Three models were implemented: a vanilla recurrent neural network (VRNN), a more brain-like reservoir computing (RC) model, as well as a time-lagged version of the latter (TLRC). Additionally, an autoregressive integrated moving average (ARIMA) model was implemented to obtain benchmark results, since this is a well-established model with no connection to the brain. The performances were evaluated on a spectrum of univariate and multivariate time series, ranging from chaotic benchmark data to experimental data, such as temperature recordings. The results indicate that the reservoir computing models generally outperform the recurrent neural network, and that the considered models are at a disadvantage in practical scenarios. A common factor for these algorithms is that they exhibit a sense of dynamical memory. Hence, we compare these models not only in terms of their predictive accuracy, but also in terms of their capability for memory. To reach an analysis of dynamical memory, this study investigated the usefulness of the algorithms on varying amounts of available information. Lastly, the effect of the network/reservoir size was taken into account. The research suggests that the models cannot benefit meaningfully from having more neurons.
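To illustrate the kind of reservoir computing model the abstract describes, the sketch below implements a generic echo state network for one-step univariate forecasting. This is not the essay's implementation: the reservoir size, spectral radius, ridge penalty, and the toy sine-wave series are all assumed values chosen for illustration; only the defining idea (fixed random recurrent dynamics plus a trained linear readout) is taken from the standard RC formulation.

```python
# Minimal echo state network (ESN) sketch for one-step time series
# forecasting. Illustrative only: reservoir size, spectral radius,
# ridge penalty, and the toy data are assumed, not from the essay.
import numpy as np

rng = np.random.default_rng(0)

n_res = 100            # reservoir size (assumed)
spectral_radius = 0.9  # rescales recurrent weights toward the echo state property
ridge = 1e-6           # ridge regression penalty for the readout (assumed)

# Fixed random input and recurrent weights; only the readout is trained.
W_in = rng.uniform(-0.5, 0.5, size=n_res)
W = rng.uniform(-0.5, 0.5, size=(n_res, n_res))
W *= spectral_radius / np.max(np.abs(np.linalg.eigvals(W)))

def run_reservoir(u):
    """Drive the reservoir with input sequence u and collect its states."""
    x = np.zeros(n_res)
    states = []
    for u_t in u:
        x = np.tanh(W_in * u_t + W @ x)
        states.append(x.copy())
    return np.array(states)

# Toy univariate series: a noisy sine wave as a stand-in for benchmark data.
t = np.arange(500)
series = np.sin(0.1 * t) + 0.01 * rng.standard_normal(500)

u, y = series[:-1], series[1:]   # input at time t, target at time t+1
X = run_reservoir(u)

# Train the linear readout by ridge regression on the collected states.
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_res), X.T @ y)
pred = X @ W_out
print("train RMSE:", np.sqrt(np.mean((pred - y) ** 2)))
```

The reservoir's recurrent state is what gives the model its dynamical memory: each state is a nonlinear mixture of the current input and a fading trace of past inputs, so varying the reservoir size directly varies the capacity the abstract refers to.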
