Cortex inspired network architectures for spatio-temporal information processing

University essay from KTH/Skolan för datavetenskap och kommunikation (CSC)

Author: Tin Franovic; [2013]


Abstract: The abundance of high-dimensional datasets provides scientists with a strong foundation for their research. With high-performance computing platforms becoming increasingly available and more powerful, large-scale data processing represents an important step toward modeling and understanding the processes underlying such data. In this thesis, we propose a general cortex-inspired information processing network architecture capable of capturing spatio-temporal correlations in data and forming distributed representations as cortical activation patterns. The proposed architecture has a modular, multi-layered organization that is efficiently parallelized to allow large-scale computations. The network performs unsupervised processing of multivariate stochastic time series, regardless of the data source, producing a sparse, decorrelated representation of the input features expanded by time delays. The features extracted by the architecture are then used for supervised learning with Bayesian confidence propagation neural networks and evaluated on speech classification and recognition tasks. Owing to their rich temporal dynamics, auditory signals were chosen for speech recognition as the use case for performance evaluation. In terms of classification performance, the proposed architecture outperforms modern machine-learning methods such as support vector machines and obtains results comparable to other state-of-the-art speech recognition methods. The ability of the proposed scalable cortex-inspired approach to capture meaningful multivariate temporal correlations and provide insight into a model-free decomposition basis for high-dimensional data is expected to be of particular use in the analysis of large brain-signal datasets such as EEG or MEG.
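As a rough illustration of the pipeline described in the abstract, the sketch below chains time-delay expansion, PCA-based decorrelation, and top-k sparsification of a synthetic multivariate time series, then classifies the resulting codes with a Gaussian naive Bayes model as a crude stand-in for the BCPNN stage. This is not the thesis implementation: every function name, parameter value, and the synthetic data are illustrative assumptions.

# Minimal sketch (assumptions, not the thesis code): delay expansion,
# PCA whitening, top-k sparsification, and a naive Bayes stand-in for BCPNN.
import numpy as np

rng = np.random.default_rng(0)

def delay_expand(x, delays):
    """Stack time-delayed copies of a (T, d) series into (T, d * len(delays))."""
    cols = [np.roll(x, shift, axis=0) for shift in delays]
    return np.hstack(cols)[max(delays):]          # drop rows containing wrapped values

def whiten(X, eps=1e-6):
    """Decorrelate features with PCA whitening."""
    Xc = X - X.mean(axis=0)
    cov = Xc.T @ Xc / len(Xc)
    vals, vecs = np.linalg.eigh(cov)
    return Xc @ (vecs / np.sqrt(vals + eps))      # unit-variance, decorrelated features

def sparsify(X, k=10):
    """Keep only the k largest-magnitude activations per sample."""
    out = np.zeros_like(X)
    idx = np.argsort(-np.abs(X), axis=1)[:, :k]
    np.put_along_axis(out, idx, np.take_along_axis(X, idx, axis=1), axis=1)
    return out

def make_series(freq, T=400, d=8):
    """Synthetic two-class multivariate series (a stand-in for speech features)."""
    t = np.arange(T)[:, None]
    return np.sin(freq * t + rng.uniform(0, np.pi, d)) + 0.3 * rng.standard_normal((T, d))

X0 = sparsify(whiten(delay_expand(make_series(0.05), delays=range(5))))
X1 = sparsify(whiten(delay_expand(make_series(0.15), delays=range(5))))
X = np.vstack([X0, X1])
y = np.array([0] * len(X0) + [1] * len(X1))

def fit_nb(X, y):
    """Per-class mean, variance, and log prior for Gaussian naive Bayes."""
    return {c: (X[y == c].mean(0), X[y == c].var(0) + 1e-3,
                np.log((y == c).mean())) for c in np.unique(y)}

def predict_nb(params, X):
    """Assign each sample to the class with the highest posterior log-score."""
    scores = [(-0.5 * (((X - mu) ** 2) / var + np.log(2 * np.pi * var))).sum(1) + prior
              for mu, var, prior in params.values()]
    return np.array(list(params))[np.argmax(scores, axis=0)]

params = fit_nb(X, y)
print("training accuracy:", (predict_nb(params, X) == y).mean())

The stand-ins (PCA whitening, top-k thresholding, naive Bayes) only mimic the roles of the cortex-inspired decorrelation, sparsification, and BCPNN modules described in the thesis; they are chosen for brevity, not fidelity.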
