LDPC DropConnect

University essay from KTH, School of Electrical Engineering and Computer Science (EECS)

Abstract: Machine learning has become a scientific research tool in many fields. Overfitting is a common challenge in machine learning, where a model fits the training data too well and performs poorly on new data. Stochastic regularization is one method for preventing overfitting: it artificially constrains the model to be simpler. In this thesis, we investigate the use of tools from information and coding theory as regularization methods in machine learning. The motivation for this project comes from recent results that relate the generalization capability of learning algorithms to the information stored in the model parameters. This has led us to explore stochastic regularization techniques such as Dropout and DropConnect, which add sparsity to networks and can help control and limit the information that the parameters store about the training data. Specifically, we explore the use of parity-check matrices from coding theory as masks in the DropConnect method. Parity-check matrices describe linear relations that codewords must satisfy, and they have been shown to perform well as measurement matrices in compressed sensing. We build a new family of neural networks that apply Low-Density Parity-Check (LDPC) matrices as DropConnect masks, so-called LDPC DropConnect networks. We evaluate the performance of these networks on popular classification datasets and track their generalization capability with statistics of the LDPC matrices. Our experiments show that adopting LDPC matrices does not significantly improve generalization performance, but it helps provide a more robust evidence lower bound in the Bayesian approach. Our work may provide insights for further research on applying machine learning in compressed sensing, distributed computation, and other related areas.
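For readers who want a concrete picture of the idea, the following is a minimal sketch, assuming PyTorch, of DropConnect with a fixed sparse binary mask in place of the usual i.i.d. Bernoulli sampling. The names `MaskedLinear` and `regular_sparse_mask` are illustrative, not from the thesis, and the random row-regular 0/1 matrix is only a stand-in for a genuine LDPC parity-check matrix, which would come from a proper code construction (e.g. Gallager's).

```python
# Sketch of DropConnect with a fixed sparse binary mask (hypothetical code,
# not the thesis implementation). A real LDPC parity-check matrix would be
# built from a code construction; here a random row-regular 0/1 matrix
# mimics its low density.
import torch
import torch.nn as nn
import torch.nn.functional as F


def regular_sparse_mask(rows, cols, ones_per_row=3, generator=None):
    """Random binary mask with a fixed number of ones per row,
    imitating the row sparsity of a low-density parity-check matrix."""
    mask = torch.zeros(rows, cols)
    for r in range(rows):
        idx = torch.randperm(cols, generator=generator)[:ones_per_row]
        mask[r, idx] = 1.0
    return mask


class MaskedLinear(nn.Module):
    """Linear layer whose weight matrix is elementwise-masked:
    DropConnect with a fixed mask instead of per-step Bernoulli sampling."""

    def __init__(self, in_features, out_features, mask):
        super().__init__()
        self.linear = nn.Linear(in_features, out_features)
        self.register_buffer("mask", mask)  # fixed, not a trainable parameter

    def forward(self, x):
        # Zero out the masked connections before the affine transform.
        return F.linear(x, self.linear.weight * self.mask, self.linear.bias)


# Usage: a 784 -> 256 layer (MNIST-sized input) keeping 3 weights per output unit.
layer = MaskedLinear(784, 256, regular_sparse_mask(256, 784, ones_per_row=3))
out = layer(torch.randn(32, 784))
```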
