Machine Intelligence in Decoding of Forward Error Correction Codes

University essay from KTH/Teknisk informationsvetenskap

Author: Navneet Agrawal; [2017]


Abstract: A deep learning algorithm for improving the performance of Sum-Product Algorithm (SPA) based decoders is investigated. The proposed Neural Network Decoder (NND) [22] generalizes the SPA by assigning weights to the edges of the Tanner graph. We elucidate the design, training, and operation of the NND. We analyze the distribution of edge weights in the trained NND and provide deeper insight into its working. The training process learns the edge weights in such a way that the effects of artifacts in the Tanner graph (such as cycles or trapping sets) are mitigated, leading to a significant performance improvement over the SPA.

We conduct an extensive analysis of the training hyper-parameters affecting the performance of the NND, and present hypotheses for determining appropriate choices for different families and sizes of codes. Experimental results are used to verify the hypotheses and the rationale presented. Furthermore, we propose a new loss function that improves performance over the standard cross-entropy loss. We also investigate the limitations of the NND in terms of complexity and performance. Although the SPA-based design of the NND enables faster training and reduced complexity, the design constraints prevent the neural network from reaching its maximum potential. Our experiments show that the NND is unable to reach the Maximum Likelihood (ML) performance threshold for any plausible set of hyper-parameters. However, for short-length (n ≤ 128) High Density Parity Check (HDPC) codes such as Polar or BCH codes, the performance improvement over the SPA is significant.
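The core idea the abstract describes — SPA message passing on a Tanner graph, generalized by attaching a weight to each edge — can be illustrated with a minimal NumPy sketch. This is not the thesis's implementation: the (7,4) Hamming parity-check matrix, the function name, and the fixed weights of 1.0 (which recover the plain SPA; the NND would instead learn these by gradient descent) are all assumptions for illustration.

```python
import numpy as np

# Assumed toy example: parity-check matrix of the (7,4) Hamming code.
H = np.array([
    [1, 1, 0, 1, 1, 0, 0],
    [1, 0, 1, 1, 0, 1, 0],
    [0, 1, 1, 1, 0, 0, 1],
])
m_checks, n_vars = H.shape
# Edges of the Tanner graph: (check node, variable node) pairs where H is 1.
edges = [(c, v) for c in range(m_checks) for v in range(n_vars) if H[c, v]]

def weighted_spa_decode(llr, weights, n_iters=5):
    """Weighted SPA over the Tanner graph; weights[e] = 1 for all e
    recovers the standard Sum-Product Algorithm (a NND would learn
    these per-edge weights during training)."""
    c2v = {e: 0.0 for e in edges}  # check-to-variable messages
    for _ in range(n_iters):
        # Variable-to-check: channel LLR plus weighted extrinsic messages.
        v2c = {}
        for (c, v) in edges:
            v2c[(c, v)] = llr[v] + sum(
                weights[(c2, v2)] * c2v[(c2, v2)]
                for (c2, v2) in edges if v2 == v and c2 != c)
        # Check-to-variable: tanh-product (box-plus) rule.
        for (c, v) in edges:
            prod = 1.0
            for (c2, v2) in edges:
                if c2 == c and v2 != v:
                    prod *= np.tanh(0.5 * v2c[(c, v2)])
            prod = np.clip(prod, -0.999999, 0.999999)  # keep arctanh finite
            c2v[(c, v)] = 2.0 * np.arctanh(prod)
    # Marginal LLRs and hard decision (LLR < 0 -> bit 1).
    out = np.array([llr[v] + sum(weights[(c, v2)] * c2v[(c, v2)]
                                 for (c, v2) in edges if v2 == v)
                    for v in range(n_vars)])
    return (out < 0).astype(int)
```

With all weights set to 1.0 and channel LLRs favouring the all-zero codeword (e.g. one weakly flipped bit), the decoder converges to the all-zero hard decision; in the NND the same update equations are unrolled as network layers and the per-edge weights are trained instead of fixed.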
