Deep Learning for Error Prediction in MIMO-OFDM Systems with a Maximum Likelihood Detector
Abstract: To increase link throughput in multiple-input multiple-output (MIMO) orthogonal frequency-division multiplexing (OFDM) systems, transmission parameters such as the code rate and modulation order must be set adaptively. The block error rate (BLER) is therefore a crucial measure of link quality and is used in Link Adaptation (LA) to select the transmission parameters. However, existing methods for predicting the BLER are only valid for linear detectors, e.g. the Minimum Mean Square Error (MMSE) detector [1]. In this thesis, we show that a signal-to-interference-plus-noise ratio (SINR) can be defined for a MIMO-OFDM system with maximum likelihood detection (MLD). A machine learning method based on a Deep Neural Network (DNN) is then proposed to learn the relation between input features (the channel matrix, the modulation and coding scheme (MCS), and the signal-to-noise ratio (SNR)) and labels (CRC pass/fail). The results show that the DNN classifies blocks well, although the BLER derived from the DNN output still deviates from the simulated BLER.
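The approach described above can be sketched in miniature: a small neural network maps per-block features (channel state, MCS, SNR) to a CRC-failure probability, and the predicted BLER is the mean of those probabilities. Everything below is an illustrative assumption — the feature layout, the synthetic label rule, and the network size are placeholders, not the thesis' actual setup.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_blocks(n, n_feat=8):
    """Synthetic per-block features: |H| magnitudes, SNR (dB), MCS index (all toy)."""
    h = np.abs(rng.normal(size=(n, n_feat)))       # stand-in for channel matrix entries
    snr = rng.uniform(0.0, 20.0, size=(n, 1))      # SNR in dB
    mcs = rng.integers(0, 8, size=(n, 1)).astype(float)
    x = np.hstack([h, snr, mcs])
    # Toy label rule: a block fails CRC when the SNR is too low for the chosen MCS.
    y = (snr[:, 0] < 2.0 * mcs[:, 0] + rng.normal(scale=1.0, size=n)).astype(float)
    return x, y

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train(x, y, hidden=16, lr=0.05, epochs=500):
    """One-hidden-layer MLP, full-batch gradient descent on binary cross-entropy.

    Output is the per-block CRC-failure probability.
    """
    n, d = x.shape
    mu, sd = x.mean(0), x.std(0) + 1e-8
    x = (x - mu) / sd                              # standardize features
    w1 = rng.normal(scale=0.5, size=(d, hidden)); b1 = np.zeros(hidden)
    w2 = rng.normal(scale=0.5, size=hidden);      b2 = 0.0
    for _ in range(epochs):
        h = np.tanh(x @ w1 + b1)
        p = sigmoid(h @ w2 + b2)
        g = (p - y) / n                            # dBCE/dlogit
        w2 -= lr * h.T @ g; b2 -= lr * g.sum()
        gh = np.outer(g, w2) * (1 - h**2)          # backprop through tanh layer
        w1 -= lr * x.T @ gh; b1 -= lr * gh.sum(0)
    return lambda xq: sigmoid(np.tanh(((xq - mu) / sd) @ w1 + b1) @ w2 + b2)

x, y = make_blocks(4000)
predict = train(x, y)

xt, yt = make_blocks(2000)
pred_bler = predict(xt).mean()                     # BLER predicted by the DNN
sim_bler = yt.mean()                               # "simulated" BLER (label average)
```

Averaging the predicted failure probabilities over many blocks gives a BLER estimate that can be compared directly against the simulated BLER, which is how the deviation mentioned in the abstract would be measured.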