Latency Prediction in 5G Networks by using Machine Learning

University essay from Lunds universitet/Institutionen för reglerteknik (Lund University, Department of Automatic Control)

Abstract: This thesis presents a study of predicting latency in a 5G network using deep learning techniques. The training set contained network-parameter data together with the measured latency, collected in a 5G lab environment during four different test scenarios. Four machine learning models were trained: a Feedforward Neural Network (FNN), a Convolutional Neural Network (CNN), a Recurrent Neural Network (RNN), and a Long Short-Term Memory (LSTM) network. After the initial implementation, each model was refined using Bayesian optimization for Hyperparameter Optimization (HPO). In addition, both the standard mean squared error (MSE) and a custom asymmetric mean squared error (AMSE) were used as loss functions. Overall, all models could predict the latency behavior, although the FNN model was reactive rather than predictive and therefore not suitable for this task. Before Bayesian optimization, the models excluding the FNN had an R² score of 0.88–0.95; after Bayesian optimization the score increased to 0.96–0.98 on the first data set. The results indicate that custom loss functions can make the models even more suitable for practical use by penalizing underpredictions more severely than overpredictions.
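The asymmetric loss idea mentioned in the abstract can be sketched as follows. This is a minimal illustration, not the thesis's actual AMSE definition: the weighting scheme and the `under_weight` parameter are assumptions chosen to show how underpredictions can be penalized more severely than overpredictions.

```python
import numpy as np

def amse(y_true, y_pred, under_weight=2.0):
    """Asymmetric mean squared error (illustrative sketch).

    Underpredictions (y_pred < y_true, i.e. predicted latency lower than
    actual) are weighted by `under_weight`; overpredictions keep weight 1.
    The exact weighting in the thesis may differ.
    """
    err = y_true - y_pred
    # err > 0 means the model underpredicted the latency
    weights = np.where(err > 0, under_weight, 1.0)
    return float(np.mean(weights * err ** 2))

# With under_weight=1.0 the loss reduces to the ordinary MSE.
```

In a latency-prediction setting this asymmetry is attractive because underestimating latency is typically the costlier mistake (e.g. a scheduler acting on an optimistic prediction), so the model is pushed toward cautious overestimates.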
