Reducing Inter-cell Interference Using Machine Learning

University essay from Lunds universitet/Institutionen för elektro- och informationsteknik

Abstract: One way of meeting the increasing demand for higher data rates is to build denser cellular networks in order to maximize the use of the frequency spectrum. Denser deployment leads to an increased probability of inter-cell interference, the phenomenon where the signal quality experienced by a user served by one cell is degraded by transmissions from neighboring cells in the network. Coordinated multi-point techniques such as dynamic point blanking (DPB) can be used to dynamically mute resources in the network and increase the achievable capacity of the system. In this thesis, the candidate muting patterns are enumerated using a search tree structure, and the optimal pattern is found by evaluating each node in the tree. In some scenarios it is not necessary to evaluate all nodes to find a muting pattern with satisfactory performance, which makes it possible to save computational power. Three machine learning models (logistic regression, support vector machine, and naive Bayes) have been used to perform binary classification of the search width needed to bring the mean interference experienced by users below a certain threshold. The data needed to train the models was generated in an Ericsson simulator, with features extracted from the output logs. Results show that the support vector machine is the most successful of the three, predicting the search width correctly in 87 % of the samples. The conclusion is that machine learning techniques can be used to predict the required search width in a given scenario, but more research is needed to quantify the computational gain from reducing the number of evaluated muting patterns.
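The abstract's core idea, enumerating muting patterns with a bounded-width tree search and then learning to predict how wide the search must be, can be sketched as follows. This is a minimal illustration under assumed simplifications: the interference metric, the features (mean, max, and standard deviation of the cross-cell gains), and the balanced labeling threshold are all hypothetical stand-ins, not the thesis's Ericsson-simulator setup or feature set.

```python
# Minimal sketch of the approach described in the abstract. The interference
# model, features, and labeling threshold are illustrative assumptions, not
# the thesis's Ericsson-simulator setup.
from itertools import combinations

import numpy as np
from sklearn.svm import SVC

N_CELLS = 4

def mean_interference(muted, gains):
    """Toy metric: mean interference at each active cell from the other
    active (unmuted) cells, given a cross-cell gain matrix."""
    active = [c for c in range(N_CELLS) if c not in muted]
    levels = [sum(gains[i][j] for j in active if j != i) for i in active]
    return float(np.mean(levels)) if levels else 0.0

def best_pattern(gains, width):
    """Evaluate all muting patterns with up to `width` muted cells (the
    levels of the search tree); return the best pattern and its value."""
    best, best_val = frozenset(), mean_interference(frozenset(), gains)
    for depth in range(1, width + 1):
        for muted in combinations(range(N_CELLS), depth):
            val = mean_interference(frozenset(muted), gains)
            if val < best_val:
                best, best_val = frozenset(muted), val
    return best, best_val

# Generate toy training data: for each random gain matrix, label whether a
# shallow search (width 1) already reaches a low mean interference.
rng = np.random.default_rng(0)
samples, shallow_vals = [], []
for _ in range(200):
    gains = rng.exponential(0.5, size=(N_CELLS, N_CELLS))
    np.fill_diagonal(gains, 0.0)
    _, val = best_pattern(gains, width=1)
    samples.append([gains.mean(), gains.max(), gains.std()])  # assumed features
    shallow_vals.append(val)

# Median threshold keeps the binary classes balanced in this toy setup.
threshold = float(np.median(shallow_vals))
labels = [int(v < threshold) for v in shallow_vals]

clf = SVC(kernel="rbf").fit(samples[:150], labels[:150])
print("held-out accuracy:", clf.score(samples[150:], labels[150:]))
```

If the classifier predicts that a shallow search suffices, the full tree need not be expanded, which is the computational saving the thesis investigates; the hypothetical features here would in practice be replaced by quantities extracted from the simulator logs.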
