Bootstrapping Annotated Job Ads using Named Entity Recognition and Swedish Language Models

University essay from KTH, School of Electrical Engineering and Computer Science (EECS)

Abstract: Named entity recognition (NER) is the task of detecting and categorising certain information in text. A promising approach to NER that has recently emerged is fine-tuning Transformer-based language models for this specific task. However, these models may require a relatively large quantity of labelled data to perform well, which can limit their applicability in real-world settings, as manual annotation is often costly and time-consuming. In this thesis, we investigate the learning curve of human annotators and of a NER model during a semi-supervised bootstrapping process, with special emphasis on how performance depends on the number of classes and the amount of training data. We first annotate a set of collected job advertisements and then apply bootstrapping using both annotated and unannotated data, continuously fine-tuning a pre-trained Swedish BERT model. The initial class system is simplified during the bootstrapping process according to model performance and inter-annotator agreement. Model performance increased as the training set grew, reaching a final micro F1-score of 54%. This result provides a good baseline, and we point out several improvements that could further enhance performance. We also identify classes that were handled differently by the annotators and discuss potential reasons for this. Suggestions for future work include further adjusting the class system by removing the classes identified as low-performing in this thesis.
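The micro F1-score reported above pools true positives, false positives, and false negatives across all entity classes before computing precision and recall, so frequent classes weigh more than rare ones. As a minimal sketch of the metric, the function below computes token-level micro F1 from gold and predicted label sequences; this is a simplification (span-level matching is also common for NER), and the example labels such as `B-SKILL` are hypothetical, not the class system used in the thesis.

```python
def micro_f1(gold, pred):
    """Token-level micro-averaged F1 over label sequences.

    Counts are pooled across all classes; the 'O' (outside) label
    marks tokens that belong to no entity and is excluded from
    true positives.
    """
    tp = fp = fn = 0
    for g, p in zip(gold, pred):
        if p != "O" and p == g:
            tp += 1  # predicted an entity label and it matches gold
        elif p != "O":
            fp += 1  # predicted a wrong entity label
            if g != "O":
                fn += 1  # ...and missed the gold entity label
        elif g != "O":
            fn += 1  # predicted 'O' where gold has an entity
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    denom = precision + recall
    return 2 * precision * recall / denom if denom else 0.0

# Usage with hypothetical labels: one match, one miss, one spurious prediction
gold = ["B-SKILL", "O", "B-LOC", "O"]
pred = ["B-SKILL", "O", "O", "B-LOC"]
print(micro_f1(gold, pred))  # precision = recall = 0.5, so F1 = 0.5
```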
