An evaluation of BERT for a Span-based Approach for Jointly Predicting Entities, Coreference Clusters and Relations Between Entities

University essay from KTH, School of Electrical Engineering and Computer Science (EECS)

Author: Ulme Wennberg (2019)


Abstract: This degree project examines and evaluates ways of improving the contextualization of text span representations within a general multi-task learning framework for named entity recognition, coreference resolution and relation extraction. A span-based approach is used in which all possible text spans are enumerated, iteratively refined and finally scored. This work examines which ways of contextualizing the span representations are beneficial when using the text embedder BERT. Furthermore, I evaluate to what degree graph propagation can be used together with BERT to enhance performance further, and observe F1-score improvements over previous work. The architecture sets new state-of-the-art results on four datasets from different domains: SciERC, ACE2005, GENIA and WLPC. Qualitative examples are provided to highlight model behaviour, and reasons for the improvements are discussed.
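To make the span-based approach in the abstract more concrete, the sketch below enumerates all candidate spans up to a maximum width over BERT token embeddings and builds a simple span representation from the endpoint vectors. This is not the thesis implementation: the model name, the maximum span width and the endpoint-concatenation representation are illustrative assumptions only.

```python
# Minimal sketch (assumed setup, not the thesis code): enumerate all spans up
# to a maximum width over BERT token embeddings and represent each span by
# concatenating its endpoint vectors, as is common in span-based IE models.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-cased")  # assumed model
bert = AutoModel.from_pretrained("bert-base-cased")

sentence = "BERT improves span-based information extraction."
inputs = tokenizer(sentence, return_tensors="pt")
with torch.no_grad():
    token_embs = bert(**inputs).last_hidden_state[0]  # (num_tokens, hidden)

max_span_width = 8  # assumed hyperparameter
spans = []
num_tokens = token_embs.size(0)
for start in range(num_tokens):
    for end in range(start, min(start + max_span_width, num_tokens)):
        # Endpoint concatenation; real systems typically add width features,
        # attention over span tokens, and iterative (graph) refinement.
        span_repr = torch.cat([token_embs[start], token_embs[end]])
        spans.append(((start, end), span_repr))

print(f"{len(spans)} candidate spans, each of dimension {spans[0][1].numel()}")
```

In practice these candidate span representations would then be pruned, refined (e.g. via graph propagation) and scored by task-specific heads for entities, coreference and relations.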
