Emotion Classification with Natural Language Processing (Comparing BERT and Bi-Directional LSTM models for use with Twitter conversations)

University essay from Lunds universitet/Matematisk statistik

Abstract: We have constructed a novel neural network architecture called CWE-LSTM (concatenated word-emoji bidirectional long short-term memory) for classifying emotions in Twitter conversations. The architecture is based on a combination of word and emoji embeddings with domain specificity in Twitter data. Its performance is compared to a current state-of-the-art natural language processing model from Google, BERT. We show that CWE-LSTM is more successful at classifying emotions in Twitter conversations than BERT (F1 score of 73 versus 69). Furthermore, we hypothesize why the domain specificity of this type of problem makes it a poor candidate for transfer learning with BERT. This contributes to the ongoing discussion of large, general models versus slimmer, domain-specific models in the field of natural language processing.
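The abstract does not give implementation details, but the core idea of a "concatenated word-emoji" input can be sketched as follows: each token receives both a word embedding and an emoji embedding (with padding rows for tokens that are not words or not emojis), and the two vectors are concatenated per token before the sequence is fed to a bidirectional LSTM. All names, vocabularies, and dimensions below are illustrative assumptions, not taken from the essay.

```python
import numpy as np

# Toy dimensions; real systems typically use far larger embeddings.
rng = np.random.default_rng(0)
word_dim, emoji_dim = 8, 4

# Hypothetical vocabularies with padding entries for "missing" modality.
word_vocab = {"great": 0, "game": 1, "<no_word>": 2}
emoji_vocab = {"⚽": 0, "😀": 1, "<no_emoji>": 2}

# Randomly initialised embedding tables (would be trained in practice).
word_emb = rng.normal(size=(len(word_vocab), word_dim))
emoji_emb = rng.normal(size=(len(emoji_vocab), emoji_dim))

def embed_token(token):
    """Concatenate the word and emoji embeddings for a single token."""
    w_idx = word_vocab.get(token, word_vocab["<no_word>"])
    e_idx = emoji_vocab.get(token, emoji_vocab["<no_emoji>"])
    return np.concatenate([word_emb[w_idx], emoji_emb[e_idx]])

# One tweet becomes a (sequence_length, word_dim + emoji_dim) matrix,
# which would then be the input to a bidirectional LSTM classifier.
sentence = ["great", "game", "⚽"]
X = np.stack([embed_token(t) for t in sentence])
```

The resulting matrix `X` has one row per token and `word_dim + emoji_dim` columns; in a full model this sequence would be consumed by a bidirectional LSTM followed by a softmax over emotion classes.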
