Essays about: "FastText"
Showing results 11-15 of 24 essays containing the word FastText.
-
11. Automatic language identification of short texts
University essay from Uppsala universitet/Avdelningen för beräkningsvetenskap. Abstract: The world is growing more connected through the use of online communication, exposing software and humans to all the world's languages. While devices are able to understand and share raw data between themselves and with humans, the information itself is not expressed in a monolithic format.
-
12. Data Augmentation in Solving Data Imbalance Problems
University essay from KTH/Skolan för elektroteknik och datavetenskap (EECS). Abstract: This project focuses on methods for solving data imbalance problems in the Natural Language Processing (NLP) field. Imbalanced text data is a common problem in many tasks, especially classification, and leads to models that cannot predict the minority class well.
-
13. Sentiment analysis of movie reviews in Chinese
University essay from Uppsala universitet/Institutionen för lingvistik och filologi. Abstract: Sentiment analysis aims at figuring out the opinions of users towards a certain service or product. In this research, the aim is to classify the sentiments of users based on the comments they have posted on the Douban movie website.
-
14. Using NLP Techniques for Log Analysis to Recommend Activities For Troubleshooting Processes
University essay from KTH/Skolan för elektroteknik och datavetenskap (EECS). Abstract: Continuous Integration is the practice of building and testing software every time a code change is merged into its codebase. At the merge, the source code is compiled, dependencies are resolved, and test cases are executed.
-
15. Swedish NLP Solutions for Email Classification
University essay from KTH/Skolan för elektroteknik och datavetenskap (EECS). Abstract: Assigning categories to text communications is a common task in Natural Language Processing (NLP). In 2018, a new deep learning language representation model, Bidirectional Encoder Representations from Transformers (BERT), was developed, which can make inferences from text without task-specific architecture.