Essays about: "naturligtspråkbehandling" (Swedish for natural language processing)
Showing results 1–5 of 6 essays containing the word "naturligtspråkbehandling".
-
1. Classification of invoices using a 2D NLP approach : A comparison between methods for invoice information extraction for the purpose of classification
University essay from KTH/Skolan för elektroteknik och datavetenskap (EECS). Abstract: Many companies handle a large number of invoices every year, and categorizing them manually takes considerable time and resources. For a model to categorize invoices automatically, the documents must first be read and processed correctly by the model.
-
2. Evaluating the robustness of DistilBERT to data shift in toxicity detection
University essay from KTH/Skolan för elektroteknik och datavetenskap (EECS). Abstract: With the rise of social media, cyberbullying and the online spread of hate have become serious problems with devastating consequences. Mentimeter is an interactive presentation tool that enables the presentation audience to participate by typing their own answers to questions asked by the presenter.
-
3. Classifying and Comparing Latent Space Representation of Unstructured Log Data
University essay from KTH/Skolan för elektroteknik och datavetenskap (EECS). Abstract: This thesis explores and compares various methods for producing vector representations of unstructured log data. Ericsson wanted to investigate machine learning methods for analyzing logs produced by its systems, in order to reduce the cost and effort required for manual log analysis.
-
4. Automating Question Generation Given the Correct Answer
University essay from KTH/Skolan för elektroteknik och datavetenskap (EECS). Abstract: In this thesis, we propose an end-to-end deep learning model for a question generation task. Given a Wikipedia article written in English and a segment of text appearing in the article, the model can generate a simple question whose answer is the given text segment. The model is based on an encoder-decoder architecture.
-
5. Using Bidirectional Encoder Representations from Transformers for Conversational Machine Comprehension
University essay from KTH/Skolan för elektroteknik och datavetenskap (EECS). Abstract: Bidirectional Encoder Representations from Transformers (BERT) is a recently proposed language representation model, designed to pre-train deep bidirectional representations with the goal of extracting context-sensitive features from an input text [1]. One of the challenging problems in the field of Natural Language Processing is Conversational Machine Comprehension (CMC).