Essays about: "Frågebesvaring" (Swedish for "question answering")
Found 5 essays containing the word Frågebesvaring.
-
1. Synthetic data generation for domain adaptation of a retriever-reader Question Answering system for the Telecom domain : Comparing dense embeddings with BM25 for Open Domain Question Answering
University essay from KTH/Skolan för elektroteknik och datavetenskap (EECS). Abstract: Having computer systems capable of answering questions has been a long-standing goal of Natural Language Processing research. Machine Learning systems have recently become increasingly proficient at this task, with large language models obtaining state-of-the-art performance.
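This essay compares dense embeddings with BM25 for retrieval. As context for that comparison, the following is a minimal standard-library sketch of the Okapi BM25 scoring function; the toy corpus, the function name, and the default parameters (k1 = 1.5, b = 0.75) are illustrative assumptions, not taken from the thesis.

```python
import math
from collections import Counter

def bm25_scores(query, docs, k1=1.5, b=0.75):
    """Score each document against the query with Okapi BM25 (sparse lexical retrieval)."""
    tokenized = [d.lower().split() for d in docs]
    n = len(tokenized)
    avgdl = sum(len(d) for d in tokenized) / n
    # Document frequency: in how many documents each term occurs.
    df = Counter()
    for d in tokenized:
        df.update(set(d))
    scores = []
    for d in tokenized:
        tf = Counter(d)
        s = 0.0
        for t in query.lower().split():
            idf = math.log((n - df[t] + 0.5) / (df[t] + 0.5) + 1)
            # Term-frequency saturation (k1) and length normalisation (b).
            denom = tf[t] + k1 * (1 - b + b * len(d) / avgdl)
            s += idf * tf[t] * (k1 + 1) / denom
        scores.append(s)
    return scores

docs = [
    "BM25 is a sparse lexical retrieval function",
    "dense embeddings encode passages as vectors",
    "the telecom domain uses specialised vocabulary",
]
scores = bm25_scores("sparse retrieval", docs)
```

Unlike dense embeddings, BM25 only matches exact terms, which is precisely why domain adaptation of the dense retriever matters in a vocabulary-heavy domain such as telecom.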
-
2. Prompt engineering and its usability to improve modern psychology chatbots
University essay from KTH/Skolan för elektroteknik och datavetenskap (EECS). Abstract: As chatbots and Large Language Models (LLMs) such as GPT-3.5 and GPT-4 continue to advance, their applications expand into diverse fields, including psychology. This study investigates the effectiveness of LLMs optimized through prompt engineering, aiming to enhance their performance in psychological applications.
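Prompt engineering of the kind this essay studies typically means crafting the system prompt and conversation structure sent to the model. A minimal sketch, assuming a chat-completion-style message format; the persona text and function name are my own illustrations, not the prompts used in the thesis.

```python
def build_prompt(user_message, history=()):
    """Assemble a chat-style message list with an engineered system prompt."""
    # Hypothetical persona instructions for a psychology chatbot.
    system = (
        "You are a supportive, non-judgemental listener. "
        "Reflect the user's feelings, ask open questions, "
        "and never give medical diagnoses."
    )
    messages = [{"role": "system", "content": system}]
    for role, content in history:
        messages.append({"role": role, "content": content})
    messages.append({"role": "user", "content": user_message})
    return messages

msgs = build_prompt("I have felt anxious all week.")
```

The engineering effort lives almost entirely in the system string and few-shot history; the surrounding code stays trivial.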
-
3. Investigating the Effect of Complementary Information Stored in Multiple Languages on Question Answering Performance : A Study of the Multilingual-T5 for Extractive Question Answering
University essay from KTH/Skolan för elektroteknik och datavetenskap (EECS). Abstract: Extractive question answering is a popular task in the field of natural language processing, where machine learning models are tasked with answering questions given a context. Historically, the field has centered on monolingual models, but more and more multilingual models have recently been developed, such as Google’s MT5 [1].
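In extractive question answering, the model's answer is a span copied from the context, and predictions are conventionally scored with exact match and token-overlap F1 (the SQuAD-style metrics). A minimal stdlib sketch of the F1 metric; the function name is mine, and this is not claimed to be the exact evaluation code of the thesis.

```python
from collections import Counter

def token_f1(prediction, reference):
    """Token-overlap F1 between a predicted answer span and a reference answer."""
    pred = prediction.lower().split()
    ref = reference.lower().split()
    # Multiset intersection of tokens shared by prediction and reference.
    common = Counter(pred) & Counter(ref)
    overlap = sum(common.values())
    if overlap == 0:
        return 0.0
    precision = overlap / len(pred)
    recall = overlap / len(ref)
    return 2 * precision * recall / (precision + recall)

score = token_f1("the MT5 model", "MT5 model")
```

Token F1 gives partial credit when the predicted span overlaps the gold answer, which matters when answers in different languages tokenise differently.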
-
4. Bidirectional Encoder Representations from Transformers (BERT) for Question Answering in the Telecom Domain : Adapting a BERT-like language model to the telecom domain using the ELECTRA pre-training approach
University essay from KTH/Skolan för elektroteknik och datavetenskap (EECS). Abstract: The Natural Language Processing (NLP) research area has seen notable advancements in recent years, one being the ELECTRA model, which improves the sample efficiency of BERT pre-training by introducing a discriminative pre-training approach. Most publicly available language models are trained on general-domain datasets.
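ELECTRA's discriminative objective (replaced-token detection) can be illustrated with a toy data-preparation sketch: some tokens are replaced, and the discriminator must label every token as original or replaced. The replacement vocabulary, function name, and probabilities below are illustrative assumptions, not the thesis's setup.

```python
import random

def corrupt(tokens, vocab, mask_prob=0.3, rng=None):
    """Replace some tokens and emit per-token labels (1 = replaced, 0 = original)."""
    rng = rng or random.Random(0)  # seeded for a reproducible sketch
    corrupted, labels = [], []
    for tok in tokens:
        if rng.random() < mask_prob:
            corrupted.append(rng.choice(vocab))  # stand-in for the generator's sample
        else:
            corrupted.append(tok)
        # If the replacement happens to equal the original, it counts as original,
        # mirroring how ELECTRA treats generator samples that match the input.
        labels.append(int(corrupted[-1] != tok))
    return corrupted, labels

tokens = "the cat sat on the mat".split()
corrupted, labels = corrupt(tokens, vocab=["dog", "ran", "hat"])
```

Because every position yields a training signal (not just the ~15% masked in BERT), this objective is more sample-efficient, which is the property the essay exploits for telecom-domain adaptation.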
-
5. A Method for Automatic Question Answering in Swedish based on BERT
University essay from KTH/Skolan för elektroteknik och datavetenskap (EECS). Abstract: This report presents a method for automatic reading comprehension in Swedish. The method is based on BERT, a pre-trained Swedish neural network language model, which was fine-tuned on a Swedish question-answer corpus.