Essays about: "QA Model"

Showing results 11 - 15 of 19 essays containing the words QA Model.

  11. Large-Context Question Answering with Cross-Lingual Transfer

    University essay from Uppsala universitet/Institutionen för informationsteknologi

    Author : Markus Sagen; [2021]
    Keywords : Long-Context Multilingual Model; Longformer XLM-R; Longformer; Long-term Context; Extending Context; Extend Context; Large-Context; Long-Context; Large Context; Long Context; Cross-Lingual; Multi-Lingual; Cross Lingual; Multi Lingual; QA; Question-Answering; Question Answering; Transformer model; Machine Learning; Transfer Learning; SQuAD; Memory; Efficient; Monolingual; Multilingual; QA model; Language Model; Huggingface; BERT; RoBERTa; XLM-R; mBERT; Multilingual BERT; Efficient Transformers; Reformer; Linformer; Performer; Transformer-XL; Wikitext-103; TriviaQA; HotpotQA; WikiHopQA; VINNOVA; Peltarion; AI; LM; MLM; Deep Learning; Natural Language Processing; NLP; Attention; Transformers; Datasets;

    Abstract : Models based on the transformer architecture have become some of the most prominent for solving a multitude of natural language processing (NLP) tasks since the architecture's introduction in 2017. However, much research on the transformer model has focused primarily on achieving high performance, and many problems remain unsolved.

  12. Investigating the Effect of Complementary Information Stored in Multiple Languages on Question Answering Performance : A Study of the Multilingual-T5 for Extractive Question Answering

    University essay from KTH/Skolan för elektroteknik och datavetenskap (EECS)

    Author : Björn Aurell Hansson; [2021]
    Keywords : Machine learning; Transformers; multilingual-T5; question answering; NLP; machine learning; transformer models; question formulation; natural language processing;

    Abstract : Extractive question answering is a popular domain in the field of natural language processing, where machine learning models are tasked with answering questions given a context. Historically, the field has centered on monolingual models, but recently an increasing number of multilingual models have been developed, such as Google’s MT5 [1].

  13. Bidirectional Encoder Representations from Transformers (BERT) for Question Answering in the Telecom Domain : Adapting a BERT-like language model to the telecom domain using the ELECTRA pre-training approach

    University essay from KTH/Skolan för elektroteknik och datavetenskap (EECS)

    Author : Henrik Holm; [2021]
    Keywords : Deep Learning; Natural Language Understanding; Transformers; Language Models; Representation Learning; Domain Adaptation; representation learning; deep learning; language technology; transformers; language models; domain adaptation;

    Abstract : The Natural Language Processing (NLP) research area has seen notable advancements in recent years, one being the ELECTRA model, which improves the sample efficiency of BERT pre-training by introducing a discriminative pre-training approach. Most publicly available language models are trained on general-domain datasets.

  14. Characterization of Radiomics Features Extracted from Images Generated by the 0.35 T Scanner of an Integrated MRI-Linac

    University essay from Lunds universitet/Sjukhusfysikerutbildningen

    Author : Rebecka Ericsson Szecsenyi; [2020]
    Keywords : Medicine and Health Sciences;

    Abstract : Purpose: In an era of personalized oncology, where the aim is to give every patient the right treatment at the right time, a promising area of research is emerging called radiomics, or quantitative image analysis. The main underlying hypothesis is that image texture contains pathophysiological information, not visible to the naked eye, that can improve diagnosis, support treatment adaptation, or be linked to a certain clinical outcome.

  15. Surmize: An Online NLP System for Close-Domain Question-Answering and Summarization

    University essay from Uppsala universitet/Institutionen för informationsteknologi

    Author : Alexander Bergkvist; Nils Hedberg; Sebastian Rollino; Markus Sagen; [2020]
    Keywords : Summary; Summarization; Abstractive Summarization; Extractive Summarization; ASUS; ESUS; Question Answering; Question-Answering; QA; QA Model; QA System; Natural Language Processing; NLP; Online NLP System; Machine Learning; ML; Deep Learning; DL; Close-Domain Question Answering; cdQA; Transformer Model; transformer; Transformer; BERT; Watson; Online QA; Online Summary; Online Summarization; Spacy; FastAPI; Surmize; Huggingface;

    Abstract : The amount of data available to and consumed by people globally is growing. To reduce mental fatigue and increase the general ability to gain insight into complex texts or documents, we have developed an application to aid in this task.