Essays about: "Multilingual question answering"

Showing results 1-5 of 8 essays containing the words "Multilingual question answering".

  1. Distilling Multilingual Transformer Models for Efficient Document Retrieval : Distilling multi-Transformer models with distillation losses involving multi-Transformer interactions

    University essay from KTH/Skolan för elektroteknik och datavetenskap (EECS)

    Author : Xuecong Liu; [2022]
    Keywords : Dense Passage Retrieval; Knowledge Distillation; Multilingual Transformer; Document Retrieval; Open Domain Question Answering;

    Abstract : Open Domain Question Answering (OpenQA) is the task of automatically finding answers to a query in a given set of documents. Language-agnostic OpenQA, where the answers can be in a different language from the question, is an increasingly important research area in the globalised world.

  2. Can Wizards be Polyglots: Towards a Multilingual Knowledge-grounded Dialogue System

    University essay from Uppsala universitet/Institutionen för lingvistik och filologi

    Author : Evelyn Kai Yan Liu; [2022]
    Keywords : Knowledge-grounded dialogue; Dialogue systems; Generative question answering; Multilingual question answering; Multilingual dialogue systems; Transfer learning; Multi-task learning; Sequential training; Conversational AI; Natural Language Processing NLP ; Deep learning; Machine learning;

    Abstract : Research on open-domain, knowledge-grounded dialogue systems has been advancing rapidly due to the paradigm shift introduced by large language models (LLMs). While these strides have improved the performance of dialogue systems, the scope remains mostly monolingual and English-centric.

  3. Techniques for Multilingual Document Retrieval for Open-Domain Question Answering : Using hard negatives filtering, binary retrieval and data augmentation

    University essay from KTH/Skolan för elektroteknik och datavetenskap (EECS)

    Author : Carlos Lago Solas; [2022]
    Keywords : OpenQA; Multilingual Transformers; Document retrieval; Data augmentation;

    Abstract : Open Domain Question Answering (OpenQA) systems find an answer to a question from a large collection of unstructured documents. In this information era, we have an immense amount of data at our disposal. However, filtering all the content and trying to find the answers to our questions can be too time-consuming and difficult.

  4. Low-resource Language Question Answering System with BERT

    University essay from Mittuniversitetet/Institutionen för informationssystem och –teknologi

    Author : Herman Jansson; [2021]
    Keywords : BERT; Question Answering system; Reading Comprehension; Low resource language; SQuADv2;

    Abstract : The complexity of staying at the forefront of information retrieval systems is constantly increasing. BERT, a recent natural language processing technology, has reached superhuman performance on reading comprehension tasks in high-resource languages.

  5. Large-Context Question Answering with Cross-Lingual Transfer

    University essay from Uppsala universitet/Institutionen för informationsteknologi

    Author : Markus Sagen; [2021]
    Keywords : Long-Context Multilingual Model; Longformer XLM-R; Longformer; Long-term Context; Extending Context; Extend Context; Large-Context; Long-Context; Large Context; Long Context; Cross-Lingual; Multi-Lingual; Cross Lingual; Multi Lingual; QA; Question-Answering; Question Answering; Transformer model; Machine Learning; Transfer Learning; SQuAD; Memory; Efficient; Monolingual; Multilingual; QA model; Language Model; Huggingface; BERT; RoBERTa; XLM-R; mBERT; Multilingual BERT; Efficient Transformers; Reformer; Linformer; Performer; Transformer-XL; Wikitext-103; TriviaQA; HotpotQA; WikiHopQA; VINNOVA; Peltarion; AI; LM; MLM; Deep Learning; Natural Language Processing; NLP; Attention; Transformers; Datasets;

    Abstract : Models based on the transformer architecture have become some of the most prominent for solving a multitude of natural language processing (NLP) tasks since the architecture's introduction in 2017. However, much research related to the transformer model has focused primarily on achieving high performance, and many problems remain unsolved.