Essays about: "Multilingual Transformer"

Showing results 1 - 5 of 16 essays containing the words Multilingual Transformer.

  1. Exploring Cross-Lingual Transfer Learning for Swedish Named Entity Recognition : Fine-tuning of English and Multilingual Pre-trained Models

    University essay from KTH/Skolan för elektroteknik och datavetenskap (EECS)

    Author : Daniel Lai Wikström; Axel Sparr; [2023]
    Keywords : NER; Named Entity Recognition; Cross-lingual transfer; Transformer; BERT; Deep Learning;

    Abstract : Named Entity Recognition (NER) is a critical task in Natural Language Processing (NLP), and recent advancements in language model pre-training have significantly improved its performance. However, this improvement is not universally applicable due to a lack of large pre-training datasets or computational budget for smaller languages. READ MORE
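    Fine-tuned NER models of the kind this abstract describes are conventionally scored with entity-level F1, where a predicted span counts only if both its boundaries and its type match the gold annotation. A minimal pure-Python sketch of that metric over BIO tag sequences (function names and the tolerance for orphan I- tags are illustrative choices, not taken from the thesis):

    ```python
    def extract_entities(tags):
        """Collect (start, end, type) spans from a BIO tag sequence."""
        entities, start, etype = [], None, None
        for i, tag in enumerate(tags + ["O"]):  # "O" sentinel flushes the last span
            if tag.startswith("B-") or tag == "O" or (tag.startswith("I-") and tag[2:] != etype):
                if start is not None:
                    entities.append((start, i, etype))
                    start, etype = None, None
            if tag.startswith("B-"):
                start, etype = i, tag[2:]
            elif tag.startswith("I-") and start is None:
                start, etype = i, tag[2:]  # tolerate I- without B- (common in noisy output)
        return entities

    def entity_f1(gold, pred):
        """Micro entity-level F1: a span scores only on exact boundary + type match."""
        g, p = set(extract_entities(gold)), set(extract_entities(pred))
        tp = len(g & p)
        precision = tp / len(p) if p else 0.0
        recall = tp / len(g) if g else 0.0
        return 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    ```

    This strict span matching is why NER scores drop sharply under cross-lingual transfer: a model that finds the entity but clips one token gets no credit.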

  2. Multilingual Transformer Models for Maltese Named Entity Recognition

    University essay from Uppsala universitet/Institutionen för lingvistik och filologi

    Author : Kris Farrugia; [2022]
    Keywords : low-resource; named-entity; information extraction; Maltese;

    Abstract : The recently developed state-of-the-art models for Named Entity Recognition are heavily dependent upon huge amounts of available annotated data. Consequently, it is extremely challenging for data-scarce languages to obtain significant results. READ MORE

  3. Distilling Multilingual Transformer Models for Efficient Document Retrieval : Distilling multi-Transformer models with distillation losses involving multi-Transformer interactions

    University essay from KTH/Skolan för elektroteknik och datavetenskap (EECS)

    Author : Xuecong Liu; [2022]
    Keywords : Dense Passage Retrieval; Knowledge Distillation; Multilingual Transformer; Document Retrieval; Open Domain Question Answering;

    Abstract : Open Domain Question Answering (OpenQA) is a task concerning automatically finding answers to a query from a given set of documents. Language-agnostic OpenQA is an increasingly important research area in the globalised world, where the answers can be in a different language from the question. READ MORE
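    Knowledge distillation of the kind this abstract mentions is typically trained by minimizing a temperature-softened KL divergence between teacher and student output distributions. A minimal pure-Python sketch (the temperature value and the T² scaling convention are standard choices, but not details taken from the thesis):

    ```python
    import math

    def softmax(logits, temperature=1.0):
        """Temperature-scaled softmax; higher T flattens the distribution."""
        scaled = [z / temperature for z in logits]
        m = max(scaled)  # subtract max for numerical stability
        exps = [math.exp(z - m) for z in scaled]
        total = sum(exps)
        return [e / total for e in exps]

    def distillation_loss(teacher_logits, student_logits, temperature=2.0):
        """KL(teacher || student) on softened distributions, scaled by T^2
        (the usual correction so gradients stay comparable to the hard loss)."""
        p = softmax(teacher_logits, temperature)
        q = softmax(student_logits, temperature)
        kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
        return temperature ** 2 * kl
    ```

    The loss is zero when the student reproduces the teacher exactly and grows as their distributions diverge, which is what lets a small retrieval model inherit a large multilingual teacher's rankings.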

  4. Analysis of Syntactic Behaviour of Neural Network Models by Using Gradient-Based Saliency Method : Comparative Study of Chinese and English BERT, Multilingual BERT and RoBERTa

    University essay from Uppsala universitet/Institutionen för lingvistik och filologi

    Author : Jiayi Zhang; [2022]
    Keywords : neural network models; gradient-based saliency; BERT; mBERT; RoBERTa;

    Abstract : Neural network models such as Transformer-based BERT, mBERT and RoBERTa are achieving impressive performance (Devlin et al., 2019; Lewis et al., 2020; Liu et al., 2019; Raffel et al. READ MORE
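    Gradient-based saliency, the method named in this title, scores each input feature by the magnitude of the model output's gradient with respect to it. A toy pure-Python sketch using a linear scorer and central finite differences in place of autograd (the model and all values are hypothetical; real analyses backpropagate through a Transformer's embeddings):

    ```python
    def score(x, w):
        """Toy differentiable model: a linear scorer."""
        return sum(xi * wi for xi, wi in zip(x, w))

    def saliency(x, w, eps=1e-6):
        """Gradient-based saliency: |d score / d x_i|, estimated by central
        finite differences so the sketch needs no autograd library."""
        sal = []
        for i in range(len(x)):
            xp = list(x); xp[i] += eps
            xm = list(x); xm[i] -= eps
            grad = (score(xp, w) - score(xm, w)) / (2 * eps)
            sal.append(abs(grad))
        return sal
    ```

    For the linear scorer the saliency of feature i is exactly |w_i|, which makes the finite-difference estimate easy to sanity-check before applying the same idea to token embeddings in BERT-style models.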

  5. Unsupervised multilingual distractor generation for fill-in-the-blank questions

    University essay from Uppsala universitet/Institutionen för lingvistik och filologi

    Author : Zhe Han; [2022]
    Keywords : Multilingual; Distractor; BERT;

    Abstract : Fill-in-the-blank multiple choice questions (MCQs) play an important role in the educational field, but manually generating them is resource-intensive, so their automatic generation has gradually become an attractive NLP task. Within this task, question creation itself has become a mainstream NLP research topic, while distractor (wrong alternative) generation (DG) still remains out of the spotlight. READ MORE
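    One common unsupervised approach to distractor generation is to rank vocabulary candidates by embedding similarity to the correct answer: a good distractor is semantically close enough to be plausible but is not the answer itself. A minimal pure-Python sketch with toy two-dimensional embeddings (the vectors, vocabulary, and ranking rule are illustrative assumptions, not the thesis's method):

    ```python
    import math

    def cosine(u, v):
        """Cosine similarity between two vectors."""
        dot = sum(a * b for a, b in zip(u, v))
        nu = math.sqrt(sum(a * a for a in u))
        nv = math.sqrt(sum(b * b for b in v))
        return dot / (nu * nv)

    def pick_distractors(answer, vocab_embeddings, k=3):
        """Rank candidate words by embedding similarity to the correct answer
        and return the top-k as distractors (plausible but wrong alternatives)."""
        target = vocab_embeddings[answer]
        candidates = [(w, cosine(v, target))
                      for w, v in vocab_embeddings.items() if w != answer]
        candidates.sort(key=lambda wc: -wc[1])
        return [w for w, _ in candidates[:k]]
    ```

    In practice the embeddings would come from a multilingual model such as mBERT, and further filters (part-of-speech agreement, frequency matching) keep the distractors grammatical in the blank.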