Essays about: "multilingual pre-trained language models"

Showing results 1–5 of 16 essays containing the words "multilingual pre-trained language models".

  1. Ensuring Brand Safety by Using Contextual Text Features: A Study of Text Classification with BERT

    University essay from Uppsala universitet/Institutionen för lingvistik och filologi

    Author: Lingqing Song; [2023]

    Abstract: When advertisements are placed on web pages, the context in which they are presented is important. For example, manufacturers of kitchen knives may not want their advertisement to appear in a news article about a knife-wielding murderer.
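
    As a rough sketch of the technique this thesis names (text classification with BERT), the snippet below scores page texts with a binary brand-safety head. The checkpoint, label scheme, and example texts are illustrative assumptions, and the classification head here is untrained; the thesis's actual data and setup may differ.

        # Hedged sketch: BERT as a contextual brand-safety classifier.
        # "bert-base-uncased", the two labels, and the texts are assumptions.
        import torch
        from transformers import AutoTokenizer, AutoModelForSequenceClassification

        tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
        model = AutoModelForSequenceClassification.from_pretrained(
            "bert-base-uncased", num_labels=2)  # e.g. 0 = brand-safe, 1 = unsafe

        texts = ["Five easy weeknight dinner recipes",
                 "Police report a stabbing in the city centre"]
        inputs = tokenizer(texts, padding=True, truncation=True,
                           return_tensors="pt")
        with torch.no_grad():
            probs = model(**inputs).logits.softmax(dim=-1)
        print(probs)  # class probabilities; meaningful only after fine-tuning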

  2. Monolingual and Cross-Lingual Survey Response Annotation

    University essay from Uppsala universitet/Institutionen för lingvistik och filologi

    Author: Yahui Zhao; [2023]
    Keywords: transfer learning; zero-shot cross-lingual transfer; model-based transfer; multilingual pre-trained language models; sequence labeling; open-ended questions; democracy;

    Abstract: Multilingual natural language processing (NLP) is increasingly recognized for its potential to process diverse types of text, including social media posts, reviews, and technical reports. Multilingual language models like mBERT and XLM-RoBERTa (XLM-R) play a pivotal role in multilingual NLP.
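
    To make the "model-based transfer" named in the keywords concrete: the sketch below loads XLM-R with a token-classification head, the standard setup for sequence labeling. After fine-tuning on annotated responses in one language, the same weights would be applied unchanged to another. The label count and the German example sentence are assumptions, not the thesis's data.

        # Hedged sketch: XLM-R for zero-shot cross-lingual sequence labeling.
        # num_labels=3 and the input sentence are illustrative assumptions.
        from transformers import AutoTokenizer, AutoModelForTokenClassification

        tokenizer = AutoTokenizer.from_pretrained("xlm-roberta-base")
        model = AutoModelForTokenClassification.from_pretrained(
            "xlm-roberta-base", num_labels=3)

        # One shared vocabulary across ~100 languages lets a model fine-tuned
        # on source-language labels score text in an unseen target language:
        batch = tokenizer(["Das ist eine Antwort auf eine offene Frage."],
                          return_tensors="pt")
        tag_ids = model(**batch).logits.argmax(dim=-1)  # one tag per subword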

  3. Exploring Cross-Lingual Transfer Learning for Swedish Named Entity Recognition: Fine-tuning of English and Multilingual Pre-trained Models

    University essay from KTH/Skolan för elektroteknik och datavetenskap (EECS)

    Author: Daniel Lai Wikström; Axel Sparr; [2023]
    Keywords: NER; named entity recognition; cross-lingual transfer; Transformer; BERT; deep learning;

    Abstract: Named Entity Recognition (NER) is a critical task in Natural Language Processing (NLP), and recent advancements in language model pre-training have significantly improved its performance. However, this improvement does not extend to smaller languages, which often lack large pre-training datasets or the computational budget to exploit them.
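
    For illustration, the evaluation side of this setup can be as short as the sketch below: a multilingual NER model fine-tuned on other languages is applied directly to Swedish text. The checkpoint is a publicly available Hub example chosen here for illustration, not necessarily a model used in the thesis.

        # Hedged sketch: cross-lingual NER transfer to Swedish.
        # The checkpoint name is an assumption (a public multilingual NER
        # model), not the thesis's fine-tuned model.
        from transformers import pipeline

        ner = pipeline("token-classification",
                       model="Davlan/bert-base-multilingual-cased-ner-hrl",
                       aggregation_strategy="simple")
        print(ner("Greta Thunberg talade i Stockholm i fredags."))
        # Expected: a PER span for "Greta Thunberg", a LOC span for "Stockholm".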

  4. Can Wizards be Polyglots: Towards a Multilingual Knowledge-grounded Dialogue System

    University essay from Uppsala universitet/Institutionen för lingvistik och filologi

    Author: Evelyn Kai Yan Liu; [2022]
    Keywords: Knowledge-grounded dialogue; Dialogue systems; Generative question answering; Multilingual question answering; Multilingual dialogue systems; Transfer learning; Multi-task learning; Sequential training; Conversational AI; Natural Language Processing (NLP); Deep learning; Machine learning;

    Abstract: Research on open-domain, knowledge-grounded dialogue systems has been advancing rapidly due to the paradigm shift introduced by large language models (LLMs). While these strides have improved the performance of dialogue systems, their scope remains mostly monolingual and English-centric.
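
    A minimal, hedged sketch of what "knowledge-grounded" means in practice: a retrieved knowledge snippet is prepended to the dialogue history, and a multilingual seq2seq model generates the next turn. The prompt format, the Swedish snippet, and the untuned mt5-small checkpoint are all assumptions; a real system would first be fine-tuned on dialogue data.

        # Hedged sketch: knowledge-grounded reply generation with mT5.
        # Prompt format and knowledge text are invented for illustration;
        # untuned mt5-small will not produce a sensible reply.
        from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

        tokenizer = AutoTokenizer.from_pretrained("google/mt5-small")
        model = AutoModelForSeq2SeqLM.from_pretrained("google/mt5-small")

        knowledge = "Uppsala är en stad i Sverige med cirka 180 000 invånare."
        history = "User: Hur stor är Uppsala?"
        inputs = tokenizer(f"knowledge: {knowledge} dialogue: {history}",
                           return_tensors="pt")
        reply_ids = model.generate(**inputs, max_new_tokens=40)
        print(tokenizer.decode(reply_ids[0], skip_special_tokens=True))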

  5. Task-agnostic knowledge distillation of mBERT to Swedish

    University essay from KTH/Skolan för elektroteknik och datavetenskap (EECS)

    Author: Added Kina; [2022]
    Keywords: Natural Language Processing; Transformers; Knowledge Distillation; BERT; Multilingual Models; Cross-Lingual Transfer;

    Abstract: Large transformer models have shown strong performance on many natural language processing tasks. However, slow inference, a strong dependency on powerful hardware, and high energy consumption limit their availability.
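
    "Task-agnostic" here means distilling at the pre-training stage (masked language modeling) rather than on a downstream task. Below is a minimal sketch, assuming the DistilBERT-style recipe of matching the teacher's output distribution with a temperature-scaled KL loss; the student size, temperature, and Swedish sample are illustrative, not the thesis's exact configuration.

        # Hedged sketch: task-agnostic distillation of mBERT (teacher) into a
        # smaller student via KL divergence over masked-LM logits.
        import torch
        import torch.nn.functional as F
        from transformers import (AutoTokenizer, AutoModelForMaskedLM,
                                  BertConfig, BertForMaskedLM)

        tokenizer = AutoTokenizer.from_pretrained("bert-base-multilingual-cased")
        teacher = AutoModelForMaskedLM.from_pretrained("bert-base-multilingual-cased")
        teacher.eval()

        # A 6-layer student that keeps the teacher's vocabulary (assumed size).
        student = BertForMaskedLM(BertConfig(
            vocab_size=teacher.config.vocab_size,
            num_hidden_layers=6, hidden_size=768, num_attention_heads=12))

        batch = tokenizer(["Stockholm är Sveriges [MASK]."], return_tensors="pt")
        with torch.no_grad():
            t_logits = teacher(**batch).logits  # teacher stays frozen
        s_logits = student(**batch).logits

        T = 2.0  # softening temperature (assumed value)
        loss = F.kl_div(F.log_softmax(s_logits / T, dim=-1),
                        F.softmax(t_logits / T, dim=-1),
                        reduction="batchmean") * T * T
        loss.backward()  # gradients update only the student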