Essays about: "low-resource language"

Showing results 1 - 5 of 14 essays containing the words "low-resource language".


    University essay from Göteborgs universitet / Institutionen för filosofi, lingvistik och vetenskapsteori

    Author : Liliia Makashova; [2021-09-23]
    Keywords : Speech synthesis; automatic speech recognition; low-resource language; machine learning; transfer learning;

    Abstract : Speech synthesis (text-to-speech, TTS) and speech recognition (automatic speech recognition, ASR) are the NLP technologies that are the least available for low-resource and indigenous languages. Lack of computational and data resources is the major obstacle when it comes to the development of linguistic tools for these languages.

  2. Low-resource Language Question Answering System with BERT

    University essay from Mittuniversitetet/Institutionen för informationssystem och –teknologi

    Author : Herman Jansson; [2021]
    Keywords : BERT; Question Answering system; Reading Comprehension; Low resource language; SQuADv2;

    Abstract : The complexity of staying at the forefront of information retrieval systems is constantly increasing. A recent natural language processing technology called BERT has reached superhuman performance in high-resource languages on reading comprehension tasks.

  3. Extractive Text Summarization of Norwegian News Articles Using BERT

    University essay from Linköpings universitet/Medie- och Informationsteknik; Linköpings universitet/Tekniska fakulteten

    Author : Thomas Indrias Biniam; Adam Morén; [2021]
    Keywords : extractive text summarization; NLP; deep learning; BERT; BERTSum; Multilingual BERT; Norwegian BERT; transformer; Norwegian; news articles;

    Abstract : Extractive text summarization has over the years been an important research area in Natural Language Processing. Numerous methods have been proposed for extracting information from text documents. Recent works have shown great success for English summarization tasks by fine-tuning the language model BERT using large summarization datasets.

  4. Automatic Speech Recognition for low-resource languages using Wav2Vec2 : Modern Standard Arabic (MSA) as an example of a low-resource language

    University essay from Högskolan Dalarna/Institutionen för information och teknik

    Author : Taha Zouhair; [2021]
    Keywords : Automatic Speech Recognition; Facebook Wav2Vec; Mozilla Common Voice; Low-Resource Language;

    Abstract : The need for fully automatic translation at DigitalTolk, a Stockholm-based company providing translation services, leads to exploring Automatic Speech Recognition as a first step for Modern Standard Arabic (MSA). Facebook AI recently released a second version of its Wav2Vec models, dubbed Wav2Vec 2.0.

  5. Large-Context Question Answering with Cross-Lingual Transfer

    University essay from Uppsala universitet/Institutionen för informationsteknologi

    Author : Markus Sagen; [2021]
    Keywords : Long-Context Multilingual Model; Longformer XLM-R; Longformer; Long-term Context; Extending Context; Extend Context; Large-Context; Long-Context; Large Context; Long Context; Cross-Lingual; Multi-Lingual; Cross Lingual; Multi Lingual; QA; Question-Answering; Question Answering; Transformer model; Machine Learning; Transfer Learning; SQuAD; Memory; Efficient; Monolingual; Multilingual; QA model; Language Model; Huggingface; BERT; RoBERTa; XLM-R; mBERT; Multilingual BERT; Efficient Transformers; Reformer; Linformer; Performer; Transformer-XL; Wikitext-103; TriviaQA; HotpotQA; WikiHopQA; VINNOVA; Peltarion; AI; LM; MLM; Deep Learning; Natural Language Processing; NLP; Attention; Transformers; Datasets;

    Abstract : Models based on the transformer architecture have become among the most prominent for solving a multitude of natural language processing (NLP) tasks since the architecture's introduction in 2017. However, much research related to the transformer model has focused primarily on achieving high performance, and many problems remain unsolved.