Essays about: "flerspråkiga transformatorer" (Swedish for "multilingual transformers")

Found 5 essays containing the words flerspråkiga transformatorer.

  1. Distilling Multilingual Transformer Models for Efficient Document Retrieval : Distilling multi-Transformer models with distillation losses involving multi-Transformer interactions

    University essay from KTH/Skolan för elektroteknik och datavetenskap (EECS)

    Author : Xuecong Liu; [2022]
    Keywords : Dense Passage Retrieval; Knowledge Distillation; Multilingual Transformer; Document Retrieval; Open Domain Question Answering; Tät textavsnittssökning; kunskapsdestillering; flerspråkiga transformatorer; dokumentsökning; domänlöst frågebesvarande;

    Abstract : Open Domain Question Answering (OpenQA) is a task concerning automatically finding answers to a query from a given set of documents. Language-agnostic OpenQA is an increasingly important research area in the globalised world, where the answers can be in a different language from the question. READ MORE
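The dense document retrieval setting this thesis builds on can be illustrated with a toy example: the question and each passage are encoded as vectors, and passages are ranked by cosine similarity to the question. The embedding values below are made up for illustration; a real system would obtain them from a multilingual transformer encoder.

```python
import math

def cosine(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

def retrieve(question_vec, passage_vecs, top_k=1):
    """Rank passage indices by dense similarity to the question embedding."""
    ranked = sorted(range(len(passage_vecs)),
                    key=lambda i: cosine(question_vec, passage_vecs[i]),
                    reverse=True)
    return ranked[:top_k]

# Toy 3-dimensional "embeddings" (illustrative values only).
q = [1.0, 0.0, 0.5]
passages = [[0.9, 0.1, 0.4],   # close to q
            [-1.0, 0.2, 0.0],  # points away from q
            [0.0, 1.0, 0.0]]   # orthogonal to q's main direction
print(retrieve(q, passages))  # [0]
```

Because only embeddings are compared, the question and passages can be in different languages as long as the encoder maps them into a shared space.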

  2. Task-agnostic knowledge distillation of mBERT to Swedish

    University essay from KTH/Skolan för elektroteknik och datavetenskap (EECS)

    Author : Added Kina; [2022]
    Keywords : Natural Language Processing; Transformers; Knowledge Distillation; BERT; Multilingual Models; Cross-Lingual Transfer; Naturlig bearbetning av språk; Transformatorer; Kunskapsdestillation; BERT; Flerspråkiga modeller; Tvärspråklig inlärningsöverföring;

    Abstract : Large transformer models have shown great performance in multiple natural language processing tasks. However, slow inference, strong dependency on powerful hardware, and large energy consumption limit their availability. READ MORE
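The knowledge distillation named in this and several other abstracts trains a small student to match a large teacher's softened output distribution. A minimal sketch of the classic soft-target loss (the temperature value is an arbitrary illustrative choice, not taken from the thesis):

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax over a list of logits."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    """KL divergence from softened teacher to softened student.

    The T^2 factor keeps gradient magnitudes comparable across
    temperatures, as in standard soft-target distillation.
    """
    p = softmax(teacher_logits, temperature)  # teacher soft targets
    q = softmax(student_logits, temperature)  # student predictions
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
    return temperature ** 2 * kl

# Identical logits give zero loss; diverging logits give a positive loss.
print(distillation_loss([3.0, 1.0, 0.2], [3.0, 1.0, 0.2]))      # 0.0
print(distillation_loss([3.0, 1.0, 0.2], [0.2, 1.0, 3.0]) > 0)  # True
```

"Task-agnostic" distillation, as in this thesis, applies such losses to internal representations rather than task logits, so the student can later be fine-tuned on any downstream task.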

  3. Techniques for Multilingual Document Retrieval for Open-Domain Question Answering : Using hard negatives filtering, binary retrieval and data augmentation

    University essay from KTH/Skolan för elektroteknik och datavetenskap (EECS)

    Author : Carlos Lago Solas; [2022]
    Keywords : OpenQA; Multilingual Transformers; Document retrieval; Data augmentation; OpenQA; Flerspråkiga Transformatorer; Dokumenthämtning; Dataförstärkning;

    Abstract : Open Domain Question Answering (OpenQA) systems find an answer to a question from a large collection of unstructured documents. In this information era, we have an immense amount of data at our disposal. However, filtering all the content and trying to find the answers to our questions can be too time-consuming and difficult. READ MORE
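Hard-negatives filtering, one of the techniques named in the title, can be sketched as: take the retriever's top-scoring passages that are not known positives and use them as training negatives. These look relevant to the model but contain no answer, which makes them informative examples. Scores and indices below are illustrative:

```python
def mine_hard_negatives(scores, positive_ids, k=2):
    """Return the k top-scoring passage indices that are NOT known positives.

    scores       -- retriever similarity score per passage
    positive_ids -- set of indices of passages known to contain the answer
    """
    ranked = sorted(range(len(scores)), key=lambda i: scores[i], reverse=True)
    return [i for i in ranked if i not in positive_ids][:k]

# Retriever scores for 5 passages; passage 1 holds the gold answer.
scores = [0.2, 0.9, 0.7, 0.1, 0.6]
print(mine_hard_negatives(scores, positive_ids={1}))  # [2, 4]
```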

  4. QPLaBSE: Quantized and Pruned Language-Agnostic BERT Sentence Embedding Model : Production-ready compression for multilingual transformers

    University essay from KTH/Skolan för elektroteknik och datavetenskap (EECS)

    Author : Sarthak Langde; [2021]
    Keywords : Transformers; LaBSE; Quantization; Pruning; PyTorch; TensorFlow; ONNX; Transformatorer; LaBSE; Kvantisering; Beskärning; PyTorch; TensorFlow; ONNX;

    Abstract : Transformer models perform well on Natural Language Processing and Natural Language Understanding tasks. Training and fine-tuning of these models consume a large amount of data and computing resources. Fast inference also requires high-end hardware for user-facing products. READ MORE
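Quantization, one of the two compression techniques in this title, maps float weights to low-bit integers plus a scale factor, shrinking the model and speeding up inference at a small accuracy cost. A minimal symmetric int8 sketch with illustrative weight values (a production pipeline would use the PyTorch, TensorFlow, or ONNX tooling named in the keywords):

```python
def quantize_int8(weights):
    """Symmetric post-training int8 quantization of a list of float weights."""
    scale = max(abs(w) for w in weights) / 127.0  # map the largest |w| to 127
    q = [round(w / scale) for w in weights]
    return q, scale

def dequantize(q, scale):
    """Recover approximate float weights from int8 values and the scale."""
    return [qi * scale for qi in q]

weights = [0.5, -1.27, 0.003, 0.9]
q, scale = quantize_int8(weights)
restored = dequantize(q, scale)
max_err = max(abs(w, ) - abs(r) if False else abs(w - r)
              for w, r in zip(weights, restored))
# Rounding error is bounded by half a quantization step.
print(max_err <= scale / 2 + 1e-12)  # True
```

Pruning, the other technique, instead zeroes out low-magnitude weights; the two are complementary and are often combined, as in this thesis.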

  5. DistillaBSE: Task-agnostic distillation of multilingual sentence embeddings : Exploring deep self-attention distillation with switch transformers

    University essay from KTH/Skolan för elektroteknik och datavetenskap (EECS)

    Author : Boris Bubla; [2021]
    Keywords : Transformers; Knowledge Distillation; Language Agnostic BERT Sentence Embeddings; Natural Language Processing; Switch Transformers; Transformatorer; kunskapsdestillation; språkagnostisk inbäddning av BERT-mening; naturlig bearbetning av språk; switchtransformatorer;

    Abstract : The recent development of massive multilingual transformer networks has resulted in drastic improvements in model performance. These models, however, are so large they suffer from large inference latency and consume vast computing resources. Such features hinder widespread adoption of the models in industry and some academic settings. READ MORE