Essays about: "kunskapsdestillation" (Swedish for "knowledge distillation")

Showing results 1-5 of 7 essays containing the word kunskapsdestillation.

  1. Efficient Sentiment Analysis and Topic Modeling in NLP using Knowledge Distillation and Transfer Learning

    University essay from KTH/Skolan för elektroteknik och datavetenskap (EECS)

    Author : George Malki; [2023]
    Keywords : Large Language Model; RoBERTa; Knowledge distillation; Transfer learning; Sentiment analysis; Topic modeling; Stor språkmodell; RoBERTa; Kunskapsdestillation; överföringsinlärning; Sentimentanalys; Ämnesmodellering;

    Abstract : This thesis presents a study in which knowledge distillation techniques were applied to a Large Language Model (LLM) to create smaller, more efficient models without sacrificing performance. Three configurations of the RoBERTa model were selected as "student" models to learn from a pre-trained "teacher" model. READ MORE
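The teacher-student setup this abstract describes is not detailed further in the listing, but the standard distillation objective (Hinton et al., 2015) can be sketched generically. The function names and the temperature value below are illustrative assumptions, not code from the thesis:

```python
import math

def softmax(logits, T=1.0):
    # Temperature-scaled softmax: higher T gives a softer distribution,
    # exposing the teacher's relative preferences among wrong classes.
    exps = [math.exp(z / T) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, T=2.0):
    # KL divergence from the softened teacher distribution to the
    # softened student distribution, scaled by T^2 so gradient
    # magnitudes stay comparable across temperatures.
    p = softmax(teacher_logits, T)  # teacher "soft targets"
    q = softmax(student_logits, T)  # student predictions
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
    return T * T * kl
```

In practice this soft-target loss is combined with the ordinary cross-entropy on the true labels; the loss is zero when the student's logits match the teacher's and grows as they diverge.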

  2. Knowledge Distillation of DNABERT for Prediction of Genomic Elements

    University essay from KTH/Skolan för kemi, bioteknologi och hälsa (CBH)

    Author : Joana Palés Huix; [2022]
    Keywords : Knowledge distillation; Transformers; BERT; Genomics; Promoter identification; Explainability;

    Abstract : Understanding the information encoded in the human genome and the influence of each part of the DNA sequence is a fundamental problem, and solving it may be key to unveiling the mechanisms of common diseases. With the latest technological developments in genomics, many research institutes now have the tools to collect massive amounts of genomic data. READ MORE

  3. Task-agnostic knowledge distillation of mBERT to Swedish

    University essay from KTH/Skolan för elektroteknik och datavetenskap (EECS)

    Author : Added Kina; [2022]
    Keywords : Natural Language Processing; Transformers; Knowledge Distillation; BERT; Multilingual Models; Cross-Lingual Transfer; Naturlig bearbetning av språk; Transformatorer; Kunskapsdestillation; BERT; Flerspråkiga modeller; Tvärspråklig inlärningsöverföring;

    Abstract : Large transformer models have shown great performance in multiple natural language processing tasks. However, slow inference, dependence on powerful hardware, and high energy consumption limit their availability. READ MORE

  4. Exploration of Knowledge Distillation Methods on Transformer Language Models for Sentiment Analysis

    University essay from KTH/Skolan för elektroteknik och datavetenskap (EECS)

    Author : Haonan Liu; [2022]
    Keywords : Natural Language Processing; Sentiment Analysis; Language Model; Transformers; Knowledge Distillation; Behandling av Naturligt Språk; Analys av Känslor; Språkmodell; Omvandlare; Kunskapsdestillation;

    Abstract : Despite the outstanding performance of large Transformer-based language models, compressing them for deployment in industrial environments remains a challenge. This degree project explores a model compression method called knowledge distillation, applied to Transformer models on a sentiment classification task. READ MORE

  5. Knowledge Distillation for Semantic Segmentation and Autonomous Driving: A study on the influence of hyperparameters, initialization of a student network and the distillation method on the semantic segmentation of urban scenes.

    University essay from KTH/Skolan för elektroteknik och datavetenskap (EECS)

    Author : Juan Sanchez Nieto; [2022]
    Keywords : Knowledge Distillation; Autonomous Driving; Semantic Segmentation; Cityscapes; Kunskapsdestillation; Autonom Körning; Semantisk Segmentering; Stadslandskap;

    Abstract : Reducing the size of a neural network while maintaining comparable performance is an important problem, since the resource constraints of small devices make it impossible to deploy large models in many real-life scenarios. A prominent example is autonomous driving, where computer vision tasks such as object detection and semantic segmentation must be performed in real time on mobile devices. READ MORE