Showing results 1-5 of 24 essays matching the above criteria.
1. Understanding the Robustness of Self-Supervised Representations
University essay from Luleå tekniska universitet/Institutionen för system- och rymdteknik
Abstract: This work investigates the robustness of the learned representations of self-supervised learning approaches, focusing on distribution shifts in computer vision. Joint embedding architecture and method-based self-supervised learning approaches have shown advances in learning representations in a label-free manner and in efficient knowledge transfer toward reducing human annotation needs.
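The joint-embedding idea this abstract refers to can be illustrated with a minimal sketch: embed two augmented views of the same inputs and maximise their agreement. This is only a schematic NumPy version under assumed names; the plain cosine objective stands in for the thesis's actual method, which would involve full encoders and, in SimSiam-style setups, a predictor and stop-gradient:

```python
import numpy as np

def cosine_sim(a, b):
    # Row-wise cosine similarity between two batches of embeddings.
    a = a / np.linalg.norm(a, axis=1, keepdims=True)
    b = b / np.linalg.norm(b, axis=1, keepdims=True)
    return np.sum(a * b, axis=1)

def joint_embedding_loss(z1, z2):
    # Illustrative joint-embedding objective: negative mean cosine
    # similarity between embeddings of two augmented views of the
    # same inputs. Minimising it pulls the views together.
    return -float(np.mean(cosine_sim(z1, z2)))
```

If the two views embed identically, the loss reaches its minimum of -1; distribution shift at test time is then a question of how stable these embeddings remain under unseen corruptions.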
2. Knowledge distillation for anomaly detection
University essay from Uppsala universitet/Institutionen för informationsteknologi
Abstract: The implementation of systems and methodologies for time series anomaly detection holds the potential to provide timely detection of faults and issues in a wide variety of technical systems. Ideally, such systems can identify deviations from normal behavior even before any problems manifest, thus enabling proactive maintenance.
3. Efficient Sentiment Analysis and Topic Modeling in NLP using Knowledge Distillation and Transfer Learning
University essay from KTH/Skolan för elektroteknik och datavetenskap (EECS)
Abstract: This study applied knowledge distillation techniques to a Large Language Model (LLM) to create smaller, more efficient models without sacrificing performance. Three configurations of the RoBERTa model were selected as ”student” models to gain knowledge from a pre-trained ”teacher” model.
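The teacher/student transfer this abstract describes is commonly trained with a Hinton-style distillation loss: KL divergence between temperature-softened teacher and student output distributions. A minimal NumPy sketch (the function names and the temperature value are illustrative assumptions, not taken from the thesis):

```python
import numpy as np

def softmax(logits, T=1.0):
    # Numerically stable softmax with temperature T.
    z = logits / T
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, T=2.0):
    # KL(teacher || student) on temperature-softened distributions,
    # scaled by T^2 so gradients keep a comparable magnitude as T varies.
    p = softmax(teacher_logits, T)
    q = softmax(student_logits, T)
    return float(np.mean(np.sum(p * (np.log(p) - np.log(q)), axis=-1)) * T**2)
```

The loss is zero when the student exactly matches the teacher's softened distribution and positive otherwise; in practice it is usually mixed with the ordinary cross-entropy on hard labels.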
4. Enhancing Neural Network Accuracy on Long-Tailed Datasets through Curriculum Learning and Data Sorting
University essay from KTH/Matematik (Avd.)
Abstract: This paper investigates the use of Curriculum Learning as an approach to address accuracy issues in a neural network caused by training on a long-tailed dataset. The thesis problem was posed by a Swedish e-commerce company.
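The "Curriculum Learning and Data Sorting" idea in the title can be sketched as ordering samples from easy to hard by some difficulty score and growing the training pool in stages. A toy version, where the staging scheme and all names are illustrative assumptions rather than the thesis's actual method:

```python
def curriculum_batches(samples, difficulty, n_stages=3):
    # Sort samples easiest-first by a scalar difficulty score
    # (in practice e.g. the loss under a reference model, or class
    # rarity for a long-tailed dataset).
    ordered = sorted(samples, key=difficulty)
    # Yield a growing prefix of the sorted data: early training stages
    # see only easy samples, later stages see the full dataset.
    for stage in range(1, n_stages + 1):
        k = max(1, len(ordered) * stage // n_stages)
        yield ordered[:k]
```

For a long-tailed dataset, a natural difficulty proxy is inverse class frequency, so head classes are learned first and tail classes are introduced gradually.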
5. Distilling Multilingual Transformer Models for Efficient Document Retrieval : Distilling multi-Transformer models with distillation losses involving multi-Transformer interactions
University essay from KTH/Skolan för elektroteknik och datavetenskap (EECS)
Abstract: Open Domain Question Answering (OpenQA) is the task of automatically finding answers to a query in a given set of documents. Language-agnostic OpenQA, where the answers can be in a different language from the question, is an increasingly important research area in the globalised world.