Essays about: "Självuppmärksamhet" (Swedish for "self-attention")

Found 4 essays containing the word Självuppmärksamhet.
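
All four theses build on the self-attention mechanism ("självuppmärksamhet") of the Transformer architecture (Vaswani et al., 2017). As a reference point, here is a minimal PyTorch sketch of scaled dot-product self-attention; the learned query/key/value projections are omitted for brevity, and this code is not taken from any of the theses below:

    import torch
    import torch.nn.functional as F

    def self_attention(x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, d_model). Queries, keys and values all come
        # from the same input sequence, hence *self*-attention.
        d = x.size(-1)
        scores = x @ x.transpose(-2, -1) / d ** 0.5   # (batch, seq, seq)
        weights = F.softmax(scores, dim=-1)           # each row sums to 1
        return weights @ x                            # weighted sum of values

    out = self_attention(torch.randn(2, 5, 16))
    print(out.shape)   # torch.Size([2, 5, 16])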

  1. Using Machine Learning to Optimize Near-Earth Object Sighting Data at the Golden Ears Observatory

    University essay from KTH/Skolan för elektroteknik och datavetenskap (EECS)

    Author : Laura Murphy; [2023]
    Keywords : Near-Earth Object Detection; Machine Learning; Deep Learning; Visual Transformers;

    Abstract : This research project focuses on improving Near-Earth Object (NEO) detection using advanced machine learning techniques, particularly Vision Transformers (ViTs). The study addresses challenges such as noise, limited data, and class imbalance.
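
    The thesis's training pipeline is not given in this excerpt; as an illustration of one standard remedy for the class imbalance the abstract mentions, the sketch below weights the loss by inverse class frequency in PyTorch. The class counts are made up:

        import torch
        import torch.nn as nn

        # Hypothetical counts: NEO sightings are far rarer than background objects.
        counts = torch.tensor([9500.0, 500.0])            # [background, NEO]
        weights = counts.sum() / (len(counts) * counts)   # inverse-frequency weights
        loss_fn = nn.CrossEntropyLoss(weight=weights)     # rare class counts more

        logits = torch.randn(8, 2)             # e.g. from a ViT classification head
        labels = torch.randint(0, 2, (8,))
        loss = loss_fn(logits, labels)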

  2. Evaluating Text Summarization Models on Resumes : Investigating the Quality of Generated Resume Summaries and their Suitability as Resume Introductions

    University essay from KTH/Skolan för elektroteknik och datavetenskap (EECS)

    Author : Amanda Krohn; [2023]
    Keywords : Natural language processing; Abstractive text summarization; Transformer architecture; Fine-tuning; Resumes; Språkteknologi; Abstrakt textsammanfattning; Transformer-arkitektur; Finjustering; CV;

    Abstract : This thesis aims to evaluate different abstractive text summarization models and techniques for summarizing resumes. It has two main objectives: investigate the models’ performance on resume summarization and assess the suitability of the generated summaries as resume introductions.
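
    Which checkpoints the thesis evaluated is not stated in this excerpt; as a minimal sketch of abstractive summarization with the Hugging Face transformers library, where both the model choice and the resume text are illustrative:

        from transformers import pipeline

        # "t5-small" is an illustrative choice, not necessarily a model the thesis used.
        summarizer = pipeline("summarization", model="t5-small")

        resume = ("Software engineer with eight years of experience in backend "
                  "development, distributed systems and cloud infrastructure. "
                  "Led a team of five engineers migrating a monolith to "
                  "microservices.")
        intro = summarizer(resume, max_length=30, min_length=10)[0]["summary_text"]
        print(intro)   # candidate resume introduction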

  3. Attention based Knowledge Tracing in a language learning setting

    University essay from KTH/Skolan för elektroteknik och datavetenskap (EECS)

    Author : Sebastiaan Vergunst; [2022]
    Keywords : Knowledge Tracing; Exercise Recommendation; Personalised Learning; Recurrent Neural Network; Attention; Self-Attention; Exercise Embedding; Kunskapsspårning; Övningsrekommendation; Personligt Anpassad Inlärning; Rekurrenta Neurala Nätverk; Uppmärksamhet; Självuppmärksamhet; Övningsembedding;

    Abstract : Knowledge Tracing aims to predict the future performance of users of learning platforms from historical data by modeling their knowledge state. In this task, the target is a binary variable representing the correctness of the exercise, where an exercise is a word uttered by the user.
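
    The thesis's architecture is not detailed beyond "attention based" in this excerpt; the sketch below follows the general shape of self-attentive knowledge tracing (in the spirit of SAKT): embed each (exercise, correctness) interaction, self-attend over the history with a causal mask, and output a probability of answering correctly. All names and sizes are illustrative, and this is not the thesis's model:

        import torch
        import torch.nn as nn

        class TinyAttentiveKT(nn.Module):
            def __init__(self, n_exercises: int, d: int = 64):
                super().__init__()
                # One embedding per (exercise, correct/incorrect) interaction.
                self.interaction = nn.Embedding(2 * n_exercises, d)
                self.attn = nn.MultiheadAttention(d, num_heads=4, batch_first=True)
                self.out = nn.Linear(d, 1)

            def forward(self, exercises, correct):
                # exercises, correct: (batch, seq_len) integer tensors; correct is 0/1.
                h = self.interaction(exercises * 2 + correct)
                n = h.size(1)
                causal = torch.triu(torch.ones(n, n, dtype=torch.bool), diagonal=1)
                h, _ = self.attn(h, h, h, attn_mask=causal)   # no attending to the future
                # Output at step t is trained against the label at step t+1
                # (shifted targets), so the model never sees the label it predicts.
                return torch.sigmoid(self.out(h)).squeeze(-1)

        model = TinyAttentiveKT(n_exercises=100)
        ex = torch.randint(0, 100, (2, 10))
        ok = torch.randint(0, 2, (2, 10))
        print(model(ex, ok).shape)   # torch.Size([2, 10])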

  4. Using Bidirectional Encoder Representations from Transformers for Conversational Machine Comprehension

    University essay from KTH/Skolan för elektroteknik och datavetenskap (EECS)

    Author : Evangelina Gogoulou; [2019]
    Keywords : conversational machine comprehension; question answering; transformers; self-attention; language modelling; samtalsmaskinförståelse; frågesvar; transformatorer; självuppmärksamhet; språkmodellering;

    Abstract : Bidirectional Encoder Representations from Transformers (BERT) is a recently proposed language representation model, designed to pre-train deep bidirectional representations, with the goal of extracting context-sensitive features from an input text [1]. One of the challenging problems in the field of Natural Language Processing is Conversational Machine Comprehension (CMC).
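
    The thesis's conversational setup (multi-turn questions over a shared context) is not reproduced here; as a minimal single-turn extractive question-answering sketch with a BERT-style model via the transformers library, where the checkpoint is an illustrative default rather than the one the thesis fine-tuned:

        from transformers import pipeline

        qa = pipeline("question-answering",
                      model="distilbert-base-cased-distilled-squad")

        context = ("BERT is a language representation model that pre-trains deep "
                   "bidirectional representations from unlabeled text.")
        answer = qa(question="What does BERT pre-train?", context=context)
        print(answer["answer"], answer["score"])   # extracted span and confidence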