Essays about: "Språkmodell"

Showing results 1-5 of 38 essays containing the word Språkmodell.

  1. Topological regularization and relative latent representations

    University essay from KTH/Skolan för elektroteknik och datavetenskap (EECS)

    Author : Alejandro García Castellanos; [2023]
    Keywords : Algebraic Topology; Large Language Models; Relative Representation; Representation Learning; Model Stitching; Topological Data Analysis; Zero-shot; Algebraisk topologi; Stora språkmodeller; Relativ representation; Representationsinlärning; Modellsömmar; Topologisk dataanalys; Zero-shot;

    Abstract : This Master's Thesis delves into the application of topological regularization techniques and relative latent representations within the realm of zero-shot model stitching. Building upon the prior work of Moschella et al.

  2. Domain Knowledge and Representation Learning for Centroid Initialization in Text Clustering with k-Means : An exploratory study

    University essay from KTH/Skolan för elektroteknik och datavetenskap (EECS)

    Author : David Yu; [2023]
    Keywords : Natural language processing; Sentiment analysis; Clustering; Language model; Transformer; Heuristic; Språkteknologi; Sentimentanalys; Klustering; Språkmodell; Transformer; Heuristik;

    Abstract : Text clustering is a problem where texts are partitioned into homogeneous clusters, for example based on their sentiment value. Two techniques to address the problem are representation learning, in particular language representation models, and clustering algorithms.

  3. Language Models as Evaluators : A Novel Framework for Automatic Evaluation of News Article Summaries

    University essay from KTH/Skolan för elektroteknik och datavetenskap (EECS)

    Author : Celine Helgesson Hallström; [2023]
    Keywords : Natural Language Processing; Large Language Models; Automatic Text Evaluation; Text Summarization; Multilingualism; Naturlig Språkbehandling; Stora Språkmodeller; Automatisk Textutvärdering; Textsammanfattning; Flerspråkighet;

    Abstract : The advancements in abstractive summarization using Large Language Models (LLMs) have brought with them new challenges in evaluating the quality and faithfulness of generated summaries. This thesis explores a human-like automated method for evaluating news article summaries.

  4. Efficient Sentiment Analysis and Topic Modeling in NLP using Knowledge Distillation and Transfer Learning

    University essay from KTH/Skolan för elektroteknik och datavetenskap (EECS)

    Author : George Malki; [2023]
    Keywords : Large Language Model; RoBERTa; Knowledge distillation; Transfer learning; Sentiment analysis; Topic modeling; Stor språkmodell; RoBERTa; Kunskapsdestillation; Överföringsinlärning; Sentimentanalys; Ämnesmodellering;

    Abstract : This thesis presents a study in which knowledge distillation techniques were applied to a Large Language Model (LLM) to create smaller, more efficient models without sacrificing performance. Three configurations of the RoBERTa model were selected as "student" models to gain knowledge from a pre-trained "teacher" model.

  5. Contextual short-term memory for LLM-based chatbot

    University essay from KTH/Skolan för elektroteknik och datavetenskap (EECS)

    Author : Mikael Lauri Aleksi Törnwall; [2023]
    Keywords : Chatbot; Artificial Intelligence; Machine Learning; Language Model; Large Language Model; GPT-3; Natural Language Processing; Text Summarization; Dialogue Summarization; Prompt Design; Prompt Programming; Chatbot; Artificiell Intelligens; Maskininlärning; Språkmodell; Stor Språkmodell; GPT-3; Naturlig Språkbehandling; Textsammanfattning; Sammanfattning av Dialog; Design för Inmatningsprompt; Inmatningsprompt Programmering;

    Abstract : The evolution of Language Models (LMs) has enabled building chatbot systems that are capable of human-like dialogue without the need for fine-tuning the chatbot for a specific task. LMs are stateless, which means that an LM-based chatbot has no recollection of the past conversation unless it is explicitly included in the input prompt.