Essays about: "Coreference"

Showing results 1 - 5 of 9 essays containing the word Coreference.

  1. Domain-specific knowledge graph construction from Swedish and English news articles

    University essay from Uppsala universitet/Institutionen för lingvistik och filologi

    Author : Aleksandra Krupinska; [2023]
    Keywords : knowledge graph; information extraction; knowledge representation; Swedish; English;

    Abstract : In an age when new textual information emerges constantly, processing and structuring it poses a challenge. Moreover, this information is often expressed in many different languages, yet the discourse tends to be dominated by English, which may lead to overlooking important, specific knowledge in less well-resourced languages.

  2. Coreference Resolution for Swedish

    University essay from KTH/Skolan för elektroteknik och datavetenskap (EECS)

    Author : Lisa Vällfors; [2022]
    Keywords : Natural language processing; Information extraction; Machine learning; Random forests; Coreference resolution; Språkteknologi; informationsextraktion; maskininlärning; beslutsträdsinlärning; koreferenslösning;

    Abstract : This report explores possible avenues for developing coreference resolution methods for Swedish. Coreference resolution is an important topic within natural language processing, as it is used as a preprocessing step in various information extraction tasks.

  3. Prerequisites for Extracting Entity Relations from Swedish Texts

    University essay from KTH/Skolan för elektroteknik och datavetenskap (EECS)

    Author : Erik Lenas; [2020]
    Keywords : Machine Learning; Natural Language Processing; Relation Extraction; Named Entity Recognition; Coreference resolution; BERT; Maskininlärning; Natural Language Processing; Relationsextrahering; Named Entity Recognition; Coreference resolution; BERT;

    Abstract : Natural language processing (NLP) is a vibrant area of research with many practical applications today, such as sentiment analysis, text labeling, question answering, machine translation, and automatic text summarization. At the moment, research is mainly focused on the English language, although many other languages are trying to catch up.

  4. Using Bidirectional Encoder Representations from Transformers for Conversational Machine Comprehension

    University essay from KTH/Skolan för elektroteknik och datavetenskap (EECS)

    Author : Evangelina Gogoulou; [2019]
    Keywords : conversational machine comprehension; question answering; transformers; self-attention; language modelling; samtalsmaskinförståelse; frågesvar; transformatorer; självuppmärksamhet; språkmodellering;

    Abstract : Bidirectional Encoder Representations from Transformers (BERT) is a recently proposed language representation model, designed to pre-train deep bidirectional representations, with the goal of extracting context-sensitive features from an input text [1]. One of the challenging problems in the field of Natural Language Processing is Conversational Machine Comprehension (CMC).

  5. Neural Language Models with Explicit Coreference Decision

    University essay from Uppsala universitet/Institutionen för lingvistik och filologi

    Author : Jenny Kunz; [2019]
    Keywords : Coreference; Reference; Entity; Language Models; LM; Neural Networks; RNN; Attention; Deep Learning;

    Abstract : Coreference is an important and frequent concept in any form of discourse, and Coreference Resolution (CR) is a widely used task in Natural Language Understanding (NLU). In this thesis, we implement and explore two recent models that include the concept of coreference in Recurrent Neural Network (RNN)-based Language Models (LM).