
Showing results 1-5 of 19 essays matching the above criteria.

  1. A Study in Describing Complex Words Using Wikipedia's Categorisation System : Adding Descriptive Terms to Increase the Comprehension of Swedish Texts

    University essay from Linköpings universitet/Institutionen för datavetenskap

    Author : Sebastian Ragnarsson; [2023]
    Keywords : epithet; wikipedia; nlp; complex words; prototype theory;

    Abstract : This thesis offers new input in the field of generating epithets to aid the comprehension of Swedish texts. For whatever reason, a reader might find certain words in a text difficult to understand.

  2. Exploring the Correlation Between Reading Ability and Mathematical Ability : KTH Master thesis report

    University essay from KTH/Lärande

    Author : Richard Sol; Alexander Rasch; [2023]
    Keywords : Reading ability; Mathematical ability; Model optimization; Eye-tracking; Correlation between mathematics and reading; Machine-learning models; Reading Fluency; Reading comprehension; Formative assessment; Ordinal Regression; Spearman's correlation coefficient; Grid search;

    Abstract : Reading and mathematics are two essential subjects for academic success and cognitive development. Several studies show a correlation between the reading ability and mathematical ability of pupils (Korpershoek et al., 2015; Ní Ríordáin & O’Donoghue, 2009; Reikerås, 2006; Walker et al., 2008).

  3. Few-shot Question Generation with Prompt-based Learning

    University essay from Uppsala universitet/Institutionen för lingvistik och filologi

    Author : Yongchao Wu; [2022]
    Keywords : Natural Language Processing; Question Generation; Neural Networks; Prompt-based Learning; Language Models;

    Abstract : Question generation (QG), which automatically generates good-quality questions from a piece of text, is capable of lowering the cost of the manual composition of questions. Recently, question generation has attracted increasing interest for its ability to supply a large number of questions for developing conversation systems and educational applications, as well as corpus development for natural language processing (NLP) research tasks, such as question answering and reading comprehension.

  4. Zero-shot, One Kill: BERT for Neural Information Retrieval

    University essay from Uppsala universitet/Institutionen för lingvistik och filologi

    Author : Stergios Efes; [2021]
    Keywords : neural information retrieval; passage ranking; weak supervision; question answering; passage reranking; BERT; transfer-learning in IR; zero-shot IR; passage-retrieval; BERT for passage-retrieval; MS Marco; information retrieval; neural IR;

    Abstract : [Background]: The advent of bidirectional encoder representations from transformers (BERT) language models (Devlin et al., 2018) and MS Marco, a large-scale human-annotated dataset for machine reading comprehension (Bajaj et al., 2016) that was made publicly available, led the field of information retrieval (IR) to experience a revolution (Lin et al.

  5. Large-Context Question Answering with Cross-Lingual Transfer

    University essay from Uppsala universitet/Institutionen för informationsteknologi

    Author : Markus Sagen; [2021]
    Keywords : Long-Context Multilingual Model; Longformer XLM-R; Longformer; Long-term Context; Extending Context; Extend Context; Large-Context; Long-Context; Large Context; Long Context; Cross-Lingual; Multi-Lingual; Cross Lingual; Multi Lingual; QA; Question-Answering; Question Answering; Transformer model; Machine Learning; Transfer Learning; SQuAD; Memory; Efficient; Monolingual; Multilingual; QA model; Language Model; Huggingface; BERT; RoBERTa; XLM-R; mBERT; Multilingual BERT; Efficient Transformers; Reformer; Linformer; Performer; Transformer-XL; Wikitext-103; TriviaQA; HotpotQA; WikiHopQA; VINNOVA; Peltarion; AI; LM; MLM; Deep Learning; Natural Language Processing; NLP; Attention; Transformers; Datasets;

    Abstract : Models based around the transformer architecture have become one of the most prominent for solving a multitude of natural language processing (NLP) tasks since its introduction in 2017. However, much research related to the transformer model has focused primarily on achieving high performance, and many problems remain unsolved.