Essays about: "mBERT"
Showing results 1-5 of 12 essays containing the word mBERT.
1. Monolingual and Cross-Lingual Survey Response Annotation
University essay from Uppsala universitet/Institutionen för lingvistik och filologi.
Abstract: Multilingual natural language processing (NLP) is increasingly recognized for its potential in processing diverse text types, including social media posts, reviews, and technical reports. Multilingual language models such as mBERT and XLM-RoBERTa (XLM-R) play a pivotal role in multilingual NLP.
2. BERTie Bott’s Every Flavor Labels : A Tasty Guide to Developing a Semantic Role Labeling Model for Galician
University essay from Uppsala universitet/Institutionen för lingvistik och filologi.
Abstract: For the vast majority of languages, Natural Language Processing (NLP) tools are either absent entirely or leave much to be desired in their performance. Despite having nearly 4 million speakers, Galician is one such low-resource language.
3. Cross-Lingual and Genre-Supervised Parsing and Tagging for Low-Resource Spoken Data
University essay from Uppsala universitet/Institutionen för lingvistik och filologi.
Abstract: Dealing with low-resource languages is challenging because there is not enough data to train machine-learning models to make predictions on them. One way to address this problem is to use data from higher-resource languages, enabling the transfer of learning from those languages to the low-resource target ones.
4. Multilingual Transformer Models for Maltese Named Entity Recognition
University essay from Uppsala universitet/Institutionen för lingvistik och filologi.
Abstract: Recently developed state-of-the-art models for Named Entity Recognition depend heavily on huge amounts of annotated data. Consequently, it is extremely challenging for data-scarce languages to obtain significant results.
5. Analysis of Syntactic Behaviour of Neural Network Models by Using Gradient-Based Saliency Method : Comparative Study of Chinese and English BERT, Multilingual BERT and RoBERTa
University essay from Uppsala universitet/Institutionen för lingvistik och filologi.
Abstract: Neural network models such as Transformer-based BERT, mBERT and RoBERTa are achieving impressive performance (Devlin et al., 2019; Lewis et al., 2020; Liu et al., 2019; Raffel et al.