Essays about: "generative pre-trained models"
Showing results 1–5 of 17 essays containing the words "generative pre-trained models".
-
1. An In-Depth study on the Utilization of Large Language Models for Test Case Generation
University essay from Umeå universitet/Institutionen för datavetenskap. Abstract: This study investigates the utilization of Large Language Models for Test Case Generation. It uses the large language model and embedding model provided by Llama, specifically the 7B-parameter Llama 2, to generate test cases from a defined input.
-
2. Round-Trip Translation : A New Path for Automatic Program Repair using Large Language Models
University essay from KTH/Skolan för elektroteknik och datavetenskap (EECS). Abstract: Research shows that grammatical mistakes in a sentence can be corrected by machine-translating it into another language and back. We investigate whether this correction capability of Large Language Models (LLMs) extends to Automatic Program Repair (APR), a software engineering task.
-
3. Artificial Intelligence-driven web development and agile project management using OpenAI API and GPT technology : A detailed report on technical integration and implementation of GPT models in CMS with API and agile web development for quality user-centered AI chat service experience
University essay from Mittuniversitetet/Institutionen för data- och elektroteknik (2023-). Abstract: This graduation report explores the integration of Artificial Intelligence (AI) tools, specifically OpenAI's Generative Pre-trained Transformer (GPT) technology, into web development processes using WordPress (WP) to develop an AI-driven chat service. The project focuses on ImagineX AB, a private company that offers the educational service ChatGPT Utbildning, aimed at teaching professionals to use ChatGPT effectively.
-
4. Predicting the Unpredictable – Using Language Models to Assess Literary Quality
University essay from Uppsala universitet/Institutionen för lingvistik och filologi. Abstract: People read for various purposes, such as learning specific skills, acquiring foreign languages, and enjoying the pure reading experience. This kind of pure enjoyment may be attributed to many aspects, such as the aesthetics of language, the beauty of rhyme, and the entertainment of being surprised by what will happen next; the last of these is typically a feature of fictional narratives and is the main topic of this project.
-
5. Exploring GPT models as biomedical knowledge bases : By evaluating prompt methods for extracting information from language models pre-trained on scientific articles
University essay from KTH/Skolan för elektroteknik och datavetenskap (EECS). Abstract: Scientific findings recorded in the literature continuously help guide scientific advancement, but manual approaches to accessing that knowledge are insufficient given the sheer quantity of information and data available. Although pre-trained language models are being explored for their utility as knowledge bases and structured data repositories, there is a lack of research on this application in the biomedical domain.