Essays about: "Domain Adaptation"
Showing results 1 - 5 of 13 essays containing the words Domain Adaptation.
-
1. KARTAL: Web Application Vulnerability Hunting Using Large Language Models: A novel method for detecting logical vulnerabilities in web applications with fine-tuned Large Language Models
University essay from KTH/Skolan för elektroteknik och datavetenskap (EECS). Abstract: Broken Access Control is the most serious web application security risk, as published by the Open Worldwide Application Security Project (OWASP). This category includes highly complex vulnerabilities such as Broken Object Level Authorization (BOLA) and Exposure of Sensitive Information.
-
2. Semi-Supervised Domain Adaptation for Semantic Segmentation with Consistency Regularization: A learning framework under scarce dense labels
University essay from KTH/Skolan för elektroteknik och datavetenskap (EECS). Abstract: Learning from unlabeled data is a topic of critical significance in machine learning, as the large datasets required to train ever-growing models are costly and impractical to annotate. Semi-Supervised Learning (SSL) methods aim to learn from a few labels and a large unlabeled dataset.
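The SSL setup this abstract describes, combining a supervised loss on the few labeled samples with a consistency term on unlabeled ones, can be illustrated with a minimal sketch. All function names and the toy loss below are illustrative assumptions, not the thesis's actual method; a mean-squared consistency penalty between two augmented views stands in for whatever regularizer the work uses.

```python
import math

def softmax(logits):
    # Numerically stable softmax over a list of raw scores
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    s = sum(exps)
    return [e / s for e in exps]

def cross_entropy(probs, label):
    # Supervised loss: negative log-probability of the true class
    return -math.log(probs[label])

def consistency_loss(p_view_a, p_view_b):
    # Penalize disagreement between predictions on two augmented
    # views of the SAME unlabeled input (hypothetical regularizer)
    return sum((a - b) ** 2 for a, b in zip(p_view_a, p_view_b)) / len(p_view_a)

def ssl_loss(labeled, unlabeled_pairs, lam=1.0):
    """labeled: list of (logits, class_index) for annotated samples.
    unlabeled_pairs: list of (logits_view_a, logits_view_b) for
    two augmentations of each unlabeled sample."""
    sup = sum(cross_entropy(softmax(l), y) for l, y in labeled) / len(labeled)
    cons = sum(consistency_loss(softmax(a), softmax(b))
               for a, b in unlabeled_pairs) / len(unlabeled_pairs)
    return sup + lam * cons
```

The key design point is that the consistency term needs no labels, so the large unlabeled pool still contributes a training signal.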
-
3. Quantum Simulation of Quantum Effects in Sub-10-nm Transistor Technologies
University essay from Uppsala universitet/Materialteori. Abstract: In this master's thesis, a 2D device simulator running on a hybrid classical-quantum computer was developed. The simulator was developed to treat statistical quantum effects such as quantum tunneling and quantum confinement in nanoscale transistors.
-
4. A Digital Test Bench for Pneumatic Brakes
University essay from KTH/Spårfordon. Abstract: This master's thesis covers the structuring and implementation of a digital test bench for the air brake system of freight trains. The test bench will serve to further improve the existing brake models at Transrail Sweden AB. These are used for the optimised calculation of train speed profiles by the Driver Advisory System CATO.
-
5. Bidirectional Encoder Representations from Transformers (BERT) for Question Answering in the Telecom Domain: Adapting a BERT-like language model to the telecom domain using the ELECTRA pre-training approach
University essay from KTH/Skolan för elektroteknik och datavetenskap (EECS). Abstract: The Natural Language Processing (NLP) research area has seen notable advancements in recent years, one being the ELECTRA model, which improves the sample efficiency of BERT pre-training by introducing a discriminative pre-training approach. Most publicly available language models are trained on general-domain datasets.
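The discriminative pre-training idea mentioned above, ELECTRA's replaced-token detection, can be sketched at the data level: some tokens are swapped for plausible alternatives, and a discriminator is then trained to label every position as original or replaced (versus BERT's masked-language-model loss, which only learns from the masked ~15% of positions). The helper below is a simplified illustration using random replacements, not ELECTRA's actual learned generator.

```python
import random

def corrupt(tokens, vocab, mask_rate=0.3, rng=None):
    """Simplified generator step: replace a fraction of tokens with
    other vocabulary items. Returns the corrupted sequence and
    per-token labels (1 = replaced, 0 = original), which a
    discriminator would be trained to predict at EVERY position."""
    rng = rng or random.Random(0)
    corrupted, labels = [], []
    for tok in tokens:
        if rng.random() < mask_rate:
            replacement = rng.choice([v for v in vocab if v != tok])
            corrupted.append(replacement)
            labels.append(1)
        else:
            corrupted.append(tok)
            labels.append(0)
    return corrupted, labels
```

Because the binary original-vs-replaced loss covers all token positions rather than a masked subset, each training sequence yields more learning signal, which is the sample-efficiency gain the abstract refers to.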