Essays about: "redundancy in neural networks"

Showing results 1 - 5 of 9 essays containing the words "redundancy in neural networks".

  1. Visual Attention Guided Adaptive Quantization for x265 using Deep Learning

    University essay from KTH/Skolan för elektroteknik och datavetenskap (EECS)

    Author : Mikaela Gärde; [2023]
    Keywords : video encoding; deep learning; visual attention; adaptive quantization;

    Abstract : Video-on-demand streaming is rising drastically in popularity, bringing new challenges to the video coding field. There is a need for new video coding techniques that improve performance and reduce bitrates. READ MORE

  2. More efficient training using equivariant neural networks

    University essay from Uppsala universitet/Avdelningen Vi3

    Author : Karl Bylander; [2023]
    Keywords : convolutional neural networks; equivariance; equivariant neural networks; transmission electron microscopy; machine learning;

    Abstract : Convolutional neural networks are equivariant to translations; equivariance to other symmetries, however, is not defined and the class output may vary depending on the input's orientation. To mitigate this, the training data can be augmented at the cost of increased redundancy in the model. READ MORE

  3. Using Reinforcement Learning to Correct Soft Errors of Deep Neural Networks

    University essay from KTH/Skolan för elektroteknik och datavetenskap (EECS)

    Author : Yuhang Li; [2023]
    Keywords : DNN; Soft errors; Redundancy; DRL; DQN; Transfer learning; Training time;

    Abstract : Deep Neural Networks (DNNs) are becoming increasingly important in various aspects of human life, particularly in safety-critical areas such as autonomous driving and aerospace systems. However, soft errors, including bit-flips, can significantly impact the performance of these systems, leading to serious consequences. READ MORE

  4. Distillation or loss of information? : The effects of distillation on model redundancy

    University essay from Uppsala universitet/Institutionen för lingvistik och filologi

    Author : Eva Elzbieta Sventickaite; [2022]
    Keywords : distillation; distillation effects; distilbert; distilmbert; distilroberta; distilgpt-2; distilled neurons; redundancy; redundancy in neural networks; redundancy in language models; neuron reduction in language models; distilled language models;

    Abstract : The necessity for billions of parameters in large language models has lately been questioned, as there are still unanswered questions regarding how information is captured in the networks. It could be argued that without this knowledge, there may be a tendency to overparameterize the models. READ MORE

  5. Non-Destructive Biomass and Relative Growth Rate Estimation in Aeroponic Agriculture using Machine Learning

    University essay from Lunds universitet/Matematik LTH

    Author : Oskar Åström; [2022]
    Keywords : Machine Learning; Image Analysis; Aeroponics; Hydroculture; Relative Growth Rate; Multi-variate Regression; Neural Network; ResNet; Plant Growth; Plant Physiology; Technology and Engineering;

    Abstract : Optimising plant growth in a controlled climate requires good measurements of both biomass (measured in grams) and relative growth rate (measured in grams of growth per day and gram of plant). To obtain these efficiently and continuously at the individual level during plant development, measurements must be made non-destructively and without frequent, labor-intensive weighing of plant biomass. READ MORE