University essay from Mälardalens universitet/Akademin för innovation, design och teknik

Abstract: Since the mid-to-late 2010s, image synthesis using neural networks has become a trending research topic, and the framework most commonly used for these tasks is the generative adversarial network (GAN). A GAN consists of two networks, a generator and a discriminator, that train and compete against each other. Today's image-synthesis research focuses mostly on generating or altering images, which has applications in many fields, for example creating virtual environments. The topic is, however, still at an early stage of development, and there are areas where image synthesis with generative adversarial networks fails. In this work, we answer a thesis question regarding these limitations and discuss, for example, the limitation that causes GAN networks to get stuck during training. Besides the limitations of existing GAN models, the research also lacks more experimental GAN variants. Many variants exist today in which GAN has been further developed and modified, but when it comes to GAN models where the discriminator has been replaced by a different network, the number of existing works drops drastically. In this work, we experiment with and compare an existing deep convolutional generative adversarial network (DCGAN), a GAN variant, against one that we have modified using a deep neuro-fuzzy system. We have created the first DCGAN model that uses a deep neuro-fuzzy system as its discriminator. When comparing these models, we conclude that the performance differences are small, but we strongly believe that with further improvements our model can outperform the DCGAN model. This work therefore contributes to the research with results and knowledge of a possible improvement to DCGAN models, which in the future might lead to similar research on other GAN models.
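For readers unfamiliar with the adversarial setup the abstract describes, the following is a minimal sketch of the two-player training loop on a toy 1-D task. Everything here is an illustrative assumption rather than the thesis's model: the real data distribution, the affine generator, the logistic-regression discriminator, and all hyperparameters are made up for exposition, and this is not the DCGAN or the neuro-fuzzy discriminator from the work itself.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed toy setup: "real" data is 1-D samples from N(3, 0.5).
def sample_real(n):
    return rng.normal(3.0, 0.5, size=n)

# Generator: affine map of noise, g(z) = a*z + b.
g = {"a": 1.0, "b": 0.0}
# Discriminator: logistic regression, D(x) = sigmoid(w*x + c).
d = {"w": 0.0, "c": 0.0}

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def generate(n):
    z = rng.normal(size=n)
    return g["a"] * z + g["b"], z

lr = 0.05
for step in range(500):
    # Discriminator step: ascend log D(real) + log(1 - D(fake)).
    real = sample_real(32)
    fake, _ = generate(32)
    p_real = sigmoid(d["w"] * real + d["c"])
    p_fake = sigmoid(d["w"] * fake + d["c"])
    d["w"] += lr * (np.mean((1 - p_real) * real) - np.mean(p_fake * fake))
    d["c"] += lr * (np.mean(1 - p_real) - np.mean(p_fake))

    # Generator step: ascend log D(fake) (the non-saturating loss),
    # chaining the gradient through D and the affine generator.
    fake, z = generate(32)
    p_fake = sigmoid(d["w"] * fake + d["c"])
    g["a"] += lr * np.mean((1 - p_fake) * d["w"] * z)
    g["b"] += lr * np.mean((1 - p_fake) * d["w"])

samples, _ = generate(1000)
print(f"generated sample mean: {samples.mean():.2f}")
```

The alternation is the essential point: each discriminator update sharpens the real-versus-fake boundary, and each generator update moves the fake samples toward the region the discriminator currently labels real. The abstract's thesis keeps this loop intact and swaps only the discriminator network (here plain logistic regression; in the thesis, a deep neuro-fuzzy system replacing the DCGAN discriminator).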
