Applying Similarity Condition Embedding Network to an industrial fashion dataset

University essay from KTH/Skolan för elektroteknik och datavetenskap (EECS)

Author: Viktor Törnegren; [2020]


Abstract: To create a fashionable outfit one needs to take several different similarity conditions between clothing items into account, such as season, colour, and the context the outfit is supposed to be worn in. This is of course a hard task for a human, but an even harder task for a computer. To make an algorithm take different similarity conditions into account from images, Veit, Belongie, and Karaletsos [1] and Vasileva et al. [2] introduced two different models that utilize predefined similarity conditions. Tan et al. [3] took inspiration from [1, 2] and created an algorithm that learns the similarity conditions in an unsupervised way, testing their model on a dataset containing outfits created by regular people. In this thesis we present a new fashion dataset that has been created with the help of fashion experts from Hennes & Mauritz AB. We provide evidence that our reimplementation of the Similarity Condition Embedding Network (SCE-net) from [3] can pick out garments that complete an outfit, as well as evaluate whether the clothing items in an outfit are compatible, on data that contains outfits for both men and women. We also show that the SCE-net can be trained on outfits for one gender and then predict on a dataset containing clothes for another gender. We further provide results showing that our network generalizes well to unseen categories by training it on outfits without accessories and then testing it on outfits with accessories. In addition, we introduce a dataset that contains shopping baskets of customers from Hennes & Mauritz's online shop as well as its boutiques. On this data we provide evidence that our reimplementation of the SCE-net can predict the next item in a customer's shopping basket.
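To give a flavour of the conditional-embedding idea the abstract refers to, here is a minimal NumPy sketch: a general item embedding is projected into several condition subspaces, a weight branch (here just random, stand-in parameters) decides how much each condition matters for a given pair, and compatibility is the distance between the weighted embeddings. All dimensions, parameter names, and the random "learned" weights are purely illustrative; this is not the thesis's actual SCE-net implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

D, C = 8, 3                        # embedding dim, number of similarity conditions
P = rng.normal(size=(C, D, D))     # one linear projection per condition (illustrative)
W = rng.normal(size=(2 * D, C))    # stand-in parameters of the condition-weight branch


def softmax(z):
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()


def condition_embedding(x, w):
    """Weighted sum of the per-condition projections of a general embedding x."""
    subspaces = np.stack([P[c] @ x for c in range(C)])  # (C, D)
    return (w[:, None] * subspaces).sum(axis=0)         # (D,)


def compatibility(x, y):
    """Distance between two items under pair-dependent condition weights.

    Smaller distance = more compatible garments.
    """
    w = softmax(np.concatenate([x, y]) @ W)  # (C,) weights inferred from the pair
    return np.linalg.norm(condition_embedding(x, w) - condition_embedding(y, w))


# Toy usage: two random "garment" embeddings.
x = rng.normal(size=D)
y = rng.normal(size=D)
score = compatibility(x, y)
```

In the real model the projections and weight branch are trained end-to-end from outfit data, so the conditions emerge without explicit supervision; the sketch only shows how the pieces fit together at inference time.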
