Essays about: "Latent Space"
Showing results 11-15 of 74 essays containing the words "Latent Space".
-
11. Learning the shapes of protein pockets
University essay from Göteborgs universitet / Institutionen för data- och informationsteknik. Abstract: The comparison of protein pockets plays an important role in drug discovery. By identifying binding sites with similar structures, we can assist in finding hits and in characterizing the function of proteins. Traditionally, the geometry of cavities has been described with scalar features, which do not fully capture the shape.
-
12. Distance preserving Fermat VAE
University essay from KTH / Skolan för elektroteknik och datavetenskap (EECS). Abstract: Deep neural networks draw their strength from the representations, or features, that they build internally. While these internal encodings help networks perform classification or regression tasks on specific data types, there exists a branch of machine learning whose sole purpose is to build these representations.
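The "internal encodings" this abstract refers to can be pictured as the bottleneck activations of an autoencoder. The following minimal numpy sketch is illustrative only (the architecture, dimensions, and weights are assumptions, not taken from the thesis): the latent code is simply the hidden-layer output of the encoder.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy untrained encoder/decoder weights (sizes are illustrative assumptions).
W_enc = rng.normal(size=(8, 2))   # input dim 8 -> latent dim 2
W_dec = rng.normal(size=(2, 8))   # latent dim 2 -> input dim 8

def encode(x):
    """Map an input vector to its internal representation (latent code)."""
    return np.tanh(x @ W_enc)

def decode(z):
    """Reconstruct the input from a latent code."""
    return z @ W_dec

x = rng.normal(size=(8,))
z = encode(x)        # the learned "feature" the abstract alludes to
x_hat = decode(z)
print(z.shape)       # the latent code lives in a 2-D space
```

In representation learning, training would adjust `W_enc` and `W_dec` so that `x_hat` approximates `x`; here the weights are random purely to show where the latent code sits in the computation.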
-
13. Towards topology-aware Variational Auto-Encoders : from InvMap-VAE to Witness Simplicial VAE
University essay from KTH / Skolan för elektroteknik och datavetenskap (EECS). Abstract: Variational Auto-Encoders (VAEs) are among the best-known deep generative models. After showing that standard VAEs may not preserve the topology, that is, the shape of the data, between the input and the latent space, we tried to modify them so that the topology is preserved.
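One simple way to quantify whether an encoding preserves the shape of the data, in the spirit of the abstract's concern but not the thesis's own method, is to compare pairwise distances before and after encoding with a rank correlation. The sketch below uses plain numpy and a random linear projection as a stand-in "latent" map; all names and sizes are illustrative assumptions.

```python
import numpy as np

def pairwise_dists(X):
    """Condensed pairwise Euclidean distances between rows of X."""
    diff = X[:, None, :] - X[None, :, :]
    d = np.sqrt((diff ** 2).sum(-1))
    iu = np.triu_indices(len(X), k=1)
    return d[iu]

def rank_correlation(a, b):
    """Spearman rank correlation, computed with plain numpy."""
    ra = np.argsort(np.argsort(a)).astype(float)
    rb = np.argsort(np.argsort(b)).astype(float)
    ra -= ra.mean()
    rb -= rb.mean()
    return float((ra @ rb) / np.sqrt((ra @ ra) * (rb @ rb)))

rng = np.random.default_rng(1)
X = rng.normal(size=(50, 10))          # inputs
Z = X @ rng.normal(size=(10, 2))       # stand-in "latent" projection
score = rank_correlation(pairwise_dists(X), pairwise_dists(Z))
print(score)  # near 1 means neighbourhood structure is largely preserved
```

A score near 1 indicates the encoding keeps nearby points nearby; topology-aware VAE variants aim to enforce this kind of structure during training rather than merely measure it afterwards.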
-
14. Representation learning for single cell morphological phenotyping
University essay from Umeå universitet / Institutionen för fysik. Abstract: Preclinical research for developing new drugs is a long and expensive procedure. Experiments relying on image acquisition and analysis tend to be low-throughput and to use reporter systems that may influence the studied cells.
-
15. Attribute Embedding for Variational Auto-Encoders : Regularization derived from triplet loss
University essay from KTH / Skolan för elektroteknik och datavetenskap (EECS). Abstract: Techniques for imposing structure on the latent space of neural networks have seen much development in recent years. Clustering techniques for classification have been applied with great success, and with this work we hope to bridge the gap between contrastive losses and generative models.
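The triplet loss named in this title is a standard contrastive objective: given an anchor, a positive example of the same class, and a negative example of a different class, it penalizes embeddings where the anchor is not at least a margin closer to the positive than to the negative. A minimal numpy sketch of that standard formulation (the margin value and example points are illustrative assumptions, not from the thesis):

```python
import numpy as np

def triplet_loss(anchor, positive, negative, margin=1.0):
    """Standard triplet loss: max(0, d(a, p) - d(a, n) + margin).
    Pulls the anchor toward the positive and pushes it at least
    `margin` further away from the negative."""
    d_pos = np.linalg.norm(anchor - positive)
    d_neg = np.linalg.norm(anchor - negative)
    return max(0.0, d_pos - d_neg + margin)

a = np.array([0.0, 0.0])
p = np.array([0.1, 0.0])   # same attribute: close to anchor
n = np.array([2.0, 0.0])   # different attribute: far from anchor
print(triplet_loss(a, p, n))  # 0.0 -- this triplet already satisfies the margin
```

Used as a regularizer on a VAE's latent codes, a term like this encourages samples sharing an attribute to cluster together in the latent space while remaining separated from other attributes.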