Segmentation of the Common Carotid Artery from Ultrasound Images using UNet
Abstract: The common carotid artery (CCA), one of the main arteries supplying the brain with oxygenated blood, is of great importance in stroke research. The artery is usually monitored with ultrasound (US) imaging, and an automated way of segmenting the artery in US images is desired to streamline research. With the recent upswing in the development of convolutional neural networks (CNNs), such automation is now possible: a CNN uses a training dataset to tune a set of trainable parameters so that it can segment the CCA in new images similar to the training data. However, US images vary considerably from patient to patient, since the artery and the surrounding tissue and veins differ between individuals. CNNs typically handle such variation by learning from a large training dataset, but no large dataset of segmented CCAs exists, which makes segmenting the CCA with a CNN challenging. In this thesis, a case study was performed with the fully convolutional network architecture UNet, trained on fewer than 200 US images of the CCA, to study whether the network can segment the CCA in new images. The network's output was compared to an expert's segmentation as the ground truth. The conclusion is that the network shows promising results, with an average Dice similarity coefficient of 0.871. Two key limitations were identified: first, images whose greyscale distribution differs markedly from the training data need to be preprocessed, and second, artifacts need to be reduced to obtain a good segmentation. The study supports the use of UNet for segmenting the CCA in US images; with further development of postprocessing, a reliable way to segment the CCA in US images is possible.
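The reported performance figure is the Dice similarity coefficient, which measures the overlap between the predicted segmentation mask and the expert's ground-truth mask. As a minimal sketch (not the thesis's own code), it can be computed for two binary masks with NumPy as follows:

```python
import numpy as np

def dice_coefficient(pred, truth):
    """Dice similarity coefficient between two binary masks:
    DSC = 2 * |A intersect B| / (|A| + |B|), ranging from 0 (no
    overlap) to 1 (perfect overlap)."""
    pred = np.asarray(pred, dtype=bool)
    truth = np.asarray(truth, dtype=bool)
    intersection = np.logical_and(pred, truth).sum()
    total = pred.sum() + truth.sum()
    if total == 0:
        # Both masks empty: conventionally defined as perfect overlap.
        return 1.0
    return 2.0 * intersection / total

# Illustrative example: two small masks with partial overlap.
a = np.array([[1, 1, 0, 0],
              [1, 1, 0, 0],
              [0, 0, 0, 0],
              [0, 0, 0, 0]])
b = np.array([[1, 1, 0, 0],
              [1, 0, 0, 0],
              [0, 0, 0, 0],
              [0, 0, 0, 0]])
print(dice_coefficient(a, b))  # 3 overlapping pixels: 2*3/(4+3) ≈ 0.857
```

A reported average of 0.871 on this scale therefore indicates a high, though not perfect, agreement between the network's output and the expert segmentation.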