Post-processing of optical character recognition for Swedish addresses

University essay from KTH/Skolan för elektroteknik och datavetenskap (EECS)

Abstract: Optical character recognition (OCR) has many applications, such as digitizing historical documents, automating processes, and helping visually impaired people read. However, extracting text from images into a digital format is not an easy problem to solve, and the output from OCR frameworks often contains errors. The complexity comes from the many variations in (digital) fonts, handwriting, lighting, etc. To tackle this problem, this thesis investigates two different methods for correcting the errors in OCR output. The dataset used consists of Swedish addresses, and the methods are therefore applied to postal automation, investigating whether they can further automate postal work by automatically reading addresses on parcels using OCR. The main method, the lexical implementation, uses a dataset of Swedish addresses such that any valid address should appear in this dataset (hence there is a known and limited vocabulary); misspelled addresses are corrected to the address in the lexicon with the smallest Levenshtein distance. The second approach uses the same dataset, but with artificial errors, or artificial noise, added. The addresses with this artificial noise are then paired with their correct spellings to train a machine learning model based on neural machine translation (NMT) to automatically correct errors in OCR-read addresses. The results from this study could help define the direction of future work on OCR and postal addresses. The results show that the lexical implementation outperformed the NMT model. However, more experiments including real data would be required to draw definitive conclusions about how the methods would perform in real-life applications.
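
The lexical method described above can be illustrated with a minimal sketch: a plain dynamic-programming Levenshtein distance and a nearest-match lookup over an in-memory lexicon. The function and variable names (levenshtein, correct_address, lexicon) and the sample addresses are illustrative assumptions, not the thesis's actual implementation.

    def levenshtein(a: str, b: str) -> int:
        # Classic dynamic-programming edit distance with unit costs
        # for insertion, deletion and substitution.
        prev = list(range(len(b) + 1))
        for i, ca in enumerate(a, start=1):
            curr = [i]
            for j, cb in enumerate(b, start=1):
                cost = 0 if ca == cb else 1
                curr.append(min(prev[j] + 1,          # deletion
                                curr[j - 1] + 1,      # insertion
                                prev[j - 1] + cost))  # substitution
            prev = curr
        return prev[-1]

    def correct_address(ocr_output: str, lexicon: list[str]) -> str:
        # Correct an OCR-read address to the lexicon entry with the
        # smallest edit distance (a known and limited vocabulary).
        return min(lexicon, key=lambda addr: levenshtein(ocr_output, addr))

    lexicon = ["Drottninggatan 5, 111 51 Stockholm",
               "Storgatan 12, 852 30 Sundsvall"]
    print(correct_address("Drottnlnggatan 5, 111 5l Stockholm", lexicon))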
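
For the second approach, training pairs for the NMT corrector are produced by adding artificial noise to clean addresses. The sketch below assumes simple character-level deletions, substitutions and insertions at a fixed error rate; the noise rate, character set and helper name (add_noise) are assumptions for illustration, not the noise model used in the thesis.

    import random

    ALPHABET = "abcdefghijklmnopqrstuvwxyzåäö 0123456789"

    def add_noise(address: str, error_rate: float = 0.1) -> str:
        # Corrupt a clean address with random deletions, substitutions and
        # insertions, yielding (noisy, clean) pairs for a sequence-to-sequence model.
        out = []
        for ch in address:
            r = random.random()
            if r < error_rate / 3:
                continue                             # deletion
            elif r < 2 * error_rate / 3:
                out.append(random.choice(ALPHABET))  # substitution
            elif r < error_rate:
                out.append(ch)
                out.append(random.choice(ALPHABET))  # insertion
            else:
                out.append(ch)                       # keep character unchanged
        return "".join(out)

    clean = "storgatan 12, 852 30 sundsvall"
    print((add_noise(clean), clean))  # one (noisy, clean) training pair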
