Compression of Large-Scale Aerial Imagery: Exploring Set Redundancy Methods

University essay from Högskolan i Gävle/Datavetenskap

Abstract: Data compression has always been important: ever more data is produced and has to be stored. Even as hardware technology advances, compression remains essential to reduce the storage occupied and to keep data as small as possible during transmission. Set redundancy was developed in 1996 but has since received little attention in research. This paper implements two set redundancy methods, Max-Min-Predictive II and the Intensity Mapping algorithm, to investigate whether this approach can be used on large-scale aerial imagery in the geodata field. After applying the set redundancy methods, different individual image compression methods were applied and compared to the standard JPEG2000 in lossless mode. These compression algorithms were Huffman, LZW, and JPEG2000 itself. The data sets used were pairs of images taken in 2019: one pair with 60% overlap and the other with 80% overlap. Individual compression of the images still offers a better compression ratio, but the set redundancy methods produce results that are worth investigating further with larger sets of similar images. This points to future work on compressing a larger set with more overlap and more images, which should be overlaid more carefully to ensure matching pixel values and thereby greater matching potential.
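The core idea behind set redundancy, as the abstract describes it, is to exploit similarity across a set of overlapping images before compressing each one individually. The following is a minimal illustrative sketch of that idea in Python; it uses a simplified per-pixel minimum predictor (closer in spirit to the basic Min-Max Differential scheme than to the Max-Min-Predictive II variant the essay actually implements), and the function names are hypothetical:

```python
import numpy as np

def set_redundancy_residuals(images):
    """Illustrative set-redundancy transform (simplified): predict every
    pixel from the per-pixel minimum over the aligned set and keep only
    the residuals. For similar, well-registered images the residuals
    cluster near zero, so a later entropy coder (e.g. Huffman or LZW)
    compresses them better than the raw pixels."""
    stack = np.stack(images).astype(np.int16)   # widen to avoid uint8 wraparound
    min_image = stack.min(axis=0)               # shared predictor, stored once per set
    residuals = stack - min_image               # small, non-negative residuals
    return min_image, residuals

def reconstruct(min_image, residuals):
    """Lossless inverse: add the shared predictor back to each residual image."""
    return (residuals + min_image).astype(np.uint8)
```

A usage sketch: feed the function two aligned aerial tiles, entropy-code `min_image` plus the residual planes instead of the originals, and invert with `reconstruct` on decompression. The transform itself is exactly reversible; the compression gain comes entirely from the residuals having lower entropy than the source pixels.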
