Testing data logging tools in DataOps for Digital Twins

University essay from Karlstads universitet/Institutionen för matematik och datavetenskap (from 2013)

Author: Adrian Bakken Sundmoen; [2023]


Abstract: Industry 4.0 and a global trend in digital transformation have brought new ideas and emerging technologies to the surface. Data has become a key asset for businesses, and streamlining data and automating data life cycles have become increasingly important. This industrial revolution is centered around cyber-physical systems, and it sets forth that new technologies will change how a business traditionally operates. However, the problem is a lack of tools, systems, and methods to realize this revolution. Thus, there is a strong demand for finding solutions that move businesses toward Industry 4.0. A new technology known as Digital Twin (DT) has emerged from this. This technology aims to improve the business value of big data by digitally representing physical entities. To operate successfully with this technology, other enabling technologies and tools are needed, providing DTs with high-quality data that accurately represent the system in which the twin models are used. This can be a problem, as the data might originate from different sources and often do not follow the same formats and standards. Furthermore, data must also be collected in a timely manner. To deal with problems such as these, a new term known as Data Operations (DataOps) has surfaced. DataOps is a set of practices and processes that aims to improve the communication, integration, and automation of data flow within data landscapes and organizations. This thesis introduces a methodology to investigate whether a standardized data logging tool can be used as a DataOps solution to collect, process, and make data available for DTs. This is done by investigating the current literature and applying testing methodologies to the tool. More specifically, a combination of load, performance, and stress tests is performed to assess the ability of the tool to collect large amounts of data. The focus is on investigating whether this can be done in a timely manner.
It is concluded that the tool does possess features that are of importance for DataOps and DTs, and that it could, on its own, be a viable option for data gathering for certain DTs. However, as a result of the internal mechanics of the tool, it is not timely enough for use as a DataOps solution in general. Further research regarding improvements of its timeliness, other similar tools, and testing in a real environment consisting of a real DT is proposed and motivated.
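The kind of load and performance test described in the abstract — pushing many samples through an ingest path and measuring throughput and per-sample latency — could be sketched roughly as follows. This is a minimal illustration only, not the thesis's actual test setup: the `ingest` function and `load_test` harness are hypothetical stand-ins for the data logging tool under test, and the synthetic sensor samples are invented for the example.

```python
import time
import random
from statistics import mean, quantiles

def ingest(sample, buffer):
    """Hypothetical stand-in for the logging tool's ingest API:
    records each sample together with its arrival timestamp."""
    buffer.append((sample, time.perf_counter()))

def load_test(num_samples=10_000):
    """Send synthetic sensor samples through the ingest path and
    report throughput and per-sample latency statistics."""
    buffer = []
    latencies = []
    start = time.perf_counter()
    for i in range(num_samples):
        # Synthetic sensor reading; sensor_id and value are invented.
        sample = {"sensor_id": i % 16, "value": random.random()}
        sent = time.perf_counter()
        ingest(sample, buffer)
        latencies.append(time.perf_counter() - sent)
    elapsed = time.perf_counter() - start
    return {
        "throughput_per_s": num_samples / elapsed,
        "mean_latency_s": mean(latencies),
        # quantiles(n=100) yields 99 cut points; index 98 is the 99th percentile.
        "p99_latency_s": quantiles(latencies, n=100)[98],
    }

if __name__ == "__main__":
    print(load_test())
```

A stress test would follow the same shape but raise `num_samples` (or the arrival rate) until throughput degrades or latency exceeds the DT's timeliness requirement; the tail latency (p99) rather than the mean is usually what determines whether data arrives "in a timely manner."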
