An evaluation of tools for verifying non-functional requirements for cloud deployed applications.

University essay from Umeå universitet/Institutionen för datavetenskap

Author: Jonas Ernerstedt; [2023]


Abstract: Evaluating non-functional requirements is a crucial part of developing distributed systems. As cloud-deployed systems are continuously developed, the quality of the system must be verified in every iteration. To measure the non-functional requirements performance and latency, the open-source load testing tools k6, Locust and Taurus are evaluated and compared. The comparison criteria are throughput, execution time, average response time, the granularity of time measurement, and whether the tool can be extended to use custom protocols. Two experiments were designed. The first ran a static load of 500000 requests against a remote RESTful API, recording run time, throughput and latency; it was executed on three different levels of hardware. The second attempted to find the maximum throughput from a single node by increasing the number of virtual users. In all experiments, k6 performed best, achieving high throughput on a system with low computing power and remaining ahead on systems with high computing power. Locust performed well with high computing power and was still ahead of Taurus with low computing power. Taurus did not outperform the other tools in any of the experiments. According to the results, the best-performing and most well-rounded tool is k6; however, Locust can also be considered when hardware is not the limiting factor, since it runs in Python and is therefore more easily modified.
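The second experiment's idea of ramping up virtual users until a single node's throughput stops improving can be sketched as follows. This is a minimal illustration, not the thesis's actual test harness: `fake_request` is a hypothetical stand-in for the HTTP call to the remote RESTful API, and the thread-based "virtual users" only approximate how tools like k6 or Locust schedule concurrent load.

```python
import time
from concurrent.futures import ThreadPoolExecutor

def fake_request():
    """Hypothetical stand-in for an HTTP request to the API under test.
    The thesis's experiments issued real requests to a remote endpoint."""
    time.sleep(0.001)  # simulate ~1 ms of network latency
    return 200

def measure_throughput(virtual_users, duration_s=1.0):
    """Run `virtual_users` concurrent workers for `duration_s` seconds
    and return the number of completed requests per second."""
    deadline = time.monotonic() + duration_s

    def worker(_):
        count = 0
        while time.monotonic() < deadline:
            if fake_request() == 200:
                count += 1
        return count

    with ThreadPoolExecutor(max_workers=virtual_users) as pool:
        completed = sum(pool.map(worker, range(virtual_users)))
    return completed / duration_s

if __name__ == "__main__":
    # Ramp the number of virtual users, mirroring the search for the
    # maximum throughput a single node can sustain.
    for vus in (1, 10, 50):
        print(f"{vus} VUs -> {measure_throughput(vus):.0f} req/s")
```

In a real k6 or Locust run the ramp-up is declared in the tool's configuration (stages of virtual users) rather than hand-rolled with threads, and throughput plateaus once the load generator itself becomes the bottleneck.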
