CSPA - Continuous Software Performance Assessment

This master project is not available anymore, as students are currently working on it.


Unit testing has grown into an established software-engineering practice, popularized for example through test-driven development. Combined with continuous delivery, testing forms a rigorous methodology for detecting software failures as early as possible. Performance testing, however, has not yet reached a comparable level of integration into continuous-delivery pipelines.

Goals of this master project

Addressing these gaps, we envision tooling that brings performance testing in the form of software microbenchmarks (e.g., JMH for Java, and Go benchmarking) into the world of continuous delivery. The students are expected to design and implement a Jenkins plugin that continuously (e.g., on every commit) tests a software's performance. The plugin should take parameters that define how tests are executed, and different test environments should be selectable. These environments should range from bare-metal machines to virtualized environments with VMs and containers hosted on different cloud providers (e.g., Amazon EC2, Azure, and Google Compute Engine). Furthermore, test results are automatically evaluated for regressions, and outcomes are saved in a database that allows for historical performance-regression analysis.
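To illustrate the kind of microbenchmark the envisioned plugin would execute, the following sketch uses Go's built-in benchmarking harness from the standard `testing` package; the `sortInts` workload and the function names are hypothetical examples, not part of the project specification:

```go
package main

import (
	"fmt"
	"sort"
	"testing"
)

// sortInts is a hypothetical workload under test: it builds and
// sorts a reversed slice of n integers.
func sortInts(n int) {
	xs := make([]int, n)
	for i := range xs {
		xs[i] = n - i
	}
	sort.Ints(xs)
}

// runBenchmark measures sortInts with Go's benchmarking harness.
// testing.Benchmark runs the body repeatedly (increasing b.N) until
// the timing stabilizes, just as `go test -bench` would.
func runBenchmark() testing.BenchmarkResult {
	return testing.Benchmark(func(b *testing.B) {
		for i := 0; i < b.N; i++ {
			sortInts(1000)
		}
	})
}

func main() {
	res := runBenchmark()
	fmt.Printf("%d iterations, %d ns/op\n", res.N, res.NsPerOp())
}
```

Collecting such per-commit `ns/op` measurements is what enables the regression analysis and historical storage described below.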

Task description

The main tasks of the project are (depending on the number of students, this list is adaptable):

  • Explore the current state of performance testing with software microbenchmarks
  • Familiarise themselves with Jenkins plugin development
  • Learn about Infrastructure-as-Code tooling (e.g., Chef, Puppet, Ansible, Docker) to set up performance testing environments
  • Learn about cloud instances (i.e., IaaS clouds from Amazon, Microsoft, and Google) and how to select the right one for performance-test execution
  • Design a solution that is able to continuously execute performance tests on different environments
  • The solution should be able to execute performance tests in Java (JMH) and Go
  • Perform and (re)evaluate statistical tests for regression detection
  • Save performance-test and regression analysis results into a database
  • Enable long-term historical performance analysis through the tooling
  • Write a report summarizing the results of the master project, and present it to the assistant/professor
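For the regression-detection task, a minimal sketch of a statistical comparison between two sets of benchmark timings (e.g., ns/op from the previous and the current commit) might look as follows. This is a simplified Welch's t-test in Go; the function names, the sample data, and the fixed threshold of 2.0 are illustrative assumptions — a real implementation would look up the critical value from the t distribution with Welch's degrees of freedom:

```go
package main

import (
	"fmt"
	"math"
)

// meanVar returns the sample mean and the unbiased sample variance.
func meanVar(xs []float64) (m, v float64) {
	for _, x := range xs {
		m += x
	}
	m /= float64(len(xs))
	for _, x := range xs {
		v += (x - m) * (x - m)
	}
	v /= float64(len(xs) - 1)
	return
}

// welchT computes Welch's t statistic comparing two timing samples.
// A large positive value means the second sample is slower on average.
func welchT(old, cur []float64) float64 {
	mo, vo := meanVar(old)
	mc, vc := meanVar(cur)
	se := math.Sqrt(vo/float64(len(old)) + vc/float64(len(cur)))
	return (mc - mo) / se
}

// isRegression flags a slowdown when the current sample's mean is
// significantly higher. The threshold 2.0 is a rough stand-in for the
// two-sided 5% critical value at moderate sample sizes.
func isRegression(old, cur []float64) bool {
	return welchT(old, cur) > 2.0
}

func main() {
	old := []float64{100, 102, 98, 101, 99, 100, 103, 97}
	slow := []float64{120, 118, 122, 121, 119, 120, 123, 117}
	fmt.Println(isRegression(old, slow)) // clear slowdown: flagged
	fmt.Println(isRegression(old, old))  // identical samples: not flagged
}
```

Storing the raw samples rather than only the test verdicts would also support the re-evaluation of statistical tests and the long-term historical analysis mentioned in the task list.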


This project is also available as a master thesis with reduced scope and a slightly different focus.


  • [1] Leitner and Bezemer - An Exploratory Study of the State of Practice of Performance Testing in Java-Based Open Source Projects
  • [2] Laaber et al. - Performance Testing in the Cloud. How Bad is it Really?

Posted: 12.03.2018

Contact: Christoph Laaber