One of the challenges of modern cloud application development is managing deployment costs. Automated and semi-automated scaling, pay-per-use pricing, complex multi-factor billing models, and frequent market changes make it difficult for developers to predict the actual cost of using a given (combination of) cloud services to host an application while meeting a minimal performance level (e.g., response time). The primary contribution of MInCA will be an approach to model, predict, and optimize the deployment costs of cloud applications, with a specific focus on (micro-)service-based applications. Central to our approach is the notion of a holistic cost model (HCM), a multi-view architectural model of the application that integrates the dependencies between services, the deployment of individual services onto cloud resources, and the application workload (e.g., which load is on which service at what times). We will use the HCM to foster developer awareness of the cost impact of their code changes (e.g., through appropriate visualizations in the Integrated Development Environment, IDE), to support what-if analysis ("What happens if the application gets 50% more users?"), and to enable application-wide deployment optimization (i.e., selecting optimal deployment options for each service depending on global state and available cloud services).
Duration: September 2016 - November 2019
Funding: SNF (Total Costs: 182.963 CHF)
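To illustrate the HCM idea, the following is a minimal sketch of how a service dependency graph, per-service deployment options, and a workload could be combined into a cost estimate and a what-if analysis. All service names, instance types, prices, and capacities are hypothetical and for illustration only; they are not MInCA's actual model.

```python
import math
from dataclasses import dataclass

@dataclass
class Deployment:
    instance_type: str
    hourly_cost: float   # illustrative price of one instance, per hour
    capacity_rps: float  # requests/second one instance can sustain

@dataclass
class Service:
    name: str
    deployment: Deployment
    calls: dict  # downstream service name -> calls issued per incoming request

def instances_needed(service, rps):
    """Minimum instance count to serve `rps` requests per second."""
    return max(1, math.ceil(rps / service.deployment.capacity_rps))

def propagate_load(services, entry, entry_rps):
    """Push the entry-point workload through the (acyclic) dependency graph."""
    load = {name: 0.0 for name in services}
    stack = [(entry, entry_rps)]
    while stack:
        name, rps = stack.pop()
        load[name] += rps
        for callee, factor in services[name].calls.items():
            stack.append((callee, rps * factor))
    return load

def hourly_cost(services, entry, entry_rps):
    """Total hourly deployment cost of the application for a given workload."""
    load = propagate_load(services, entry, entry_rps)
    return sum(instances_needed(s, load[s.name]) * s.deployment.hourly_cost
               for s in services.values())

# Hypothetical two-service application: each frontend request issues
# two calls to the orders service.
services = {
    "frontend": Service("frontend", Deployment("t3.small", 0.02, 100), {"orders": 2.0}),
    "orders":   Service("orders",   Deployment("t3.medium", 0.04, 80), {}),
}

base = hourly_cost(services, "frontend", 150)           # current workload
what_if = hourly_cost(services, "frontend", 150 * 1.5)  # "50% more users"
```

A real HCM would additionally capture multi-factor billing (e.g., per-request or storage charges) and time-varying workloads; this sketch shows only how the three views (dependencies, deployment, workload) compose into a single cost figure that a what-if query can re-evaluate.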
- Giovanni Grano, Christoph Laaber, Annibale Panichella, and Sebastiano Panichella. 2019. Testing with Fewer Resources: An Adaptive Approach to Performance-Aware Test Case Generation. IEEE Transactions on Software Engineering. [Preprint]
- Christoph Laaber. 2019. Continuous Software Performance Assessment: Detecting Performance Problems of Software Libraries on Every Build. 28th ACM SIGSOFT International Symposium on Software Testing and Analysis (ISSTA ’19 - DocSym). [Preprint (PDF, 495 KB)]
- Christoph Laaber, Joel Scheuner, and Philipp Leitner. 2019. Software Microbenchmarking in the Cloud: How Bad Is It Really? Empirical Software Engineering - An International Journal. [Preprint (PDF, 1 MB)]
- Christoph Laaber and Philipp Leitner. 2018. An Evaluation of Open-Source Software Microbenchmark Suites for Continuous Performance Assessment. 15th International Conference on Mining Software Repositories (MSR'18). [Preprint (PDF, 1 MB)]
- Jürgen Cito, Gerald Schermann, Erik Wittern, Philipp Leitner, Sali Zumberi, and Harald C. Gall. 2017. An Empirical Analysis of the Docker Container Ecosystem on GitHub. 14th International Conference on Mining Software Repositories (MSR'17).
- Christian Davatz, Christian Inzinger, Joel Scheuner, and Philipp Leitner. 2017. An Approach and Case Study of Cloud Instance Type Selection for Multi-Tier Web Applications. 17th IEEE/ACM International Symposium on Cluster, Cloud and Grid Computing (CCGrid).
- Christoph Laaber and Philipp Leitner. 2017. (h|g)opper: Performance History Mining and Analysis. 8th ACM/SPEC International Conference on Performance Engineering (ICPE - Demo Track). [Preprint (PDF, 492 KB)]
- Philipp Leitner and Cor-Paul Bezemer. 2017. An Exploratory Study of the State of Practice of Performance Testing in Java-Based Open Source Projects. 8th ACM/SPEC International Conference on Performance Engineering (ICPE). [Preprint]
- Philipp Leitner, Jürgen Cito, and Emanuel Stöckli. 2016. Modelling and Managing Deployment Costs of Microservice-Based Cloud Applications. 9th IEEE/ACM International Conference on Utility and Cloud Computing (UCC). [Preprint]