PerfCI: A Toolchain for Automated Performance Testing during Continuous Integration of Python Projects
| Main authors: | |
| --- | --- |
| Published: | 2020 |
| Online access: | https://dx.doi.org/10.1145/3324884.3415288 · http://cds.cern.ch/record/2799941 |
| Summary: | Software performance testing is an essential quality assurance mechanism that can identify optimization opportunities. Automating this process requires strong tool support, especially in the case of Continuous Integration (CI), where tests need to run completely automatically and it is desirable to provide developers with actionable feedback. A lack of existing tools means that performance testing is normally left out of the scope of CI. In this paper, we propose a toolchain, PerfCI, to pave the way for developers to easily set up and carry out automated performance testing under CI. Our toolchain is based on allowing users to (1) specify performance testing tasks, (2) analyze unit tests on a variety of Python projects ranging from scripts to full-blown Flask-based web services, by extending a performance analysis framework (VyPR), and (3) evaluate performance data to get feedback on the code. We demonstrate the feasibility of our toolchain by using it on a web service running at the Compact Muon Solenoid (CMS) experiment at the world's largest particle physics laboratory, CERN. Package: source code, examples, and documentation of PerfCI are available at https://gitlab.cern.ch/omjaved/perfci. A tool demonstration can be viewed on YouTube: https://youtu.be/RDmXMKA1v7g. We also provide the data set used in the analysis: https://gitlab.cern.ch/omjaved/perfci-dataset. |
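
The summary describes three steps: specifying performance testing tasks, measuring unit tests, and evaluating the resulting data to give CI feedback. As a rough illustration of that workflow only, and not of PerfCI's or VyPR's actual API (see the GitLab repository above for those), the following minimal Python sketch times a unit of code and fails a CI job when a budget is exceeded. The function names (`measure`, `parse_payload`), the 30-repetition count, and the 1 ms budget are all hypothetical choices for the example.

```python
# Minimal sketch of an automated performance check that could run in a CI job.
# This is NOT PerfCI's interface; it only illustrates the general idea of
# (1) specifying a performance task, (2) measuring a unit of code, and
# (3) evaluating the data to produce actionable feedback.
import json
import statistics
import time


def measure(func, repetitions=30):
    """Run `func` several times and return the wall-clock durations."""
    durations = []
    for _ in range(repetitions):
        start = time.perf_counter()
        func()
        durations.append(time.perf_counter() - start)
    return durations


def parse_payload():
    # Stand-in for a unit under test, e.g. a handler in a Flask-based service.
    json.loads('{"items": [1, 2, 3], "status": "ok"}')


if __name__ == "__main__":
    # (1) Task specification: the target to measure and the budget it must meet.
    task = {"target": parse_payload, "budget_seconds": 0.001}

    # (2) Measurement: collect timing data for the target.
    data = measure(task["target"])

    # (3) Evaluation: summarize and fail the CI job if the budget is exceeded.
    median = statistics.median(data)
    print(f"median duration: {median:.6f}s (budget {task['budget_seconds']}s)")
    if median > task["budget_seconds"]:
        raise SystemExit("performance budget exceeded")
```

In a CI pipeline, a script along these lines would run as a dedicated job so that a non-zero exit code marks the build as failed and surfaces the timing summary to the developer.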