posted on 2015-03-18, 17:36, authored by Andrés Omar Portillo-Domínguez, Miao Wang, John Murphy, Damien Magoni, Nick Mitchell, Peter F. Sweeney, Erik Altman
Performance testing in distributed environments is challenging. In particular, identifying performance issues and diagnosing their root causes are time-consuming and complex tasks that rely heavily on expertise. To simplify these tasks, many researchers have been developing tools with built-in expertise. However, these tools have limitations, such as difficulty managing huge volumes of distributed data, that prevent their efficient use for performance testing of highly distributed environments. To address these limitations, this paper presents an adaptive framework to automate the usage of expert systems in performance testing. Our validation assessed the accuracy of the framework and the time savings it brings to testers. The results demonstrated the benefits of the framework, which achieved a significant decrease in the time invested in performance analysis and testing.
Publication: JAMAICA 2014: Proceedings of the 2014 Workshop on Joining AcadeMiA and Industry Contributions to Test Automation and Model-Based Testing, pp. 22-27