Proceedings paper

Automated Benchmarking and Analysis Tool
T. Kalibera, J. Lehotsky, D. Majda, B. Repcek, M. Tomcanyi, A. Tomecek, P. Tůma, J. Urban
Proc. 1st International Conference on Performance Evaluation Methodologies and Tools (VALUETOOLS)

Benchmarking is an important performance evaluation technique that provides performance data representative of real systems. Such data can be used to verify the results of performance modeling and simulation, or to detect performance changes. Automated benchmarking is an increasingly popular approach to tracking performance changes during software development, giving developers timely feedback on their work. In contrast with the advances in modeling and simulation tools, the tools for automated benchmarking are usually implemented ad hoc for each project, wasting resources and limiting functionality. We present the result of project BEEN, a generic tool for automated benchmarking in a heterogeneous distributed environment. BEEN automates all steps of a benchmark experiment, from software building and deployment through measurement and load monitoring to the evaluation of results. Notable features include the separation of measurement from evaluation and the ability to adaptively scale the benchmark experiment based on the evaluation. BEEN has been designed to facilitate automated detection of performance changes during software development (regression benchmarking).

@inproceedings{Kalibera2006BEEN,
    title = {{Automated Benchmarking and Analysis Tool}},
    author = {Kalibera, Tomas and Lehotsky, Jakub and Majda, David and Repcek, Branislav and Tomcanyi, Michal and Tomecek, Antonin and Tuma, Petr and Urban, Jaroslav},
    year = {2006},
    booktitle = {{Proc. 1st International Conference on Performance Evaluation Methodologies and Tools (VALUETOOLS)}},
    publisher = {ACM},
    location = {New York, NY, USA},
    doi = {10.1145/1190095.1190101},
    isbn = {978-1-59593-504-5},
}