Dataset Open Access
Waller, Jan; Fittkau, Florian; Hasselbring, Wilhelm
Monitoring of a software system provides insights into its runtime behavior, improving system analysis and comprehension. System-level monitoring approaches focus on, e.g., network monitoring, providing information on externally visible system behavior. Application-level performance monitoring frameworks, such as Kieker or Dapper, make it possible to observe the internal application behavior, but they introduce runtime overhead that depends on the number of instrumentation probes.
We report on how we were able to significantly reduce the runtime overhead of the Kieker monitoring framework. To achieve this optimization, we employed micro-benchmarks within a structured performance engineering approach. During optimization, we kept track of the impact on the maintainability of the framework. In this paper, we discuss the trade-off between performance and maintainability that emerged in this context.
To the best of our knowledge, publications on monitoring frameworks provide no or only weak performance evaluations, making comparisons cumbersome. Our micro-benchmark, presented in this paper, provides a basis for such comparisons. Our experiment code and data are available as open source software, so that interested researchers may repeat or extend our experiments for comparison on other hardware platforms or with other monitoring frameworks.
This dataset supplements the paper and contains the raw experimental data as well as several generated diagrams for each experiment.
| File | MD5 | Size |
|---|---|---|
| PT2014-data.zip.001 | 9281b0f6e1fcd24dad5fc7cc88a10899 | 1.1 GB |
| PT2014-data.zip.002 | 7b6892b4bb11984274a791e0ac165f5d | 1.1 GB |
| PT2014-data.zip.003 | ea6d687de7836ba63376b956dabdad71 | 1.1 GB |
| PT2014-data.zip.004 | b059ae309941e130b554c4c491fce51d | 1.1 GB |
| PT2014-data.zip.005 | 161d9bca14613a2a39bb75fea49c412d | 1.1 GB |
| PT2014-data.zip.006 | d01c5f2c30a8780dd6b77f1b15d85da6 | 368.6 MB |