Tue 10 Nov 2020 17:35 - 17:36 at Virtual room 1 - Performance / QoS

Executing software microbenchmarks, a form of small-scale performance tests predominantly used for libraries and frameworks, is a costly endeavor.
Full benchmark suites take up to multiple hours or days to execute, rendering frequent checks, e.g., as part of continuous integration (CI), infeasible.
However, altering benchmark configurations to reduce execution time without considering the impact on result quality can lead to benchmark results that are not representative of the software's true performance.
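To make the cost concrete, the back-of-the-envelope calculation below uses JMH's defaults in recent versions (5 forks, 5 warmup and 5 measurement iterations per fork, 10 seconds per iteration); the suite size of 100 benchmarks is purely illustrative:

```java
public class SuiteDuration {
    public static void main(String[] args) {
        // JMH defaults (recent versions): 5 forks, 5 warmup + 5 measurement
        // iterations per fork, 10 seconds per iteration.
        int forks = 5, warmup = 5, measurement = 5, secondsPerIteration = 10;
        int secondsPerBenchmark = forks * (warmup + measurement) * secondsPerIteration;
        System.out.println("per benchmark: " + secondsPerBenchmark + " s"); // 500 s

        // A hypothetical suite of 100 benchmarks (illustrative size only):
        int benchmarks = 100;
        double hours = benchmarks * secondsPerBenchmark / 3600.0;
        System.out.println("suite: " + hours + " h"); // roughly 13.9 h
    }
}
```

Even a modest suite thus runs for well over half a day under default settings, which is why naively shortening the configuration is tempting but risky for result quality.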

We propose the first technique to dynamically stop software microbenchmark executions when their results are sufficiently stable.
Our approach implements three statistical stoppage criteria and is capable of reducing Java Microbenchmark Harness (JMH) suite execution times by 48.4% to 86.0%.
At the same time, it retains the same result quality for 78.8% to 87.6% of the benchmarks compared with executing the suite for the default duration.

The proposed approach does not require developers to manually craft custom benchmark configurations;
instead, it provides automated mechanisms for dynamic reconfiguration.
This makes dynamic reconfiguration highly effective and efficient, potentially paving the way for the inclusion of JMH microbenchmarks in CI.
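The core idea of dynamic stopping can be sketched with one common stability measure, the coefficient of variation (standard deviation divided by mean) over a sliding window of measurements. The window size, threshold, and simulated measurements below are illustrative assumptions, not the paper's exact criteria or parameters:

```java
import java.util.ArrayList;
import java.util.List;

public class DynamicStopping {
    // Returns true once the coefficient of variation (stddev / mean) of the
    // last `window` measurements falls below `threshold` (e.g. 1%).
    static boolean isStable(List<Double> samples, int window, double threshold) {
        if (samples.size() < window) return false;
        List<Double> recent = samples.subList(samples.size() - window, samples.size());
        double mean = recent.stream().mapToDouble(Double::doubleValue).average().orElse(0.0);
        double variance = recent.stream()
                .mapToDouble(x -> (x - mean) * (x - mean))
                .sum() / (window - 1);
        double cv = Math.sqrt(variance) / mean;
        return cv < threshold;
    }

    public static void main(String[] args) {
        // Simulated iteration measurements (ns/op): noisy at first, then converging.
        double[] measured = {120, 95, 105, 100.5, 99.8, 100.2,
                             100.1, 99.9, 100.0, 100.1, 99.9, 100.0};
        List<Double> samples = new ArrayList<>();
        int iterations = 0;
        for (double m : measured) {
            samples.add(m);
            iterations++;
            if (isStable(samples, 5, 0.01)) break; // stop early once stable
        }
        System.out.println("stopped after " + iterations
                + " of " + measured.length + " iterations");
    }
}
```

With these illustrative numbers, the run stops after 8 of 12 iterations; the remaining measurement time is saved without discarding the stabilized result.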

Conference Day
Tue 10 Nov

Displayed time zone: (UTC) Coordinated Universal Time

17:30 - 18:00
Automatically Identifying Performance Issue Reports with Heuristic Linguistic Patterns
Research Papers
Yutong Zhao (Stevens Institute of Technology, USA), Lu Xiao (Stevens Institute of Technology, USA), Pouria Babvey (Stevens Institute of Technology, USA), Lei Sun (Stevens Institute of Technology, USA), Sunny Wong (Analytical Graphics, USA), Angel A. Martinez (Analytical Graphics, USA), Xiao Wang (Stevens Institute of Technology, USA)
Calm Energy Accounting for Multithreaded Java Applications
Research Papers
Timur Babakol (SUNY Binghamton, USA), Anthony Canino (University of Pennsylvania, USA), Khaled Mahmoud (SUNY Binghamton, USA), Rachit Saxena (SUNY Binghamton, USA), Yu David Liu (SUNY Binghamton, USA)
Dynamically Reconfiguring Software Microbenchmarks: Reducing Execution Time without Sacrificing Result Quality
Research Papers
Christoph Laaber (University of Zurich, Switzerland), Stefan Würsten (University of Zurich, Switzerland), Harald Gall (University of Zurich, Switzerland), Philipp Leitner (Chalmers University of Technology, Sweden / University of Gothenburg, Sweden)
DOI Pre-print Media Attached
Investigating types and survivability of performance bugs in mobile apps
Journal First
Alejandro Mazuera-Rozo (Università della Svizzera italiana & Universidad de los Andes), Catia Trubiani (Gran Sasso Science Institute), Mario Linares-Vásquez (Universidad de los Andes), Gabriele Bavota (USI Lugano, Switzerland)
Testing Self-Adaptive Software with Probabilistic Guarantees on Performance Metrics
ACM SIGSOFT Distinguished Paper Award
Research Papers
Claudio Mandrioli (Lund University, Sweden), Martina Maggio (Saarland University, Germany / Lund University, Sweden)
DOI Pre-print
Conversations on Performance / QoS
Paper Presentations
Alejandro Mazuera-Rozo (Università della Svizzera italiana & Universidad de los Andes), Christoph Laaber (University of Zurich, Switzerland), Claudio Mandrioli (Lund University, Sweden), Timur Babakol (SUNY Binghamton, USA), M: Mei Nagappan (University of Waterloo)