Loop invariant generation has long been a challenging problem. Black-box learning has recently emerged as a promising method for inferring loop invariants. However, its performance depends heavily on the quality of the collected examples. In many cases, a feasible invariant can be inferred only after tens or even hundreds of constraint queries.
To reduce the large number of constraint queries and improve the performance of black-box learning, we introduce interval counterexamples into the learning framework. Each interval counterexample represents a set of counterexamples returned by constraint solvers. We propose three different generalization techniques for computing interval counterexamples, and we extend the existing decision tree algorithm to accommodate them. We evaluate our techniques and report over $40\%$ improvement in learning rounds and verification time over the state-of-the-art approach.
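The core idea can be illustrated with a minimal sketch (not the paper's implementation; the candidate shape `x <= c`, the helper names, and the pruning rules are all assumptions made for illustration): a single counterexample eliminates only the candidates it directly refutes, whereas an interval counterexample, which generalizes one failing point into a whole range of counterexamples, can prune many candidates in a single learning round.

```python
# Illustrative sketch only: candidates are thresholds c encoding the
# invariant template "x <= c". This toy setup is an assumption for
# demonstration, not the technique as defined in the paper.

def violates(candidate, point):
    """A positive example `point` refutes the candidate x <= candidate
    whenever point > candidate."""
    return point > candidate

def prune_with_point(candidates, point):
    # Classic black-box learning: one solver query yields one
    # counterexample point, which removes the candidates it refutes.
    return [c for c in candidates if not violates(c, point)]

def prune_with_interval(candidates, lo, hi):
    # Interval counterexample [lo, hi]: every point in the interval is
    # a counterexample, so a candidate survives only if it admits the
    # entire interval, i.e. hi <= c.
    return [c for c in candidates if hi <= c]

candidates = list(range(10))                  # candidates: x <= 0 .. x <= 9
after_point = prune_with_point(candidates, 3)            # single example x = 3
after_interval = prune_with_interval(candidates, 3, 7)   # interval [3, 7]
```

Here the single point `x = 3` leaves seven candidates alive, while the interval `[3, 7]` leaves only three, showing how one generalized counterexample can replace several rounds of point queries.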
Thu 12 Nov. Displayed time zone: (UTC) Coordinated Universal Time.
08:30 - 09:00
|Domain-Independent Interprocedural Program Analysis using Block-Abstraction Memoization|
Research Papers
|Inherent Vacuity for GR(1) Specifications|
|Interval Counterexamples for Loop Invariant Learning|
|Modular Collaborative Program Analysis in OPAL|
Dominik Helm (TU Darmstadt, Germany), Florian Kübler (TU Darmstadt, Germany), Michael Reif (TU Darmstadt, Germany), Michael Eichberg (TU Darmstadt, Germany), Mira Mezini (TU Darmstadt, Germany)
|Past-Sensitive Pointer Analysis for Symbolic Execution|
David Trabish (Tel Aviv University, Israel), Timotej Kapus (Imperial College London, UK), Noam Rinetzky (Tel Aviv University), Cristian Cadar (Imperial College London, UK)
|Conversations on Analysis 4|