Conditional Testing

Off-the-Shelf Combination of Test-Case Generators

Dirk Beyer and Thomas Lemberger



There are several powerful automatic testers available, each with different strengths and weaknesses. To benefit from the strengths of several tools at once, we need to investigate quick and easy ways to combine techniques. Until now, research has mostly investigated integrated combinations, which require extra implementation effort.

We propose the concept of conditional testing and a set of combination techniques that do not require implementation effort: different testers can be taken ‘off the shelf’ and combined so that they cooperatively solve the problem of test-case generation for a given input program and coverage criterion. This way, the latest advances in test-case generation can be combined without delay.

Conditional testing passes the test goals that a first tester has covered to the next tester, so that the next tester does not need to repeat work (as in combinations without information passing) but can focus on the remaining test goals. Our combinations do not require changes to the implementation of a tester, because we leverage a testability transformation (i.e., we reduce the input program to those parts that are relevant to the remaining test goals).
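The pruning idea behind this testability transformation can be sketched as follows. This is a minimal illustration, not CondTest's actual implementation: the goal-marker syntax `TEST_GOAL(n);` and the helper name are assumptions made for the example.

```python
import re

def prune_covered_goals(program_lines, covered_goals):
    """Reduce a program to the parts relevant to the remaining test goals:
    every goal marker that a previous tester already covered is neutralized,
    so the next tester does not repeat work on it.

    Goal markers are assumed (for illustration) to look like 'TEST_GOAL(7);'.
    """
    goal_pattern = re.compile(r"TEST_GOAL\((\d+)\);")
    pruned = []
    for line in program_lines:
        match = goal_pattern.search(line)
        if match and int(match.group(1)) in covered_goals:
            # Covered goal: remove the marker, keep the surrounding code.
            pruned.append(goal_pattern.sub("/* covered */", line))
        else:
            pruned.append(line)
    return pruned

program = [
    "if (x > 0) { TEST_GOAL(1); }",
    "else { TEST_GOAL(2); }",
]
# Suppose a first tester already produced a test covering goal 1;
# the reduced program only exposes goal 2 to the next tester.
reduced = prune_covered_goals(program, covered_goals={1})
```

Because the reduction happens on the input program rather than inside a tester, any off-the-shelf tester can consume the reduced program unchanged.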

To evaluate conditional testing and our proposed combination techniques, we (1) implemented the generic conditional tester CondTest, including the required transformations, and (2) ran experiments on a large number of benchmark tasks; the obtained results are promising.


The replication artifact contains all experimental data and the tools required to reproduce it. See the README within the artifact for further information.

Experimental Data


The experimental data was produced on machines with:

Each experiment run was limited to:

We used all 1720 tasks of the Cover-Branches category of Test-Comp, available in the sv-benchmarks repository in revision testcomp19. For conditional software testing, we used CondTest in version 2.0; for witness-to-test (used in vb), we used CPAchecker in revision bebb23; and for test-case generation, we used all tools in their respective Test-Comp versions. To run the experiments in a reproducible manner, we used benchexec in version 1.20.


Configuration                              | Table | Raw results (generation) | Raw results (coverage measurement)
-------------------------------------------|-------|--------------------------|-----------------------------------
CoVeriTest (Test-Comp)                     | HTML  | XML                      | XML
CPA-tiger (Test-Comp)                      | HTML  | XML                      | XML
Klee (Test-Comp)                           | HTML  | XML                      | XML
CoVeriTest (CondTest)                      | HTML  | XML                      | XML
CPA-tiger (CondTest)                       | HTML  | XML                      | XML
Klee (CondTest)                            | HTML  | XML                      | XML
Testers without information exchange (id)  | HTML  | XML                      | XML
Testers with information exchange (prune)  | HTML  | XML                      | XML
Testers + ESBMC (vb)                       | HTML  | XML                      | XML