
Publications about Software Testing

Articles in journals or book chapters

  1. Dirk Beyer. First International Competition on Software Testing. International Journal on Software Tools for Technology Transfer (STTT), 23(6):833-846, 2021. doi:10.1007/s10009-021-00613-3. Keyword(s): Competition on Software Testing (Test-Comp), Competition on Software Testing (Test-Comp Report), Software Testing. Funding: DFG-COOP. Publisher's Version, PDF, Supplement.
    BibTeX Entry
    @article{TestComp19-STTT,
      author = {Dirk Beyer},
      title = {First International Competition on Software Testing},
      journal = {International Journal on Software Tools for Technology Transfer (STTT)},
      volume = {23},
      number = {6},
      pages = {833-846},
      year = {2021},
      doi = {10.1007/s10009-021-00613-3},
      sha256 = {cd82a853fbbf65de7f95a9e7de4f36118bb35fb516db87421a0aa38ccc863031},
      url = {https://www.sosy-lab.org/research/pub/2021-STTT.First_International_Competition_on_Software_Testing.pdf},
      pdf = {},
      presentation = {},
      abstract = {},
      keyword = {Competition on Software Testing (Test-Comp),Competition on Software Testing (Test-Comp Report),Software Testing},
      funding = {DFG-COOP},
      issn = {1433-2787},
    }
  2. Dirk Beyer and Marie-Christine Jakobs. Cooperative Verifier-Based Testing with CoVeriTest. International Journal on Software Tools for Technology Transfer (STTT), 23(3):313-333, 2021. doi:10.1007/s10009-020-00587-8. Keyword(s): CPAchecker, Software Model Checking, Software Testing. Funding: DFG-COOP. Publisher's Version, PDF.
    Abstract
    Testing is a widely applied technique to evaluate software quality, and coverage criteria are often used to assess the adequacy of a generated test suite. However, manually constructing an adequate test suite is typically too expensive, and numerous techniques for automatic test-suite generation were proposed. All of them come with different strengths. To build stronger test-generation tools, different techniques should be combined. In this paper, we study cooperative combinations of verification approaches for test generation, which exchange high-level information. We present CoVeriTest, a hybrid technique for test-suite generation. CoVeriTest iteratively applies different conditional model checkers and allows users to adjust the level of cooperation and to configure individual time limits for each conditional model checker. In our experiments, we systematically study different CoVeriTest cooperation setups, which either use combinations of explicit-state model checking and predicate abstraction, or bounded model checking and symbolic execution. A comparison with state-of-the-art test-generation tools reveals that CoVeriTest achieves higher coverage for many programs (about 15%).
    BibTeX Entry
    @article{CoVeriTest-STTT,
      author = {Dirk Beyer and Marie-Christine Jakobs},
      title = {Cooperative Verifier-Based Testing with {CoVeriTest}},
      journal = {International Journal on Software Tools for Technology Transfer (STTT)},
      volume = {23},
      number = {3},
      pages = {313-333},
      year = {2021},
      doi = {10.1007/s10009-020-00587-8},
      sha256 = {28a5bf6103296455728076e8c12902a53b3d377a296ea2ba18ac111c93330dbd},
      url = {},
      pdf = {},
      presentation = {},
      abstract = {Testing is a widely applied technique to evaluate software quality, and coverage criteria are often used to assess the adequacy of a generated test suite. However, manually constructing an adequate test suite is typically too expensive, and numerous techniques for automatic test-suite generation were proposed. All of them come with different strengths. To build stronger test-generation tools, different techniques should be combined. In this paper, we study cooperative combinations of verification approaches for test generation, which exchange high-level information. We present CoVeriTest, a hybrid technique for test-suite generation. CoVeriTest iteratively applies different conditional model checkers and allows users to adjust the level of cooperation and to configure individual time limits for each conditional model checker. In our experiments, we systematically study different CoVeriTest cooperation setups, which either use combinations of explicit-state model checking and predicate abstraction, or bounded model checking and symbolic execution. A comparison with state-of-the-art test-generation tools reveals that CoVeriTest achieves higher coverage for many programs (about 15%).},
      keyword = {CPAchecker,Software Model Checking,Software Testing},
      funding = {DFG-COOP},
      issn = {1433-2787},
    }
  3. Thomas Lemberger. Plain random test generation with PRTest. International Journal on Software Tools for Technology Transfer (STTT), 2020. Springer. doi:10.1007/s10009-020-00568-x. Keyword(s): Software Testing. Publisher's Version, PDF, Presentation.
    Abstract
    Automatic test-suite generation tools are often complex and their behavior is not predictable. To provide a minimum baseline that test-suite generators should be able to surpass, we present PRTest, a random black-box test-suite generator for C programs: To create a test, PRTest natively executes the program under test and creates a new, random test value whenever an input value is required. After execution, PRTest checks whether any new program branches were covered and, if this is the case, the created test is added to the test suite. This way, tests are rapidly created either until a crash is found, or until the user aborts the creation. While this naive mechanism is not competitive with more sophisticated, state-of-the-art test-suite generation tools, it is able to provide a good baseline for Test-Comp and a fast alternative for automatic test-suite generation for programs with simple control flow. PRTest is publicly available and open source.
    BibTeX Entry
    @article{PRTEST19,
      author = {Thomas Lemberger},
      title = {Plain random test generation with {PRTest}},
      journal = {International Journal on Software Tools for Technology Transfer (STTT)},
      volume = {},
      number = {},
      pages = {},
      year = {2020},
      publisher = {Springer},
      doi = {10.1007/s10009-020-00568-x},
      sha256 = {2e5ae7091b6adb758c123dfe62d3fab57203f930883539beb20f6e91391ebc77},
      pdf = {https://www.sosy-lab.org/research/pub/2020-STTT.Plain_random_test_generation_with_PRTest.pdf},
      presentation = {https://www.sosy-lab.org/research/prs/2019-04-06_TestComp19_PRTest_Thomas.pdf},
      abstract = {Automatic test-suite generation tools are often complex and their behavior is not predictable. To provide a minimum baseline that test-suite generators should be able to surpass, we present PRTest, a random black-box test-suite generator for C programs: To create a test, PRTest natively executes the program under test and creates a new, random test value whenever an input value is required. After execution, PRTest checks whether any new program branches were covered and, if this is the case, the created test is added to the test suite. This way, tests are rapidly created either until a crash is found, or until the user aborts the creation. While this naive mechanism is not competitive with more sophisticated, state-of-the-art test-suite generation tools, it is able to provide a good baseline for Test-Comp and a fast alternative for automatic test-suite generation for programs with simple control flow. PRTest is publicly available and open source.},
      keyword = {Software Testing},
      annote = {Publication appeared first online in July 2020.<BR/> PRTest is available at <a href="https://gitlab.com/sosy-lab/software/prtest"> https://gitlab.com/sosy-lab/software/prtest</a>},
    }
    Additional Infos
    Publication appeared first online in July 2020.
    PRTest is available at https://gitlab.com/sosy-lab/software/prtest
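
The CoVeriTest article above (entry 2) describes a cooperation scheme in which different conditional model checkers take turns at test generation, each with its own time budget. The following Python sketch mirrors only that round-robin structure, under stated assumptions: the two analysis functions are hypothetical placeholders (not the CPAchecker-based analyses the tool uses), and the exchanged information is reduced to the set of still-open coverage goals.

    # Hedged sketch of a CoVeriTest-style cooperation loop (illustration only).
    # The analyses below are placeholders; a real setup would run conditional
    # model checkers within the given time budgets.

    def value_analysis(open_goals, budget_s):
        # budget_s: time budget in seconds; ignored by this placeholder
        covered = {g for g in open_goals if g % 2 == 0}
        return covered, [f"test-for-goal-{g}" for g in covered]

    def predicate_analysis(open_goals, budget_s):
        covered = {g for g in open_goals if g % 3 == 0}
        return covered, [f"test-for-goal-{g}" for g in covered]

    def cooperative_test_generation(goals, rounds=5, budgets=(10.0, 20.0)):
        """Rotate between the analyses until all goals are covered or the
        round limit is reached; collect all generated tests in one suite."""
        analyses = (value_analysis, predicate_analysis)
        open_goals, suite = set(goals), []
        for _ in range(rounds):
            for analysis, budget_s in zip(analyses, budgets):
                if not open_goals:
                    return suite
                covered, tests = analysis(open_goals, budget_s)  # hand over open goals
                open_goals -= covered
                suite.extend(tests)
        return suite

    if __name__ == "__main__":
        print(cooperative_test_generation(range(1, 20)))

In the actual tool, the time budgets and the amount of information exchanged between the analyses are configuration options; the sketch reflects only the iteration scheme summarized in the abstract.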
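
Similarly, the plain random test-generation loop summarized in the PRTest abstract above (entry 3) can be pictured in a few lines of Python. This is a minimal sketch under stated assumptions: program_under_test is a hypothetical stand-in, whereas the real PRTest natively executes compiled C programs and observes their branch coverage.

    import random

    def program_under_test(next_input):
        """Hypothetical stand-in for the program under test: it pulls input
        values on demand and reports the branches it took and whether it crashed."""
        taken = set()
        x = next_input()
        if x % 2 == 0:
            taken.add("even-branch")
            crashed = x > 1_900_000      # simulated bug on large even inputs
        else:
            taken.add("odd-branch")
            crashed = False
        return taken, crashed

    def generate_random_suite(max_attempts=10_000):
        """Keep a randomly generated test only if it covers a new branch;
        stop once a crash is found or the attempt budget is exhausted."""
        covered, suite = set(), []
        for _ in range(max_attempts):
            vector = []
            def next_input():
                value = random.randint(0, 2_000_000)
                vector.append(value)     # record the test vector as it is drawn
                return value
            taken, crashed = program_under_test(next_input)
            if crashed or (taken - covered):  # crash or new coverage: keep the test
                covered |= taken
                suite.append(vector)
            if crashed:
                break
        return suite

    if __name__ == "__main__":
        print(generate_random_suite())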

Articles in conference or workshop proceedings

  1. Dirk Beyer. Software Testing: 5th Comparative Evaluation: Test-Comp 2023. In L. Lambers and S. Uchitel, editors, Proceedings of the 26th International Conference on Fundamental Approaches to Software Engineering (FASE 2023, Paris, France, April 22-27), LNCS 13991, pages 309-323, 2023. Springer. doi:10.1007/978-3-031-30826-0_17. Keyword(s): Competition on Software Testing (Test-Comp), Competition on Software Testing (Test-Comp Report), Software Testing. Funding: DFG-COOP. Publisher's Version, PDF, Supplement.
    BibTeX Entry
    @inproceedings{FASE23,
      author = {Dirk Beyer},
      title = {Software Testing: 5th Comparative Evaluation: {Test-Comp 2023}},
      booktitle = {Proceedings of the 26th International Conference on Fundamental Approaches to Software Engineering (FASE~2023, Paris, France, April 22-27)},
      editor = {L. Lambers and S. Uchitel},
      pages = {309--323},
      year = {2023},
      series = {LNCS~13991},
      publisher = {Springer},
      isbn = {},
      doi = {10.1007/978-3-031-30826-0_17},
      sha256 = {7110c26bf3c9311f84346a108a59318687bdadde4879f83d047f1a0fc546b630},
      url = {https://test-comp.sosy-lab.org/2023/},
      keyword = {Competition on Software Testing (Test-Comp),Competition on Software Testing (Test-Comp Report),Software Testing},
      _pdf = {https://www.sosy-lab.org/research/pub/2023-FASE.Software_Testing_5th_Comparative_Evaluation_Test-Comp_2023.pdf},
      funding = {DFG-COOP},
    }
  2. Dirk Beyer. Advances in Automatic Software Testing: Test-Comp 2022. In E. B. Johnsen and M. Wimmer, editors, Proceedings of the 25th International Conference on Fundamental Approaches to Software Engineering (FASE 2022, Munich, Germany, April 2-7), LNCS 13241, pages 321-335, 2022. Springer. doi:10.1007/978-3-030-99429-7_18. Keyword(s): Competition on Software Testing (Test-Comp), Competition on Software Testing (Test-Comp Report), Software Testing. Funding: DFG-COOP. Publisher's Version, PDF, Supplement.
    BibTeX Entry
    @inproceedings{FASE22b,
      author = {Dirk Beyer},
      title = {Advances in Automatic Software Testing: {Test-Comp 2022}},
      booktitle = {Proceedings of the 25th International Conference on Fundamental Approaches to Software Engineering (FASE~2022, Munich, Germany, April 2-7)},
      editor = {E.~B.~Johnsen and M.~Wimmer},
      pages = {321-335},
      year = {2022},
      series = {LNCS~13241},
      publisher = {Springer},
      isbn = {},
      doi = {10.1007/978-3-030-99429-7_18},
      sha256 = {3f921c8f232a5c970f678889de8c402313049522a5dfa69ca68cd01d9dd9fce3},
      url = {https://test-comp.sosy-lab.org/2022/},
      abstract = {},
      keyword = {Competition on Software Testing (Test-Comp),Competition on Software Testing (Test-Comp Report),Software Testing},
      _pdf = {https://www.sosy-lab.org/research/pub/2022-FASE.Advances_in_Automatic_Software_Testing_Test-Comp_2022.pdf},
      funding = {DFG-COOP},
    }
  3. Dirk Beyer. Status Report on Software Testing: Test-Comp 2021. In E. Guerra and M. Stoelinga, editors, Proceedings of the 24th International Conference on Fundamental Approaches to Software Engineering (FASE 2021, Luxembourg, Luxembourg, March 27 - April 1), LNCS 12649, pages 341-357, 2021. Springer. doi:10.1007/978-3-030-71500-7_17. Keyword(s): Competition on Software Testing (Test-Comp), Competition on Software Testing (Test-Comp Report), Software Testing. Funding: DFG-COOP. Publisher's Version, PDF, Supplement.
    Abstract
    This report describes Test-Comp 2021, the 3rd edition of the Competition on Software Testing. The competition is a series of annual comparative evaluations of fully automatic software test generators for C programs. The competition has a strong focus on reproducibility of its results and its main goal is to provide an overview of the current state of the art in the area of automatic test-generation. The competition was based on 3 173 test-generation tasks for C programs. Each test-generation task consisted of a program and a test specification (error coverage, branch coverage). Test-Comp 2021 had 11 participating test generators from 6 countries.
    BibTeX Entry
    @inproceedings{FASE21,
      author = {Dirk Beyer},
      title = {Status Report on Software Testing: {Test-Comp 2021}},
      booktitle = {Proceedings of the 24th International Conference on Fundamental Approaches to Software Engineering (FASE~2021, Luxembourg, Luxembourg, March 27 - April 1)},
      editor = {E.~Guerra and M.~Stoelinga},
      pages = {341-357},
      year = {2021},
      series = {LNCS~12649},
      publisher = {Springer},
      isbn = {978-3-030-71500-7},
      doi = {10.1007/978-3-030-71500-7_17},
      sha256 = {113b44c5be9f6d773ebd1a5cad91e8dc66f06d7af0b8c648c9dcea8d6bbc7e3d},
      url = {https://test-comp.sosy-lab.org/2021/},
      abstract = {This report describes Test-Comp 2021, the 3rd edition of the Competition on Software Testing. The competition is a series of annual comparative evaluations of fully automatic software test generators for C programs. The competition has a strong focus on reproducibility of its results and its main goal is to provide an overview of the current state of the art in the area of automatic test-generation. The competition was based on 3 173 test-generation tasks for C programs. Each test-generation task consisted of a program and a test specification (error coverage, branch coverage). Test-Comp 2021 had 11 participating test generators from 6 countries.},
      keyword = {Competition on Software Testing (Test-Comp),Competition on Software Testing (Test-Comp Report),Software Testing},
      funding = {DFG-COOP},
    }
  4. Dirk Beyer. Second Competition on Software Testing: Test-Comp 2020. In Proceedings of the 23rd International Conference on Fundamental Approaches to Software Engineering (FASE 2020, Dublin, Ireland, April 25-30), LNCS 12076, pages 505-519, 2020. Springer. doi:10.1007/978-3-030-45234-6_25. Keyword(s): Competition on Software Testing (Test-Comp), Competition on Software Testing (Test-Comp Report), Software Testing. Publisher's Version, PDF, Supplement.
    Artifact(s): 10.5281/zenodo.3678250, 10.5281/zenodo.3678264, 10.5281/zenodo.3678275, 10.5281/zenodo.3574420
    BibTeX Entry
    @inproceedings{FASE20,
      author = {Dirk Beyer},
      title = {Second Competition on Software Testing: {Test-Comp 2020}},
      booktitle = {Proceedings of the 23rd International Conference on Fundamental Approaches to Software Engineering (FASE~2020, Dublin, Ireland, April 25-30)},
      pages = {505-519},
      year = {2020},
      series = {LNCS~12076},
      publisher = {Springer},
      doi = {10.1007/978-3-030-45234-6_25},
      sha256 = {296b4caf885ae029e388c2ef8fd032f1ab55c07d5e8ea1064f2e50c08f5d6919},
      url = {https://test-comp.sosy-lab.org/2020/},
      abstract = {},
      keyword = {Competition on Software Testing (Test-Comp),Competition on Software Testing (Test-Comp Report),Software Testing},
      artifact1 = {10.5281/zenodo.3678250},
      artifact2 = {10.5281/zenodo.3678264},
      artifact3 = {10.5281/zenodo.3678275},
      artifact4 = {10.5281/zenodo.3574420},
    }
  5. Dirk Beyer and Marie-Christine Jakobs. Cooperative Test-Case Generation with Verifiers. In M. Felderer, W. Hasselbring, R. Rabiser, and R. Jung, editors, Proceedings of the Conference on Software Engineering (SE 2020, Innsbruck, Austria, February 24-28), LNI P-300, pages 107-108, 2020. GI. doi:10.18420/SE2020_31. Keyword(s): CPAchecker, Software Model Checking, Software Testing. Publisher's Version.
    BibTeX Entry
    @inproceedings{SE20,
      author = {Dirk Beyer and Marie-Christine Jakobs},
      title = {Cooperative Test-Case Generation with Verifiers},
      booktitle = {Proceedings of the Conference on Software Engineering (SE~2020, Innsbruck, Austria, February 24-28)},
      editor = {M.~Felderer and W.~Hasselbring and R.~Rabiser and R.~Jung},
      pages = {107--108},
      year = {2020},
      series = {{LNI}~P-300},
      publisher = {{GI}},
      doi = {10.18420/SE2020_31},
      sha256 = {},
      pdf = {},
      presentation = {},
      abstract = {},
      keyword = {CPAchecker,Software Model Checking,Software Testing},
      annote = {This is a summary of a <a href="https://www.sosy-lab.org/research/bib/Year/2019.html#FASE19">full article on this topic</a> that appeared in Proc. FASE 2019.},
      isbnnote = {978-3-88579-694-7},
    }
    Additional Infos
    This is a summary of a full article on this topic that appeared in Proc. FASE 2019.
  6. Dirk Beyer and Thomas Lemberger. TestCov: Robust Test-Suite Execution and Coverage Measurement. In Proceedings of the 34th IEEE/ACM International Conference on Automated Software Engineering (ASE 2019, San Diego, CA, USA, November 11-15), pages 1074-1077, 2019. IEEE. doi:10.1109/ASE.2019.00105. Keyword(s): Software Testing. Funding: DFG-COOP. Publisher's Version, PDF, Presentation.
    BibTeX Entry
    @inproceedings{ASE19,
      author = {Dirk Beyer and Thomas Lemberger},
      title = {{T}est{C}ov: Robust Test-Suite Execution and Coverage Measurement},
      booktitle = {Proceedings of the 34th IEEE/ACM International Conference on Automated Software Engineering (ASE 2019, San Diego, CA, USA, November 11-15)},
      pages = {1074-1077},
      year = {2019},
      publisher = {IEEE},
      doi = {10.1109/ASE.2019.00105},
      sha256 = {},
      pdf = {https://www.sosy-lab.org/research/pub/2019-ASE.TestCov_Robust_Test-Suite_Execution_and_Coverage_Measurement.pdf},
      presentation = {https://www.sosy-lab.org/research/prs/2019-11-12_ASE19_TestCov_Thomas_Lemberger.pdf},
      keyword = {Software Testing},
      funding = {DFG-COOP},
      isbnnote = {978-1-7281-2508-4},
    }
  7. Dirk Beyer and Thomas Lemberger. Conditional Testing - Off-the-Shelf Combination of Test-Case Generators. In Yu-Fang Chen, Chih-Hong Cheng, and Javier Esparza, editors, Proceedings of the 17th International Symposium on Automated Technology for Verification and Analysis (ATVA 2019, Taipei, Taiwan, October 28-31), LNCS 11781, pages 189-208, 2019. Springer. doi:10.1007/978-3-030-31784-3_11. Keyword(s): Software Testing. Funding: DFG-COOP. Publisher's Version, PDF, Presentation, Supplement.
    BibTeX Entry
    @inproceedings{ATVA19,
      author = {Dirk Beyer and Thomas Lemberger},
      title = {Conditional Testing - Off-the-Shelf Combination of Test-Case Generators},
      booktitle = {Proceedings of the 17th International Symposium on Automated Technology for Verification and Analysis (ATVA~2019, Taipei, Taiwan, October 28-31)},
      editor = {Yu{-}Fang Chen and Chih{-}Hong Cheng and Javier Esparza},
      pages = {189-208},
      year = {2019},
      series = {LNCS~11781},
      publisher = {Springer},
      doi = {10.1007/978-3-030-31784-3_11},
      sha256 = {},
      url = {https://www.sosy-lab.org/research/conditional-testing/},
      pdf = {https://www.sosy-lab.org/research/pub/2019-ATVA.Conditional_Testing_Off-the-Shelf_Combination_of_Test-Case_Generators.pdf},
      presentation = {https://www.sosy-lab.org/research/prs/2019-10-29_ATVA19_Conditional_Testing_Thomas_Lemberger.pdf},
      keyword = {Software Testing},
      funding = {DFG-COOP},
    }
  8. Dirk Beyer. International Competition on Software Testing (Test-Comp). In Proceedings of the 25th International Conference on Tools and Algorithms for the Construction and Analysis of Systems (TACAS 2019, Prague, Czech Republic, April 6-11), part 3, LNCS 11429, pages 167-175, 2019. Springer. doi:10.1007/978-3-030-17502-3_11. Keyword(s): Competition on Software Testing (Test-Comp), Competition on Software Testing (Test-Comp Report), Software Testing. Publisher's Version, PDF, Supplement.
    BibTeX Entry
    @inproceedings{TACAS19c,
      author = {Dirk Beyer},
      title = {International Competition on Software Testing (Test-Comp)},
      booktitle = {Proceedings of the 25th International Conference on Tools and Algorithms for the Construction and Analysis of Systems (TACAS~2019, Prague, Czech Republic, April 6-11), part 3},
      pages = {167-175},
      year = {2019},
      series = {LNCS~11429},
      publisher = {Springer},
      doi = {10.1007/978-3-030-17502-3_11},
      sha256 = {80ba1d656e40b44c40e756010ccd32db5aad71820cd746b264f70244477fc737},
      url = {https://test-comp.sosy-lab.org/2019/},
      keyword = {Competition on Software Testing (Test-Comp),Competition on Software Testing (Test-Comp Report),Software Testing},
    }
  9. Dirk Beyer and Marie-Christine Jakobs. CoVeriTest: Cooperative Verifier-Based Testing. In Proceedings of the 22nd International Conference on Fundamental Approaches to Software Engineering (FASE 2019, Prague, Czech Republic, April 6-11), LNCS 11424, pages 389-408, 2019. Springer. doi:10.1007/978-3-030-16722-6_23. Keyword(s): CPAchecker, Software Model Checking, Software Testing. Publisher's Version, PDF, Supplement.
    BibTeX Entry
    @inproceedings{FASE19,
      author = {Dirk Beyer and Marie-Christine Jakobs},
      title = {CoVeriTest: Cooperative Verifier-Based Testing},
      booktitle = {Proceedings of the 22nd International Conference on Fundamental Approaches to Software Engineering (FASE~2019, Prague, Czech Republic, April 6-11)},
      pages = {389-408},
      year = {2019},
      series = {LNCS~11424},
      publisher = {Springer},
      doi = {10.1007/978-3-030-16722-6_23},
      sha256 = {ee64749fba4796ed79cecfaa500731ef2ac5d5e795770c44b1e7ad358f955398},
      url = {https://www.sosy-lab.org/research/coop-testgen/},
      keyword = {CPAchecker,Software Model Checking,Software Testing},
    }
  10. Johannes Bürdek, Malte Lochau, Stefan Bauregger, Andreas Holzer, Alexander von Rhein, Sven Apel, and Dirk Beyer. Facilitating Reuse in Multi-Goal Test-Suite Generation for Software Product Lines. In A. Egyed and I. Schaefer, editors, Proceedings of the 18th International Conference on Fundamental Approaches to Software Engineering (FASE 2015, London, UK, April 13-15), LNCS 9033, pages 84-99, 2015. Springer-Verlag, Heidelberg. doi:10.1007/978-3-662-46675-9_6. Keyword(s): CPAchecker, Software Model Checking, Software Testing. Publisher's Version, PDF, Supplement.
    BibTeX Entry
    @inproceedings{FASE15,
      author = {Johannes B{\"u}rdek and Malte Lochau and Stefan Bauregger and Andreas Holzer and Alexander von Rhein and Sven Apel and Dirk Beyer},
      title = {Facilitating Reuse in Multi-Goal Test-Suite Generation for Software Product Lines},
      booktitle = {Proceedings of the 18th International Conference on Fundamental Approaches to Software Engineering (FASE~2015, London, UK, April 13-15)},
      editor = {A.~Egyed and I.~Schaefer},
      pages = {84-99},
      year = {2015},
      series = {LNCS~9033},
      publisher = {Springer-Verlag, Heidelberg},
      isbn = {978-3-662-46674-2},
      doi = {10.1007/978-3-662-46675-9_6},
      sha256 = {fcd4d2f3155e3e061318a444f578c41c5e224a7c76e1bf161fe55cc7ae01ae86},
      url = {http://forsyte.at/software/cpatiger/},
      keyword = {CPAchecker,Software Model Checking,Software Testing},
    }
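
The Test-Comp reports above evaluate automatic test generators on tasks that pair a C program with a test specification for either error coverage or branch coverage. As a rough illustration, the following Python snippet models such a task. The program path is a placeholder, and the two specification strings are assumed to follow the FQL-based property format used in recent Test-Comp editions; the authoritative files are those distributed via https://test-comp.sosy-lab.org/.

    from typing import NamedTuple

    # Illustrative model of a Test-Comp test-generation task: a C program plus
    # a coverage specification. The specification strings are an assumption
    # based on the property format of recent Test-Comp editions; consult the
    # competition website for the authoritative files.
    COVER_ERROR = "COVER( init(main()), FQL(COVER EDGES(@CALL(reach_error))) )"
    COVER_BRANCHES = "COVER( init(main()), FQL(COVER EDGES(@DECISIONEDGE)) )"

    class TestGenerationTask(NamedTuple):
        program: str        # path to the C program under test (placeholder)
        specification: str  # coverage property the generated test suite targets

    tasks = [
        TestGenerationTask("example/program.c", COVER_ERROR),
        TestGenerationTask("example/program.c", COVER_BRANCHES),
    ]

    for task in tasks:
        print(f"{task.program}: {task.specification}")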

Disclaimer:

This material is presented to ensure timely dissemination of scholarly and technical work. Copyright and all rights therein are retained by the authors or by other copyright holders. All persons copying this information are expected to adhere to the terms and constraints invoked by each author's copyright. In most cases, these works may not be reposted without the explicit permission of the copyright holder.

Last modified: Sat Apr 20 01:04:53 2024 UTC