Funding by DFG-IDEFIX

Articles in conference or workshop proceedings

  1. Dirk Beyer, Lars Grunske, Matthias Kettl, Marian Lingsch-Rosenfeld, and Moeketsi Raselimo. P3: A Dataset of Partial Program Patches. In Proc. MSR, 2024. ACM. doi:10.1145/3643991.3644889 Link to this entry Keyword(s): Partial Fix, Dataset, Mining Funding: DFG-IDEFIX Publisher's Version PDF Supplement
    Artifact(s)
    Abstract
    Identifying and fixing bugs in programs remains a challenge and is one of the most time-consuming tasks in software development. But even after a bug is identified, and a fix has been proposed by a developer or tool, it is not uncommon that the fix is incomplete and does not cover all possible inputs that trigger the bug. This can happen quite often and leads to re-opened issues and inefficiencies. In this paper, we introduce P3, a curated dataset composed of incomplete fixes. Each entry in the set contains a series of commits fixing the same underlying issue, where multiple of the intermediate commits are incomplete fixes. These are sourced from real-world open-source C projects. The selection process involves both automated and manual stages. Initially, we employ heuristics to identify potential partial fixes from repositories, subsequently we validate them through meticulous manual inspection. This process ensures the accuracy and reliability of our curated dataset. We envision that the dataset will support researchers while investigating partial fixes in more detail, allowing them to develop new techniques to detect and fix them.
    BibTeX Entry
    @inproceedings{MSR24,
      author    = {Dirk Beyer and Lars Grunske and Matthias Kettl and Marian Lingsch-Rosenfeld and Moeketsi Raselimo},
      title     = {P3: A Dataset of Partial Program Patches},
      booktitle = {Proc.\ MSR},
      pages     = {},
      year      = {2024},
      publisher = {ACM},
      doi       = {10.1145/3643991.3644889},
      url       = {https://gitlab.com/sosy-lab/research/data/partial-fix-dataset},
      pdf       = {},
      abstract  = {Identifying and fixing bugs in programs remains a challenge and is one of the most time-consuming tasks in software development. But even after a bug is identified, and a fix has been proposed by a developer or tool, it is not uncommon that the fix is incomplete and does not cover all possible inputs that trigger the bug. This can happen quite often and leads to re-opened issues and inefficiencies. In this paper, we introduce P3, a curated dataset composed of incomplete fixes. Each entry in the set contains a series of commits fixing the same underlying issue, where multiple of the intermediate commits are incomplete fixes. These are sourced from real-world open-source C projects. The selection process involves both automated and manual stages. Initially, we employ heuristics to identify potential partial fixes from repositories, subsequently we validate them through meticulous manual inspection. This process ensures the accuracy and reliability of our curated dataset. We envision that the dataset will support researchers while investigating partial fixes in more detail, allowing them to develop new techniques to detect and fix them.},
      keyword   = {Partial Fix, Dataset, Mining},
      annote    = {},
      artifact  = {10.5281/zenodo.10319627},
      funding   = {DFG-IDEFIX},
    }
  2. Paulína Ayaziová, Dirk Beyer, Marian Lingsch-Rosenfeld, Martin Spiessl, and Jan Strejček. Software Verification Witnesses 2.0. In Proc. SPIN, LNCS, 2024. Springer. Link to this entry Keyword(s): Software Model Checking, Cooperative Verification, Witness-Based Validation, CPAchecker Funding: DFG-CONVEY, DFG-IDEFIX PDF Presentation Supplement
    BibTeX Entry
    @inproceedings{SPIN24c,
      author       = {Paulína Ayaziová and Dirk Beyer and Marian Lingsch-Rosenfeld and Martin Spiessl and Jan Strejček},
      title        = {Software Verification Witnesses 2.0},
      booktitle    = {Proc.\ SPIN},
      pages        = {},
      year         = {2024},
      series       = {LNCS~},
      publisher    = {Springer},
      url          = {https://gitlab.com/sosy-lab/benchmarking/sv-witnesses/},
      pdf          = {https://www.sosy-lab.org/research/pub/2024-SPIN.Software_Verification_Witnesses_2.0.pdf},
      presentation = {https://www.sosy-lab.org/research/prs/2024-04-11_SPIN24_Software-Verification-Witnesses-2.0.pdf},
      abstract     = {},
      keyword      = {Software Model Checking, Cooperative Verification, Witness-Based Validation, CPAchecker},
      annote       = {},
      artifact     = {},
      doinone      = {Unpublished: Last checked: 2024-03-25},
      funding      = {DFG-CONVEY,DFG-IDEFIX},
    }
  3. Dirk Beyer, Matthias Kettl, and Thomas Lemberger. Fault Localization on Verification Witnesses. In Proceedings of the 30th International Symposium on Model Checking Software (SPIN 2024, Luxembourg City, Luxembourg, April 10-11), LNCS, 2024. Springer. Link to this entry Keyword(s): Software Model Checking, Witness-Based Validation, CPAchecker Funding: DFG-CONVEY, DFG-IDEFIX, DFG-COOP PDF
    Artifact(s)
    Abstract
    When verifiers report an alarm, they export a violation witness (exchangeable counterexample) that helps validate the reachability of that alarm. Conventional wisdom says that this violation witness should be very precise: the ideal witness describes a single error path for the validator to check. But we claim that verifiers overshoot and produce large witnesses with information that makes validation unnecessarily difficult. To check our hypothesis, we reduce violation witnesses to that information that automated fault-localization approaches deem relevant for triggering the reported alarm in the program. We perform a large experimental evaluation on the witnesses produced in the International Competition on Software Verification (SV-COMP 2023). It shows that our reduction shrinks the witnesses considerably and enables the confirmation of verification results that were not confirmable before.
    BibTeX Entry
    @inproceedings{SPIN24a,
      author    = {Dirk Beyer and Matthias Kettl and Thomas Lemberger},
      title     = {Fault Localization on Verification Witnesses},
      booktitle = {Proceedings of the 30th International Symposium on Model Checking Software (SPIN~2024, Luxembourg City, Luxembourg, April 10-11)},
      pages     = {},
      year      = {2024},
      series    = {LNCS},
      publisher = {Springer},
      pdf       = {https://sosy-lab.org/research/pub/2024-SPIN.Fault_Localization_on_Verification_Witnesses.pdf},
      abstract  = {When verifiers report an alarm, they export a violation witness (exchangeable counterexample) that helps validate the reachability of that alarm. Conventional wisdom says that this violation witness should be very precise: the ideal witness describes a single error path for the validator to check. But we claim that verifiers overshoot and produce large witnesses with information that makes validation unnecessarily difficult. To check our hypothesis, we reduce violation witnesses to that information that automated fault-localization approaches deem relevant for triggering the reported alarm in the program. We perform a large experimental evaluation on the witnesses produced in the International Competition on Software Verification (SV-COMP 2023). It shows that our reduction shrinks the witnesses considerably and enables the confirmation of verification results that were not confirmable before.},
      keyword   = {Software Model Checking, Witness-Based Validation, CPAchecker},
      annote    = {Nominated for best paper.<br> This work was also presented with a poster at the 46th International Conference on Software Engineering (ICSE 2024, Lisbon, Portugal, April 14-20): <a href="https://sosy-lab.org/research/pst/2024-03-05_ICSE24_Fault_Localization_on_Verification_Witnesses_Poster.pdf">Extended Abstract</a>.},
      artifact  = {10.5281/zenodo.10794627},
      doinone   = {TBD},
      funding   = {DFG-CONVEY,DFG-IDEFIX,DFG-COOP},
    }
    Additional Infos
    Nominated for best paper.
    This work was also presented with a poster at the 46th International Conference on Software Engineering (ICSE 2024, Lisbon, Portugal, April 14-20): Extended Abstract.
  4. Daniel Baier, Dirk Beyer, Po-Chun Chien, Marek Jankola, Matthias Kettl, Nian-Ze Lee, Thomas Lemberger, Marian Lingsch-Rosenfeld, Martin Spiessl, Henrik Wachowitz, and Philipp Wendler. CPAchecker 2.3 with Strategy Selection (Competition Contribution). In Proc. TACAS (3), LNCS 14572, pages 359-364, 2024. Springer. doi:10.1007/978-3-031-57256-2_21 Link to this entry Keyword(s): Software Model Checking, Witness-Based Validation, CPAchecker Funding: DFG-CONVEY, DFG-IDEFIX Publisher's Version PDF Supplement
    Artifact(s)
    Abstract
    CPAchecker is a versatile framework for software verification, rooted in the established concept of configurable program analysis. Compared to the last published system description at SV-COMP 2015, the CPAchecker submission to SV-COMP 2024 incorporates new analyses for reachability safety, memory safety, termination, overflows, and data races. To combine forces of the available analyses in CPAchecker and cover the full spectrum of the diverse program characteristics and specifications in the competition, we use strategy selection to predict a sequential portfolio of analyses that is suitable for a given verification task. The prediction is guided by a set of carefully picked program features. The sequential portfolios are composed based on expert knowledge and consist of bit-precise analyses using k-induction, data-flow analysis, SMT solving, Craig interpolation, lazy abstraction, and block-abstraction memoization. The synergy of various algorithms in CPAchecker enables support for all properties and categories of C programs in SV-COMP 2024 and contributes to its success in many categories. CPAchecker also generates verification witnesses in the new YAML format.
    BibTeX Entry
    @inproceedings{TACAS24c,
      author    = {Daniel Baier and Dirk Beyer and Po-Chun Chien and Marek Jankola and Matthias Kettl and Nian-Ze Lee and Thomas Lemberger and Marian Lingsch-Rosenfeld and Martin Spiessl and Henrik Wachowitz and Philipp Wendler},
      title     = {{CPAchecker} 2.3 with Strategy Selection (Competition Contribution)},
      booktitle = {Proc.\ TACAS~(3)},
      pages     = {359-364},
      year      = {2024},
      series    = {LNCS~14572},
      publisher = {Springer},
      doi       = {10.1007/978-3-031-57256-2_21},
      url       = {https://cpachecker.sosy-lab.org/},
      pdf       = {https://www.sosy-lab.org/research/pub/2024-TACAS.CPAchecker_2.3_with_Strategy_Selection_Competition_Contribution.pdf},
      abstract  = {CPAchecker is a versatile framework for software verification, rooted in the established concept of configurable program analysis. Compared to the last published system description at SV-COMP 2015, the CPAchecker submission to SV-COMP 2024 incorporates new analyses for reachability safety, memory safety, termination, overflows, and data races. To combine forces of the available analyses in CPAchecker and cover the full spectrum of the diverse program characteristics and specifications in the competition, we use strategy selection to predict a sequential portfolio of analyses that is suitable for a given verification task. The prediction is guided by a set of carefully picked program features. The sequential portfolios are composed based on expert knowledge and consist of bit-precise analyses using <i>k</i>-induction, data-flow analysis, SMT solving, Craig interpolation, lazy abstraction, and block-abstraction memoization. The synergy of various algorithms in CPAchecker enables support for all properties and categories of C programs in SV-COMP 2024 and contributes to its success in many categories. CPAchecker also generates verification witnesses in the new YAML format.},
      keyword   = {Software Model Checking, Witness-Based Validation, CPAchecker},
      artifact  = {10.5281/zenodo.10203297},
      funding   = {DFG-CONVEY, DFG-IDEFIX},
    }
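
To make the dataset structure described in entry 1 concrete, the following minimal Python sketch shows one possible in-memory representation of a P3-style entry and a simplified version of the automated mining stage. The field names (project, issue_id, is_complete_fix) and the heuristic are illustrative assumptions, not the actual schema or selection logic of the P3 dataset; the authoritative format is documented in the linked GitLab repository.

    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class FixCommit:
        sha: str
        message: str
        is_complete_fix: bool   # False for an intermediate, partial fix

    @dataclass
    class PartialFixEntry:
        project: str
        issue_id: str
        commits: List[FixCommit] = field(default_factory=list)

    def candidate_partial_fixes(entries: List[PartialFixEntry]) -> List[PartialFixEntry]:
        # Heuristic in the spirit of the paper's automated mining stage (assumption):
        # keep entries where the same issue needed at least two fix commits and
        # at least one of them turned out to be incomplete.
        return [e for e in entries
                if len(e.commits) >= 2
                and any(not c.is_complete_fix for c in e.commits)]

    if __name__ == "__main__":
        entry = PartialFixEntry(
            project="example-c-project",
            issue_id="BUG-42",
            commits=[
                FixCommit("a1b2c3d", "Fix BUG-42 for positive inputs", False),
                FixCommit("d4e5f6a", "Fix BUG-42 also for negative inputs", True),
            ],
        )
        print(len(candidate_partial_fixes([entry])), "candidate entry")

In the paper, such automatically mined candidates are only the starting point; the dataset itself is curated through the manual inspection step described in the abstract.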
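The reduction idea from entry 3 can be pictured with a small, hedged sketch: a violation witness is modeled as a list of automaton edges annotated with program lines, and edges whose lines a fault-localization step did not mark as relevant for the alarm are dropped. The data model and field names are assumptions for illustration, not the authors' implementation or the witness exchange format.

    from dataclasses import dataclass
    from typing import List, Optional, Set

    @dataclass
    class WitnessEdge:
        source: str                # source state of the witness automaton
        target: str                # target state of the witness automaton
        start_line: Optional[int]  # program line the edge refers to, if any

    def reduce_witness(edges: List[WitnessEdge],
                       relevant_lines: Set[int]) -> List[WitnessEdge]:
        # Keep only edges whose program line was deemed relevant by fault
        # localization; edges without a line annotation are kept conservatively.
        return [e for e in edges
                if e.start_line is None or e.start_line in relevant_lines]

    if __name__ == "__main__":
        witness = [
            WitnessEdge("q0", "q1", 10),
            WitnessEdge("q1", "q2", 25),    # line 25 is irrelevant to the alarm
            WitnessEdge("q2", "q_err", 42),
        ]
        print(len(reduce_witness(witness, relevant_lines={10, 42})), "edges kept")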
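Entry 4's strategy selection can likewise be illustrated with a hedged sketch of a rule-based selector that maps a few program features to a sequential portfolio of analyses. The features, analysis names, and rules below are assumptions chosen for illustration; CPAchecker's actual selector uses its own carefully picked feature set and expert-composed portfolios.

    from dataclasses import dataclass
    from typing import List

    @dataclass
    class ProgramFeatures:
        has_loops: bool
        has_floats: bool
        uses_arrays: bool

    def select_portfolio(f: ProgramFeatures) -> List[str]:
        # Compose a sequential portfolio from simple expert-style rules
        # (illustrative only; not CPAchecker's actual rules or analysis names).
        portfolio = ["value-analysis"]        # cheap explicit-value analysis first
        if f.has_loops:
            portfolio.append("k-induction")   # loops often need inductive reasoning
        if f.uses_arrays or f.has_floats:
            portfolio.append("predicate-abstraction")   # SMT-based analysis as fallback
        portfolio.append("bounded-model-checking")      # last resort within the time budget
        return portfolio

    if __name__ == "__main__":
        features = ProgramFeatures(has_loops=True, has_floats=False, uses_arrays=True)
        print(select_portfolio(features))

The point of the sequential arrangement is that cheap analyses get a chance to solve the task before more expensive, bit-precise techniques are started, which mirrors the portfolio idea described in the abstract.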

Disclaimer:

This material is presented to ensure timely dissemination of scholarly and technical work. Copyright and all rights therein are retained by authors or by other copyright holders. All persons copying this information are expected to adhere to the terms and constraints invoked by each author's copyright. In most cases, these works may not be reposted without the explicit permission of the copyright holder.

Last modified: Sat Apr 27 01:04:39 2024 UTC