On the acceptance by code reviewers of candidate security patches suggested by Automated Program Repair tools

Aurora Papotti*, Ranindya Paramitha, Fabio Massacci

*Corresponding author for this work

Research output: Contribution to Journal › Article › Academic › peer-review

Abstract

Objective: We investigated whether (possibly wrong) security patches suggested by Automated Program Repair (APR) tools for real-world projects are recognized as such by human reviewers. We also investigated whether knowing that a patch was produced by an allegedly specialized tool changes the decision of human reviewers.

Method: We performed an experiment with n=72 Master's students in Computer Science. In the first phase, using a balanced design, we presented human reviewers with a combination of patches proposed by APR tools for different vulnerabilities and asked the reviewers to adopt or reject the proposed patches. In the second phase, we told participants that some of the proposed patches were generated by security-specialized tools (even though the tool was actually a ‘normal’ APR tool) and measured whether the human reviewers would change their decision to adopt or reject a patch.

Results: It is easier to identify wrong patches than correct patches, and correct patches are not confused with partially correct patches. Also, patches from APR security tools are adopted more often than patches suggested by generic APR tools, but there is not enough evidence to verify whether ‘bogus’ security claims are distinguishable from ‘true security’ claims. Finally, the number of switches to the patches suggested by the security tool is significantly higher after the security information is revealed, irrespective of correctness.

Limitations: The experiment was conducted in an academic setting and focused on a limited sample of popular APR tools and popular vulnerability types.

Original language: English
Article number: 132
Pages (from-to): 1-35
Number of pages: 35
Journal: Empirical Software Engineering
Volume: 29
Issue number: 5
Early online date: 3 Aug 2024
DOIs
Publication status: Published - Sept 2024

Bibliographical note

Publisher Copyright:
© The Author(s) 2024.

Funding

This work has been partly supported by the European Union H2020 Program under Grant 952647 (AssureMOSS), the Horizon Europe Program under Grant 101120393 (Sec4AI4Sec), and the Dutch Research Council (NWO) under Grant NWA.1215.18.006 (THESEUS) and Grant KIC1.VE01.20.004 (HEWSTI). We would like to thank Quang-Cuong Bui, Duc-Ly Vu, and Riccardo Scandariato for many useful discussions on APR tools and their assessment, and Ákos Milánkovich for providing the test Visual Studio plug-ins we used in the pilot. We would also like to thank the reviewers of the registered report and of the full version of the paper, as their comments greatly helped to improve the quality of the paper.

Funders and funder numbers:
European Union H2020
HEWSTI
European Commission: 952647
Nederlandse Organisatie voor Wetenschappelijk Onderzoek: KIC1.VE01.20.004, NWA.1215.18.006
HORIZON EUROPE Framework Programme: 101120393

Keywords

• Automated program repair
• Code review
• Patch adoption
• Security patches
