Generating accurate assert statements for unit test cases using pretrained transformers
· 2022
· Open Access
· DOI: https://doi.org/10.1145/3524481.3527220
· OA: W3086938529
Unit testing represents the foundational basis of the software testing pyramid, beneath integration and end-to-end testing. Automated software testing researchers have proposed a variety of techniques to assist developers in this time-consuming task. In this paper we present an approach to support developers in writing unit test cases by generating accurate and useful assert statements. Our approach is based on a state-of-the-art transformer model initially pretrained on an English textual corpus. This semantically rich model is then trained in a semi-supervised fashion on a large corpus of source code. Finally, we finetune this model on the task of generating assert statements for unit tests. The resulting model is able to generate accurate assert statements for a given method under test. In our empirical evaluation, the model predicted the exact assert statements written by developers in 62% of cases on the first attempt, an 80% relative improvement in top-1 accuracy over the previous RNN-based approach in the literature. We also show the substantial impact of the pretraining process on the performance of our model, and compare the approach with the assert auto-completion task. Finally, we demonstrate how our approach can be used to augment EvoSuite test cases with additional asserts, leading to improved test coverage.
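As an illustration of the task the abstract describes, the sketch below shows a focal method under test, a test prefix a developer might have written, and the kind of assert statement the model is expected to generate. The method, class, and values here are hypothetical examples, not drawn from the paper's evaluation corpus; a plain Java `assert` stands in for the JUnit-style asserts such tools typically target.

```java
// Hypothetical illustration of the assert-generation task: given a focal
// method and a partial test body, produce the missing assert statement.
public class AssertGenerationDemo {

    // Focal method under test (hypothetical example).
    static String reverse(String s) {
        return new StringBuilder(s).reverse().toString();
    }

    public static void main(String[] args) {
        // Test prefix written by the developer.
        String actual = reverse("abc");

        // The statement below is the kind of output the model generates:
        // an assert predicting the expected value for this input.
        assert actual.equals("cba") : "generated assert failed";

        System.out.println("actual = " + actual);
    }
}
```

Run with assertions enabled (`java -ea AssertGenerationDemo`); in the paper's setting, the generated assert would instead be inserted into the developer's (or EvoSuite's) test method.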