Mobile App Crowdsourced Test Report Consistency Detection via Deep Image-and-Text Fusion Understanding
2023 · Open Access
DOI: https://doi.org/10.1109/tse.2023.3285787
Crowdsourced testing, as a distinct testing paradigm, has attracted much attention in software testing, especially in the mobile application (app) testing field. Compared with in-house testing, crowdsourced testing offers diverse testing environments that help address the mobile testing fragmentation problem. However, crowdsourced testing also suffers from low-quality test reports, because the crowdworkers involved are often unprofessional and vary in expertise. To handle submitted reports of uneven quality, app developers must distinguish high-quality reports from low-quality ones to support bug inspection. One typical kind of low-quality test report is the inconsistent test report, in which the textual description does not match the attached bug-occurring screenshots. According to our empirical survey, only 18.07% of crowdsourced test reports are consistent. Inconsistent reports waste mobile app testing effort.

To solve the inconsistency problem, we propose ReCoDe, which detects the consistency of crowdsourced test reports via deep image-and-text fusion understanding. ReCoDe is a two-stage approach: it first classifies reports into categories according to their bug features, based on the textual descriptions. In the second stage, ReCoDe deeply analyzes the GUI image features of the app screenshots and applies a different strategy for each bug category to detect the consistency of the crowdsourced test reports. We evaluate ReCoDe on a dataset of over 22k test reports, and the results show that ReCoDe is effective in detecting the consistency of crowdsourced test reports. In addition, a user study demonstrates the practical value of ReCoDe in helping app developers review crowdsourced test reports more efficiently.
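To make the two-stage design concrete, below is a minimal, self-contained sketch of such a text-then-image consistency pipeline. Everything in it (the three bug categories, the keyword-based classifier, the widget-list abstraction of a screenshot, and names such as TestReport and is_consistent) is an illustrative assumption for this sketch only; ReCoDe itself relies on deep text and GUI image models rather than these placeholder heuristics.

```python
# Illustrative sketch of a two-stage report-consistency check.
# The categories, keyword classifier, and per-category checks are
# placeholder assumptions, not ReCoDe's actual implementation.
from dataclasses import dataclass
from typing import Callable, Dict, List


@dataclass
class TestReport:
    description: str               # crowdworker's textual bug description
    screenshot_widgets: List[str]  # widget labels assumed to have been
                                   # extracted from the screenshot by a GUI model


# Stage 1: classify the report into a bug category from its text alone.
# A real system would use a trained text classifier; keyword matching
# stands in for it here.
def classify_bug_category(description: str) -> str:
    text = description.lower()
    if "crash" in text or "exception" in text:
        return "crash"
    if "layout" in text or "overlap" in text:
        return "display"
    return "functional"


# Stage 2: category-specific strategies that compare the description
# against information extracted from the screenshot.
def check_crash(report: TestReport) -> bool:
    # Assumption: a genuine crash screenshot shows an error dialog.
    return "error_dialog" in report.screenshot_widgets


def check_display(report: TestReport) -> bool:
    # Consistent if some widget visible on screen is named in the text.
    return any(w in report.description.lower()
               for w in report.screenshot_widgets)


def check_functional(report: TestReport) -> bool:
    # Same placeholder matching as the display check, for simplicity.
    return any(w in report.description.lower()
               for w in report.screenshot_widgets)


STRATEGIES: Dict[str, Callable[[TestReport], bool]] = {
    "crash": check_crash,
    "display": check_display,
    "functional": check_functional,
}


def is_consistent(report: TestReport) -> bool:
    """Return True when the description plausibly matches the screenshot."""
    return STRATEGIES[classify_bug_category(report.description)](report)


if __name__ == "__main__":
    report = TestReport(
        description="App crashes when tapping the login button",
        screenshot_widgets=["login button", "error_dialog"],
    )
    print(is_consistent(report))  # True: crash text + crash dialog on screen
```

The point the sketch preserves is the dispatch structure: stage one picks a bug category from the text alone, and stage two routes the report to a category-specific check that confronts the description with evidence recovered from the screenshot.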
- Type: article
- Language: en
- Landing Page: https://doi.org/10.1109/tse.2023.3285787
- OA Status: green
- Cited By: 12
- References: 68
- Related Works: 10
- OpenAlex ID: https://openalex.org/W4380434595