Debugging with Stack Overflow: ICSE SEET, 2022
This is the GitHub repository associated with the 2022 ICSE SEET paper, "Debugging with Stack Overflow: Web Search Behavior in Novice and Expert Programmers."
Abstract
Debugging can be challenging for novice and expert programmers alike. Programmers routinely turn to online resources such as Stack Overflow for help, but understanding of debugging search practices, as well as tool support to find debugging resources, remains limited. Existing tools that mine online help forums are generally not aimed at novices, and programmers face varying levels of success when looking for online resources. Furthermore, training online code search skills is pedagogically challenging, as we have little understanding of how expertise impacts programmers' web search behavior while debugging code.
We help fill these knowledge gaps with the results of a study of 40 programmers investigating differences in Stack Overflow search behavior at three levels of expertise: novices, experienced programmers who are novices in Python (the language we use in our study), and experienced Python programmers. We observe significant differences between all three levels in their ability to find posts helpful for debugging a given error, with both general and language-specific expertise facilitating Stack Overflow search efficacy and debugging success. We also conduct an exploratory investigation of factors that correlate with this difference, such as the display rank of the selected link and the number of links checked per search query. We conclude with an analysis of how online search behavior and results vary by Python error type. Our findings can inform online code search pedagogy, as well as inform the development of future automated tools.
Authors
Paper Link
COMING SOON
Paper Citation
COMING SOON
Repository Contents
- Stimuli: Contains images of the programming bugs used as stimuli in our experiment (text versions of the stimuli are available in the Survey Instrument)
- Analysis: Contains details of the manual annotation process as well as our analysis scripts
- Survey Instrument: Contains Word and Qualtrics versions of the survey instrument used to show the stimuli to participants
- Recruitment: Contains our consent form and prescreening survey