Gamifying static analysis

Lisa Nguyen Quang Do, Eric Bodden

In the past decades, static code analysis has become a prevalent means to detect bugs and security vulnerabilities in software systems. As software becomes more complex, analysis tools also report lists of increasingly complex warnings that developers need to address on a daily basis. The novel insight we present in this work is that static analysis tools and video games both require users to take on repetitive and challenging tasks. Importantly, though, while good video games manage to keep players engaged, static analysis tools are notorious for their lacking user experience, which prevents developers from using them to their full potential, frequently resulting in dissatisfaction and even tool abandonment. We show parallels between gaming and using static analysis tools, and advocate that the user-experience issues of analysis tools can be addressed by looking at the analysis tooling system as a whole, and by integrating gaming elements that keep users engaged, such as providing immediate and clear feedback, collaborative problem solving, or motivators such as points and badges.

Artifacts

  • Cognitive walkthrough questions
  • Cognitive walkthrough results:
    • The first sheet contains the evaluation of the cognitive walkthrough.
    • The second sheet contains the answers to the post-cognitive walkthrough questionnaire.
    • The third sheet contains the answers to the open-text questions of the post-cognitive walkthrough questionnaire.
    • In all sheets, comments quote the participants and clarify the answers.

BibTeX

@inproceedings{10.1145/3236024.3264830,
  author = {Nguyen Quang Do, Lisa and Bodden, Eric},
  title = {Gamifying Static Analysis},
  year = {2018},
  isbn = {9781450355735},
  publisher = {Association for Computing Machinery},
  address = {New York, NY, USA},
  url = {https://doi.org/10.1145/3236024.3264830},
  doi = {10.1145/3236024.3264830},
  abstract = {In the past decades, static code analysis has become a prevalent means to detect bugs and security vulnerabilities in software systems. As software becomes more complex, analysis tools also report lists of increasingly complex warnings that developers need to address on a daily basis. The novel insight we present in this work is that static analysis tools and video games both require users to take on repetitive and challenging tasks. Importantly, though, while good video games manage to keep players engaged, static analysis tools are notorious for their lacking user experience, which prevents developers from using them to their full potential, frequently resulting in dissatisfaction and even tool abandonment. We show parallels between gaming and using static analysis tools, and advocate that the user-experience issues of analysis tools can be addressed by looking at the analysis tooling system as a whole, and by integrating gaming elements that keep users engaged, such as providing immediate and clear feedback, collaborative problem solving, or motivators such as points and badges.},
  booktitle = {Proceedings of the 2018 26th ACM Joint Meeting on European Software Engineering Conference and Symposium on the Foundations of Software Engineering},
  pages = {714–718},
  numpages = {5},
  keywords = {Program analysis, Gamification, Integrated Environments},
  location = {Lake Buena Vista, FL, USA},
  series = {ESEC/FSE 2018}
}