Enhancing Programming Error Messages in Real Time with Generative AI

May 11–16, 2024 | Bailey Kimmel, Austin Geisert, Lily Yaro, Brendan Gipson, Taylor Hotchkiss, Sidney Osae-Asante, Hunter Vaught, Grant Wininger, Chase Yamaguchi
This paper explores integrating generative AI into an automated assessment tool (AAT) for a CS1 course to provide real-time feedback on programming errors, including compiler, run-time, and logic errors. The study involved 52 students in the Fall 2023 semester at Abilene Christian University, who completed a programming assignment called "Prime Factorization" with AI feedback generated by ChatGPT. Students submitted their code more frequently than in previous years, which suggests that the AI feedback did not necessarily make the tool more effective. Reactions to the feedback were mixed: some students found it helpful, while others found it vague or incorrect. The study highlights the importance of designing interfaces that support meaningful interaction with AI feedback, since students preferred to converse with the AI rather than receive a single hint. The findings suggest that while generative AI can enhance programming error messages, the design of the interface and the way feedback is delivered significantly affect its usability and effectiveness. The authors call for further research into interfaces that allow limited in-context follow-up to AI submission feedback.
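The core mechanism described is straightforward: when a submission fails, the AAT forwards the student's code and the compiler or run-time output to ChatGPT and displays the model's response as a hint alongside the usual error message. The sketch below shows how such a call might look; the function name, prompt wording, and model choice are illustrative assumptions, not the authors' actual implementation.

```python
# Illustrative sketch only: the prompt text, model name, and function are
# assumptions for demonstration, not the instrument used in the study.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment


def generate_error_hint(source_code: str, error_output: str) -> str:
    """Ask the model for a single CS1-appropriate hint about a failed submission."""
    prompt = (
        "A CS1 student submitted this program to an autograder and it failed.\n"
        "Explain the most likely cause of the error in one short paragraph,\n"
        "without providing a corrected solution.\n\n"
        f"Student code:\n{source_code}\n\n"
        f"Compiler/run-time output:\n{error_output}\n"
    )
    response = client.chat.completions.create(
        model="gpt-3.5-turbo",  # assumed model; the paper only says ChatGPT
        messages=[{"role": "user", "content": prompt}],
        max_tokens=200,
        temperature=0.2,  # keep hints relatively consistent across submissions
    )
    return response.choices[0].message.content
```

Because the study found that students wanted to follow up on the hints rather than receive a single response, a real deployment would likely need to retain the conversation history per submission instead of issuing one-shot hints as sketched here.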