2024 | Kathryn A. Fuller, Kathryn A. Morbitzer, Jacqueline M. Zeeman, Adam M. Persky, Amanda C. Savage, Jacqueline E. McLaughlin
This study explores the use of ChatGPT to analyze student course evaluation comments in health professions education. The primary objectives were to assess the time required for generating themes and the level of agreement between instructor-identified and AI-identified themes. Four instructors from the University of North Carolina Eshelman School of Pharmacy independently analyzed student comments using five prompts, noting the time and process. The comments were also analyzed by two independent OpenAI ChatGPT user accounts. Thematic analysis was used to compare the themes identified by instructors and ChatGPT. Results showed high agreement between instructors and ChatGPT, particularly for course-related topics, with the lowest agreement for identifying course weaknesses. Instructors took an average of 27.50 ± 15.00 minutes to analyze their data, while ChatGPT users took significantly less time (10.50 ± 1.00 minutes and 12.50 ± 2.89 minutes). Instructors reported feeling anxious before analyzing the comments, satisfied during the process, and frustrated with the findings. The study concludes that ChatGPT can effectively assist in analyzing student course evaluation comments, reducing the workload for instructors and providing more detailed insights. However, it emphasizes the importance of using ChatGPT as a tool to complement human analysis rather than relying solely on its outputs.