Empowering Personalized Learning through a Conversation-based Tutoring System with Student Modeling

May 11-16, 2024 | Minju Park, Sojung Kim, Seunghyun Lee, Soonwoo Kwon, Kyuseok Kim
This paper presents a conversation-based tutoring system with student modeling to empower personalized learning. The system integrates diagnostic components for student assessment and leverages large language models (LLMs) with prompt engineering to incorporate assessment outcomes and instructional strategies into teaching. Tested with 20 participants, the system effectively facilitates personalization, particularly in student modeling. A web demo is available at http://rlearning-its.com.

The system covers three English writing concepts from the SAT Writing test: Pronouns, Punctuation, and Transitions. For each concept, users progress through an onboarding survey, a pre-test, a tutoring session, and a post-test.

Students are assessed along three criteria: cognitive state, affective state, and learning style. Cognitive state covers proficiency level, metacognition, and learning gain; affective state combines self-reported data with GPT-4 session-end summaries; learning style is determined from self-reported data and GPT-4 analysis.

Exercise selection is adaptive and based on the Item Response Theory (IRT) model: exercises are chosen according to the student's skill parameter, which is estimated from pre-test results.

Instruction is tailored through a dual-prompt approach comprising a base prompt and a personalized prompt, the latter updated from the student's learning style, cognitive state, and affective state. The prompt design follows a cyclical framework that integrates student assessment with the system prompt: the system prompt is refined from the student's interactions and session summaries, allowing continuous improvement, and a dedicated summary prompt evaluates student interactions and suggests action items for subsequent tutoring sessions.
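The adaptive selection step can be sketched as follows. This is a minimal illustration only: it assumes a one-parameter (Rasch) IRT model, a grid-search maximum-likelihood skill estimate, and selection of the item whose difficulty is closest to the student's skill; the paper does not specify these implementation details.

```python
import math

def rasch_p(theta, difficulty):
    """Probability that a student with skill `theta` answers an item of
    the given `difficulty` correctly under the 1PL (Rasch) IRT model."""
    return 1.0 / (1.0 + math.exp(-(theta - difficulty)))

def estimate_theta(responses, difficulties, grid_step=0.01):
    """Grid-search maximum-likelihood skill estimate from pre-test
    responses (1 = correct, 0 = incorrect) and known item difficulties."""
    best_theta, best_ll = 0.0, float("-inf")
    theta = -4.0
    while theta <= 4.0:
        ll = sum(
            math.log(rasch_p(theta, d)) if r else math.log(1.0 - rasch_p(theta, d))
            for r, d in zip(responses, difficulties)
        )
        if ll > best_ll:
            best_theta, best_ll = theta, ll
        theta += grid_step
    return best_theta

def select_exercise(theta, item_bank):
    """Pick the exercise whose difficulty is closest to the student's
    skill -- the most informative item under the Rasch model."""
    return min(item_bank, key=lambda item: abs(item["difficulty"] - theta))
```

For example, a student who answers the easier pre-test items correctly but misses the harder ones receives a mid-range skill estimate, and the next exercise is drawn from near that difficulty rather than from the extremes.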
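The dual-prompt cycle described above can be sketched roughly as below. The prompt wording, the student-model field names, and the `chat` callable are all illustrative assumptions standing in for the paper's actual prompts and LLM API.

```python
# Fixed base prompt; the personalized prompt is rebuilt each cycle.
BASE_PROMPT = (
    "You are an English writing tutor. Teach the current SAT Writing "
    "concept step by step and check understanding with short questions."
)

def build_system_prompt(student):
    """Combine the base prompt with a personalized prompt derived from
    the student model (field names here are hypothetical)."""
    personalized = (
        f"Learning style: {student['learning_style']}. "
        f"Cognitive state: {student['cognitive_state']}. "
        f"Affective state: {student['affective_state']}. "
        f"Action items from last session: {student['action_items']}."
    )
    return BASE_PROMPT + "\n" + personalized

def tutoring_cycle(student, dialogue, chat):
    """One turn of the cyclical framework: tutor with the current system
    prompt, then run a summary prompt over the dialogue to produce action
    items that feed the next session's personalized prompt.
    `chat(system, messages)` stands in for an LLM API call."""
    system = build_system_prompt(student)
    reply = chat(system, dialogue)
    summary = chat(
        "Summarize the student's affective state and list action items "
        "for the next tutoring session.",
        dialogue + [{"role": "assistant", "content": reply}],
    )
    student["action_items"] = summary  # refines the next cycle's prompt
    return reply, student
```

The point of the cycle is that the summary prompt's output is folded back into the personalized prompt, so each session starts from the previous session's assessment rather than from scratch.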
The implementation results show that the system successfully adjusts instructional strategies in a personalized manner; its effectiveness in analyzing and evaluating student states from dialogue is demonstrated through examples of personalized teaching techniques. Limitations remain, however, such as the need for more engaging content to sustain student engagement and the challenge of aligning post-test questions with tutoring content. Future work aims to refine the system and explore its effects on learning gain and engagement.