Performance of Generative Artificial Intelligence in Dental Licensing Examinations

2024 | Reinhart Chun Wang Chau, Khaing Myat Thu, Ollie Yiru Yu, Richard Tai-Chiu Hsung, Edward Chin Man Lo, Walter Yu Hang Lam
This study evaluated the performance of two versions of generative artificial intelligence (GenAI), ChatGPT 3.5 and ChatGPT 4.0, in answering dental licensing examination questions, assessing how accurately GenAI interprets written input and provides correct answers in dentistry. A total of 1461 multiple-choice questions from the US and UK dental licensing examinations were analyzed; the passing marks for these examinations were 75.0% and 50.0%, respectively.

ChatGPT 3.5 answered 68.3% (n = 509) of the US questions and 43.3% (n = 296) of the UK questions correctly. ChatGPT 4.0 scored higher, with 80.7% (n = 601) and 62.7% (n = 429) correct, respectively. ChatGPT 4.0 thus passed both written examinations, whereas ChatGPT 3.5 failed both. Compared with ChatGPT 3.5, ChatGPT 4.0 answered 327 additional questions correctly, while answering 102 questions incorrectly that ChatGPT 3.5 had answered correctly.

The newer version, ChatGPT 4.0, therefore demonstrated better performance on dental licensing examination questions, although the results may not be universally applicable and further improvement is needed. The use of GenAI in dentistry could significantly affect dentist-patient communication and the training of dental professionals, and the study highlights its potential in dental education and patient management. Limitations include the use of a single GenAI model and the possibility that performance will change over time. The study suggests that GenAI could be a valuable tool in dental practice and education, but further research is needed to explore its long-term impact and to address implementation challenges. The findings indicate that GenAI can provide accurate dental information, but dental professionals must verify that information and apply it to individual patients.
The dental curriculum and assessment methods may need to evolve to accommodate the changes brought by GenAI.
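The pass/fail verdicts above follow directly from comparing each model's percentage score with the exam's pass mark. As a minimal sketch (the scores and pass marks are taken from the study; the `RESULTS` structure and `passed` helper are illustrative, not from the paper):

```python
# Reported percent-correct scores per exam, with each exam's pass mark.
RESULTS = {
    "US": {"pass_mark": 75.0, "ChatGPT 3.5": 68.3, "ChatGPT 4.0": 80.7},
    "UK": {"pass_mark": 50.0, "ChatGPT 3.5": 43.3, "ChatGPT 4.0": 62.7},
}

def passed(score: float, pass_mark: float) -> bool:
    """A score passes when it meets or exceeds the exam's pass mark."""
    return score >= pass_mark

for exam, data in RESULTS.items():
    for model in ("ChatGPT 3.5", "ChatGPT 4.0"):
        verdict = "pass" if passed(data[model], data["pass_mark"]) else "fail"
        print(f"{exam} exam, {model}: {data[model]:.1f}% -> {verdict}")
```

Running this reproduces the study's conclusion: ChatGPT 4.0 clears both pass marks, while ChatGPT 3.5 falls short of both.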