Delayed diagnosis of a transient ischemic attack caused by ChatGPT

2 February 2024 | Jonathan A. Saenger, Jonathan Hunger, Andreas Boss, Johannes Richter
This article presents a case study in which a 63-year-old man experienced a delayed diagnosis of a transient ischemic attack (TIA) after an erroneous assessment by ChatGPT. The patient, who had undergone pulmonary vein isolation (PVI) for atrial fibrillation, consulted ChatGPT after experiencing multiple episodes of diplopia. Despite the patient's concerns, ChatGPT classified the symptoms as a "possible" consequence of PVI, leading him to believe he did not need to seek medical attention. After a third episode, however, the patient called an ambulance and was admitted to the emergency department. Initial examinations were unremarkable and imaging showed no signs of acute infarction; the patient was eventually diagnosed with a TIA and received appropriate treatment.

The case highlights the risks of relying on AI tools such as ChatGPT for medical advice, including the potential for incorrect or misleading information and the need for further medical evaluation. The authors emphasize that AI in healthcare should complement rather than replace medical professionals, and that AI systems should be developed with medical professionals' involvement and adhere to ethical and legal standards.