Trust in and Acceptance of Artificial Intelligence Applications in Medicine: Mixed Methods Study


2024 | Daria Shevtsova, Anam Ahmed, Iris W A Boot, Carmen Sanges, Michael Hudecek, John J L Jacobs, Simon Hort, Hubertus J M Vrijhoeve
This study explores factors influencing trust in and acceptance of AI applications in medicine through a mixed methods approach. A rapid literature review identified 110 factors related to trust and 77 factors related to acceptance of AI in medicine, grouped into four categories: human-related, technology-related, legal and ethical, and additional factors. A survey of 22 stakeholders (researchers, technology providers, hospital staff, and policymakers) then assessed the relevance of these factors.

Results showed that 16 of 19 factors were considered highly relevant to trust and acceptance, while three (patient gender, age, and education level) were deemed of low relevance. Key factors included the type of organization AI professionals belong to, the transparency and explainability of AI applications, and the adequacy of regulations and governance. Technology-related factors such as performance, integration into clinical workflows, and risk-benefit balance were also highly relevant, as were legal and ethical factors, including transparency of data use and accountability, and additional factors such as environmental sustainability and impact on jobs.

The study highlights the importance of trust and acceptance for the successful implementation of AI in medicine and suggests that stakeholders focus on factors such as transparency, explainability, and regulatory frameworks to build both. The findings can help implementers of medical AI applications understand what drives trust and acceptance among key stakeholders, enabling them to develop strategies that facilitate the adoption of AI in healthcare.
The study also identifies the need for further research into the perspectives of underrepresented stakeholder groups and the factors that influence trust and acceptance in AI applications.