When Do People Want an Explanation from a Robot?

Lennart Wachowiak, Andrew Fenn, Haris Kamran, Andrew Coles, Oya Celiktutan, Gerard Canal | HRI '24, March 11–14, 2024, Boulder, CO, USA
This study investigates when and in what interaction contexts users most want explanations from robots. The researchers showed 186 participants 16 videos depicting seven distinct situation types, ranging from successful human-robot interactions to robot errors and inabilities, and then asked them to rate how the robot should communicate after the interaction.

The findings identify specific scenarios in which users strongly desire explanations: when the robot encounters unforeseen circumstances, is unable to fulfill a task, makes an error, violates a social norm, acts uncertainly, or operates suboptimally. Why-explanations, which give reasons for the robot's actions, were consistently among the two highest-rated response options across these scenarios, except when the robot acted normally or successfully. Apologies and asking for help were also highly valued in certain situations, such as when the robot violates a social norm or is uncertain.

The study found no significant correlations between response preferences and robotics experience, attitudes towards robots, age, or gender, although some trends emerged, such as slightly higher ratings for apologies among participants with more robotics experience. Qualitative insights from think-aloud sessions provided additional context on why participants preferred certain responses.

The results have implications for designing more user-centered and transparent interactions, as well as for developing more targeted explanations in explainable AI (XAI). They suggest that robots should provide why-explanations when they encounter unexpected events or cannot complete a task, while other response types, such as apologies and asking for help, are more appropriate in specific contexts.
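As a concrete illustration, the situation-to-response preferences summarized above could be encoded as a simple lookup policy in a robot's communication module. The Python sketch below is hypothetical: the names Situation, RESPONSE_POLICY, and choose_response are invented for this example, and the response orderings are an illustrative reading of the findings, not ratings published in the paper.

from enum import Enum, auto

class Situation(Enum):
    """The seven situation types shown in the study's videos."""
    NORMAL = auto()          # robot acts normally and succeeds
    UNFORESEEN = auto()      # robot encounters unforeseen circumstances
    INABILITY = auto()       # robot is unable to fulfill the task
    ERROR = auto()           # robot makes an error
    NORM_VIOLATION = auto()  # robot violates a social norm
    UNCERTAINTY = auto()     # robot acts uncertainly
    SUBOPTIMAL = auto()      # robot operates suboptimally

# Candidate responses per situation, ordered by assumed preference.
# "why" stands for a why-explanation. These orderings are an illustrative
# reading of the summarized findings, not data taken from the paper.
RESPONSE_POLICY = {
    Situation.NORMAL:         ["continue_silently"],
    Situation.UNFORESEEN:     ["why", "ask_for_help"],
    Situation.INABILITY:      ["why", "ask_for_help"],
    Situation.ERROR:          ["why", "apology"],
    Situation.NORM_VIOLATION: ["apology", "why"],
    Situation.UNCERTAINTY:    ["ask_for_help", "why"],
    Situation.SUBOPTIMAL:     ["why"],
}

def choose_response(situation: Situation) -> str:
    """Return the top-preferred communication type for a given situation."""
    return RESPONSE_POLICY[situation][0]

print(choose_response(Situation.NORM_VIOLATION))  # -> apology

A table-driven policy like this keeps the communication rules easy to audit and update as new preference data becomes available, which fits the paper's emphasis on targeting explanations to the situations where users actually want them.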