Abstract
Assessing and understanding intelligent agents can be a difficult task for users who may
lack an artificial intelligence (AI) background. A relatively new area, called “explainable
AI,” is emerging to help address this problem, but little is known about how to present
and structure information that an explanation system might offer. To inform the
development of explainable AI systems, we analyzed the “supply” of explanations that
experts provided in the real-time strategy domain and conducted a study based on
information foraging theory to determine whether these explanations meet the “demand” of
experienced users. Our results showed some consistency between the explanations experts offer and what system users demand. However, we also found foraging problems that
caused participants to miss important events entirely and to reluctantly ignore other
actions, resulting in high cognitive, navigation, and information costs to access the
information they needed.