Smart home systems are gaining popularity as homeowners seek to enhance their living and working environments while minimizing energy consumption. However, the adoption of artificial intelligence (AI)-enabled decision-making models in smart home systems is hindered by the complexity and black-box nature of these models, which raises concerns about explainability, trust, transparency, accountability, and fairness. The emerging field of explainable artificial intelligence (XAI) addresses these concerns by providing explanations for a model's decisions and actions. While state-of-the-art XAI methods are valuable to AI developers and practitioners, their outputs may not be easily understood by general users, particularly household members. This paper advocates human-centered XAI methods, emphasizing that readily comprehensible explanations are essential to improving user satisfaction and driving the adoption of smart home systems. We review state-of-the-art XAI methods and prior studies on human-centered explanations for general users in the context of smart home applications. Through experiments on two smart home application scenarios, we demonstrate that explanations generated by prominent XAI techniques may not effectively help users understand system behavior and make decisions. We therefore argue that a human-centric approach is necessary for presenting explanations in smart home systems, and we highlight relevant human-computer interaction (HCI) methodologies, including user studies, prototyping, technology probes, and heuristic evaluation, that can be employed to generate and present human-centered explanations to users.