This research presents a novel multimodal data fusion methodology for pain behavior recognition, integrating statistical correlation analysis with human-centered insights. Our approach introduces two key innovations: 1) integrating data-driven statistical relevance weights into the fusion strategy to effectively exploit complementary information from heterogeneous modalities, and 2) incorporating human-centric movement characteristics into multimodal representation learning for fine-grained modeling of pain behaviors. Validated across multiple deep learning architectures, our method demonstrates superior performance and broad applicability. We propose a customizable framework that pairs each modality with a suitable classifier based on statistical significance, advancing personalized and effective multimodal fusion. Furthermore, our methodology provides explainable analysis of multimodal data, contributing to interpretable and explainable AI in healthcare. By highlighting the importance of data diversity and modality-specific representations, we enhance traditional fusion techniques and set new standards for recognizing complex pain behaviors. Our findings have significant implications for promoting patient-centered healthcare interventions and supporting explainable clinical decision-making.
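To make the first innovation concrete, the sketch below illustrates one plausible reading of "statistical relevance weights" in late fusion: each modality's weight is derived from the correlation between its features and the labels, and per-modality classifier probabilities are combined by that weight. This is a minimal illustration, not the paper's actual implementation; the functions `relevance_weights` and `fuse` and the correlation-based weighting scheme are hypothetical assumptions.

```python
import numpy as np

def relevance_weights(features_by_modality, labels):
    """Hypothetical relevance weighting: score each modality by the
    mean absolute correlation of its feature columns with the labels,
    then normalize the scores to sum to one."""
    scores = []
    for X in features_by_modality:
        corrs = [abs(np.corrcoef(X[:, j], labels)[0, 1])
                 for j in range(X.shape[1])]
        scores.append(np.nanmean(corrs))
    w = np.asarray(scores)
    return w / w.sum()

def fuse(probas_by_modality, weights):
    """Late fusion: weighted sum of per-modality class probabilities."""
    return sum(w * p for w, p in zip(weights, probas_by_modality))
```

In this reading, a modality whose features track the labels more strongly (e.g. movement data during pain episodes) receives a larger say in the fused prediction, which is one way complementary heterogeneous modalities could be balanced.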