Pain management and severity assessment are crucial for effective treatment, yet traditional self-reporting methods are subjective and may be unsuitable for non-verbal individuals (people with limited ability to communicate verbally). To address this limitation, we explore automated pain detection from facial expressions. Our study leverages deep learning techniques to improve pain assessment by analyzing facial images from the Pain Emotion Faces Database (PEMF). We propose two novel approaches: (1) a hybrid ConvNeXt model combined with Long Short-Term Memory (LSTM) blocks that analyzes video frames to predict the presence of pain, and (2) a Spatio-Temporal Graph Convolutional Network (STGCN) integrated with LSTM that processes facial landmarks for pain detection. To our knowledge, this work is the first to use the PEMF dataset for binary pain classification, and we demonstrate the effectiveness of both models through extensive experimentation. The results highlight the potential of combining spatial and temporal features for enhanced pain detection, offering a promising advancement toward objective pain assessment methodologies.