Intensive Care Units (ICUs) provide close supervision and continuous care to patients with life-threatening conditions. However, continuous patient assessment in the ICU remains limited because of time constraints and the workload on healthcare providers. Existing patient assessments in the ICU, such as pain or mobility assessments, are mostly sporadic and administered manually, introducing the potential for human error. Developing Artificial Intelligence (AI) tools that augment human assessments in the ICU can provide more objective and granular monitoring capabilities. For example, capturing variations in a patient's facial cues related to pain or agitation can help in adjusting pain-related medications or in detecting agitation-inducing conditions such as delirium. Additionally, subtle changes in visual cues during or prior to adverse clinical events could aid continuous patient monitoring when combined with high-resolution physiological signals and Electronic Health Record (EHR) data. In this paper, we examined the association between visual cues and patient conditions including acuity status, acute brain dysfunction, and pain. We leveraged our AU-ICU dataset of 107,064 frames collected in the ICU and annotated with facial action unit (AU) labels by trained annotators. We developed a new "masked loss computation" technique that addresses the data imbalance problem by maximizing data resource utilization. We trained the model on our AU-ICU dataset in conjunction with three external datasets to detect 18 AUs. The Swin Transformer model achieved a mean F1-score of 0.57 and a mean accuracy of 0.89 on the test set. Additionally, we performed AU inference on 634,054 frames to evaluate the association between facial AUs and clinically important patient conditions such as acuity status, acute brain dysfunction, and pain.
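The "masked loss computation" idea can be illustrated with a short sketch. The details below are an assumption, not the paper's implementation: we suppose that when several AU datasets are merged, each frame carries labels for only a subset of the 18 AUs, so the per-AU loss is multiplied by a binary mask and averaged only over labeled entries. The function name `masked_bce_loss` and the use of a sigmoid/binary cross-entropy formulation are illustrative choices.

```python
import numpy as np

def masked_bce_loss(logits, labels, mask):
    """Multi-label binary cross-entropy that counts only labeled AU entries.

    logits: (batch, num_aus) raw model outputs
    labels: (batch, num_aus) 0/1 AU labels; values under mask == 0 are ignored
    mask:   (batch, num_aus) 1 where an AU label exists, 0 where it is missing
    """
    probs = 1.0 / (1.0 + np.exp(-logits))          # sigmoid per AU
    eps = 1e-12                                     # numerical safety for log
    per_element = -(labels * np.log(probs + eps)
                    + (1.0 - labels) * np.log(1.0 - probs + eps))
    masked = per_element * mask                     # zero out unlabeled AUs
    # Average over labeled entries only, so missing labels contribute no loss
    return masked.sum() / max(mask.sum(), 1.0)
```

Masking in this way lets every frame from every dataset contribute to training, rather than discarding frames that lack labels for some AUs, which is one plausible reading of "maximizing data resource utilization."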