Deep learning models for radiology interpretation increasingly rely on multi-institutional data, yet privacy regulations and distribution shift across hospitals limit central data pooling. Federated learning (FL) allows hospitals to collaboratively train models without sharing raw images, but current FL algorithms typically assume a static data distribution. In practice, hospitals experience continual evolution in case mix, annotation protocols, and imaging devices, which leads to catastrophic forgetting when models are updated sequentially. Federated continual learning (FCL) aims to reconcile these challenges, but existing methods either ignore the stringent privacy constraints of healthcare or rely on replay buffers and public surrogate datasets that are difficult to justify in clinical settings. We study FCL for chest radiography classification in a setting where hospitals are clients that receive temporally evolving streams of cases and labels. We introduce DP-FedEPC (Differentially Private Federated Elastic Prototype Consolidation), a method that combines elastic weight consolidation (EWC), prototype-based rehearsal, and client-side differential privacy within a standard FedAvg framework. EWC penalizes updates to parameters deemed important for previous tasks, while a memory of latent prototypes preserves class structure without storing raw images. Differentially private stochastic gradient descent (DP-SGD) at each client adds calibrated Gaussian noise to clipped per-example gradients, providing formal privacy guarantees for individual radiographs.
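The abstract's client-side update combines an EWC penalty with DP-SGD-style clipping and noising. A minimal sketch of one such step is below, using a logistic-regression surrogate for the classifier; the function name `dp_ewc_step` and all hyperparameter values are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def dp_ewc_step(w, X, y, w_old, fisher, lam, clip, sigma, lr):
    """One hypothetical client update sketching DP-SGD with an EWC penalty.

    Per-example gradients of a logistic loss are clipped to L2 norm `clip`,
    summed, and perturbed with Gaussian noise of scale sigma * clip (the
    standard DP-SGD recipe). The EWC term lam/2 * sum(fisher * (w - w_old)^2)
    anchors parameters deemed important for previous tasks.
    """
    n = len(y)
    clipped = []
    for xi, yi in zip(X, y):
        p = 1.0 / (1.0 + np.exp(-xi @ w))            # sigmoid prediction
        g = (p - yi) * xi                            # per-example gradient
        norm = np.linalg.norm(g)
        clipped.append(g * min(1.0, clip / (norm + 1e-12)))  # L2 clipping
    noisy_sum = sum(clipped) + rng.normal(0.0, sigma * clip, size=w.shape)
    g_data = noisy_sum / n                           # noisy average gradient
    g_ewc = lam * fisher * (w - w_old)               # gradient of EWC penalty
    return w - lr * (g_data + g_ewc)
```

In a full method the Fisher diagonal would be estimated after each task and the noise multiplier `sigma` chosen to meet a target (epsilon, delta) budget via a privacy accountant.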