Since SARS-CoV-2 started spreading in Europe in early 2020, there has been a strong call for technical solutions to combat or contain the pandemic, with contact tracing apps at the heart of the debates. The EU's General Data Protection Regulation (GDPR) requires controllers to carry out a data protection impact assessment (DPIA) where their data processing is likely to result in a high risk to the rights and freedoms of natural persons (Art. 35 GDPR). A DPIA is a structured risk analysis that identifies and evaluates, in advance, the possible consequences of data processing for fundamental rights and describes the measures envisaged to address these risks, or states the inability to do so. Based on the Standard Data Protection Model (SDM), we present the results of a scientific and methodologically clear DPIA of the German Corona-Warn-App (CWA). It shows that even a decentralized architecture entails numerous serious weaknesses and risks, including major ones still left unaddressed in current implementations. It also finds that none of the proposed designs operates on anonymous data or ensures proper anonymisation, and that informed consent would not be a valid legal basis for the processing. For all points where data subjects' rights are not yet sufficiently safeguarded, we briefly outline solutions.