Secure training, while protecting the confidentiality of both data and model weights, typically incurs significant training overhead. Traditional Fully Homomorphic Encryption (FHE)-based non-interactive training models are heavily burdened by computationally demanding bootstrapping. To develop an efficient secure training system, we established a foundational framework, CryptoTrain-B, utilizing a hybrid cryptographic protocol that merges FHE with Oblivious Transfer (OT) for handling linear and non-linear operations, respectively. This integration eliminates the need for costly bootstrapping. Although CryptoTrain-B sets a new baseline in performance, reducing its training overhead remains essential. We found that ciphertext-ciphertext multiplication (CCMul) is a critical bottleneck in operations involving encrypted inputs and models. Our solution, the CCMul-Precompute technique, performs CCMul offline and resorts to the less resource-intensive ciphertext-plaintext multiplication (CPMul) during private training. Furthermore, conventional polynomial convolution in FHE systems tends to encode irrelevant and redundant values into polynomial slots, necessitating additional polynomials and ciphertexts for input representation and leading to extra multiplications. To address this, we introduce correlated polynomial convolution, which encodes only related input values into polynomials, drastically reducing the number of computations and the associated overhead. By integrating CCMul-Precompute and correlated polynomial convolution into CryptoTrain-B, we obtain a rapid and efficient secure training framework, CryptoTrain. Extensive experiments demonstrate that CryptoTrain achieves a ~5.3× reduction in training time compared to prior methods.
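The offline/online split behind CCMul-Precompute can be illustrated with a Beaver-triple-style multiplication: an expensive product of two hidden values is computed ahead of time, so the online phase needs only cheap multiplications by opened (plaintext-like) masks. This is a minimal sketch of the general precompute idea, not the paper's actual FHE protocol; the modulus `P` and the triple construction here are illustrative assumptions.

```python
import random

P = 2**61 - 1  # illustrative modulus for the toy arithmetic

def offline_phase():
    # Offline: precompute a random triple (a, b, c) with c = a*b mod P.
    # This stands in for the expensive ciphertext-ciphertext product
    # (CCMul) that CryptoTrain moves out of the training loop.
    a = random.randrange(P)
    b = random.randrange(P)
    c = (a * b) % P
    return a, b, c

def online_multiply(x, y, triple):
    # Online: only masked differences are opened, and the remaining
    # work is cheap multiplication by known values (the CPMul analogue).
    a, b, c = triple
    d = (x - a) % P
    e = (y - b) % P
    # Identity: x*y = c + d*b + e*a + d*e  (mod P)
    return (c + d * b + e * a + d * e) % P

triple = offline_phase()
x, y = 123456789, 987654321
assert online_multiply(x, y, triple) == (x * y) % P
```

The design point is that the random triple is independent of the training data, so it can be generated in bulk before training starts, leaving only inexpensive operations on the critical path.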
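For context on why slot packing matters, recall the mechanism polynomial convolution in FHE exploits: multiplying two polynomials convolves their coefficient vectors, so a convolution can be evaluated with a single polynomial product if the inputs are packed well. The toy below shows only this underlying identity in the clear; the specific packing used by correlated polynomial convolution is described in the paper, not here.

```python
def poly_mul(f, g):
    # Schoolbook polynomial multiplication over coefficient lists.
    # Coefficient k of the product is sum_{i+j=k} f[i]*g[j], i.e.
    # exactly the (full) discrete convolution of f and g.
    out = [0] * (len(f) + len(g) - 1)
    for i, fi in enumerate(f):
        for j, gj in enumerate(g):
            out[i + j] += fi * gj
    return out

signal = [1, 2, 3, 4]   # input values packed as coefficients
kernel = [5, 6, 7]      # filter weights packed as coefficients
assert poly_mul(signal, kernel) == [5, 16, 34, 52, 45, 28]
```

Because every coefficient slot participates in the product, packing unrelated or redundant values into slots forces extra polynomials (and thus extra ciphertexts and multiplications) to represent the same convolution, which is the overhead correlated polynomial convolution avoids.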