International Association for Cryptologic Research

IACR News item: 30 August 2025

Qingyu Mo, Wenyuan Wu, Jingwei Chen
ePrint Report
Privacy-preserving machine learning (PPML) enables multiple parties to collaboratively train a model or perform model inference without exposing their private data, a capability of growing importance in the Internet of Things. A key challenge in PPML is the efficient evaluation of non-polynomial functions. In this work, we propose NASE, a neat and accurate secure exponentiation protocol for radial basis function (RBF) kernel evaluation. Leveraging a property of the RBF kernel, NASE enjoys a lightweight construction that reduces computation overhead by up to 1.65$\times$ and communication overhead by up to 3.97$\times$ compared to SIRNN, the prior state-of-the-art framework for secure exponentiation published at IEEE S\&P 2021.
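The abstract does not describe NASE's internals, but the target function it evaluates is standard. A minimal plaintext sketch of the RBF kernel, illustrating the property a secure exponentiation protocol can exploit (the exponent is always non-positive, so the output lies in $(0, 1]$); this is only the cleartext reference computation, not the secure protocol itself:

```python
import math

def rbf_kernel(x, y, gamma=1.0):
    """Plaintext RBF kernel: K(x, y) = exp(-gamma * ||x - y||^2).

    Since the squared distance is >= 0, the exponent is always <= 0,
    so K(x, y) lies in (0, 1] -- a bounded range that a specialized
    secure exponentiation protocol can take advantage of.
    """
    sq_dist = sum((xi - yi) ** 2 for xi, yi in zip(x, y))
    return math.exp(-gamma * sq_dist)

# Identical points give K = 1; distant points decay toward 0.
print(rbf_kernel([1.0, 2.0], [1.0, 2.0]))  # 1.0
print(rbf_kernel([0.0], [3.0]))            # exp(-9), a small positive value
```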

Taking NASE as a foundation, we propose a privacy-preserving two-party kernel SVM training protocol. Building on the BFV scheme and MPC techniques, we introduce group-batch sampling for sampling in ciphertext and propose a partial rotation method tailored to our scenario to optimize dot-product computation. Additionally, we propose an error-tolerant $DReLU$ protocol for secure sign evaluation of secret shares over a prime field, which reduces communication cost by around $\frac{1}{3}$ compared to the existing method. Experiments on real-world datasets show that our protocol achieves model accuracy comparable to plaintext training while attaining an order-of-magnitude reduction in both communication and computation overhead compared to previous work.
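To make the $DReLU$ setting concrete: the functionality takes a value additively shared over a prime field (with negatives encoded in the upper half of $\mathbb{Z}_p$) and outputs its sign bit. The sketch below shows only the plaintext semantics of two-party additive sharing and the sign function a secure $DReLU$ protocol must compute obliviously; the modulus and encoding are illustrative assumptions, not the paper's parameters:

```python
import random

P = 2**61 - 1  # illustrative prime modulus, not the paper's choice

def share(x, p=P):
    """Split x (interpreted in [-p//2, p//2)) into two additive shares over Z_p."""
    r = random.randrange(p)
    return r, (x - r) % p

def reconstruct(s0, s1, p=P):
    """Recombine shares and decode the upper half of Z_p as negative values."""
    v = (s0 + s1) % p
    return v - p if v >= p // 2 else v

def drelu(x):
    """Plaintext reference for DReLU: the sign bit, 1 if x >= 0 else 0."""
    return 1 if x >= 0 else 0

# Sharing then reconstructing preserves the sign that a secure DReLU
# protocol must compute without either party learning x.
s0, s1 = share(-42)
assert reconstruct(s0, s1) == -42
assert drelu(reconstruct(s0, s1)) == 0
```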

Additional news items may be found on the IACR news page.