International Association for Cryptologic Research

IACR News item: 10 May 2021

Hanshen Xiao, Srinivas Devadas
ePrint Report
We tackle the problems of private learning, where an owner wishes to outsource a training task to an honest-but-curious server while keeping its data private, and private collaborative learning, where two (or more) mutually distrusting owners outsource their respective training data sets to an honest-but-curious server while keeping the data sets private from the server and from each other.

The privacy property we provide is information-theoretic in nature: Probably Approximately Correct (PAC) approximation resistance (abbreviated to PAC security). Each owner transforms its data and labels using a private transform. The server combines samples from each data set into expanded samples with corresponding expanded labels -- we refer to this step as Task Augmentation. Any owner can then use the server for inference by sending it transformed samples. Unlike most prior approaches, our transformed-data approach maintains privacy for each entity even in the case where the server colludes with all other entities. Importantly, we show that the utility of collaborative learning typically exceeds the utility achievable by any entity restricted to its own data set.
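To make the Task Augmentation step concrete, here is a minimal sketch of the collaborative pipeline. The specific choices (a secret random linear map as the private transform, feature concatenation with stacked labels as the combination rule, and all function and variable names) are illustrative assumptions, not the construction analyzed in the report.

```python
import numpy as np

def private_transform(X, y, key_rng):
    """Owner-side step (illustrative): apply a secret random linear map to the
    features and a secret relabeling to the labels. The key material (T,
    relabel) never leaves the owner."""
    d = X.shape[1]
    T = key_rng.standard_normal((d, d))        # secret mixing matrix
    X_t = X @ T                                # transformed samples
    classes = np.unique(y)
    relabel = dict(zip(classes, key_rng.permutation(classes)))
    y_t = np.array([relabel[c] for c in y])    # transformed labels
    return X_t, y_t, (T, relabel)

def task_augment(transformed_parties):
    """Server-side step (illustrative): combine one transformed sample from
    each party into an expanded sample with a corresponding expanded label."""
    n = min(len(X) for X, _ in transformed_parties)
    X_exp = np.hstack([X[:n] for X, _ in transformed_parties])
    y_exp = np.stack([y[:n] for _, y in transformed_parties], axis=1)
    return X_exp, y_exp

# Two mutually distrusting owners, each with its own data and private key.
rng_a, rng_b = np.random.default_rng(1), np.random.default_rng(2)
Xa, ya = np.random.default_rng(3).normal(size=(100, 8)), np.repeat([0, 1], 50)
Xb, yb = np.random.default_rng(4).normal(size=(120, 8)), np.repeat([0, 1], 60)
Xa_t, ya_t, key_a = private_transform(Xa, ya, rng_a)
Xb_t, yb_t, key_b = private_transform(Xb, yb, rng_b)
X_exp, y_exp = task_augment([(Xa_t, ya_t), (Xb_t, yb_t)])  # server trains on this
```

In this sketch the server only ever sees transformed samples and combined labels; each owner keeps its transform key locally and reuses it when sending transformed queries for inference.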

Another important application is that the Task Augmentation approach can also be used in the single-owner case by adding labeled, learnable noise to amplify privacy. This can be used straightforwardly to obtain (Local) Differential Privacy ((L)DP) guarantees. We show that adding labeled noise, as opposed to using a conventional (L)DP additive-noise mechanism, significantly improves the privacy-utility tradeoff in private learning under the same setup.
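A minimal sketch of the single-owner variant follows: rather than perturbing its real samples with additive noise, the owner appends synthetic samples that carry their own label (a "noise" class) before outsourcing. The noise distribution, the fraction of noise samples, and the accounting that turns this into an (L)DP guarantee are assumptions for illustration only, as are the names used here.

```python
import numpy as np

def add_labeled_noise(X, y, noise_frac=1.0, noise_label=-1, seed=0):
    """Owner-side step (illustrative): append learnable noise samples that
    carry their own distinct label, instead of adding noise directly to the
    real samples as a conventional (L)DP additive mechanism would."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    m = int(noise_frac * n)
    # Noise drawn to resemble the data's overall scale; the actual choice
    # (and the resulting privacy parameters) is analyzed in the report.
    X_noise = rng.normal(loc=X.mean(axis=0), scale=X.std(axis=0), size=(m, d))
    y_noise = np.full(m, noise_label)
    X_aug = np.vstack([X, X_noise])
    y_aug = np.concatenate([y, y_noise])
    idx = rng.permutation(len(y_aug))            # shuffle before outsourcing
    return X_aug[idx], y_aug[idx]

# The owner outsources (X_out, y_out); at inference time, predictions of the
# noise label are simply discarded and only the original classes are kept.
X = np.random.default_rng(5).normal(size=(200, 8))
y = np.repeat([0, 1], 100)
X_out, y_out = add_labeled_noise(X, y)
```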
