International Association for Cryptologic Research

IACR News item: 11 March 2015

Konrad Durnoga, Tomasz Kazana, Michał Zając, Maciej Zdanowicz
ePrint Report
In this paper we address the problem of large space consumption for protocols in the Bounded Retrieval Model (BRM), which require users to store large secret keys subject to adversarial leakage.

We propose a method to derive keys for such protocols on-the-fly from weakly random private data (such as text documents or photos that users keep on their disks anyway for non-cryptographic purposes), so that no extra storage is needed. We prove that any leakage-resilient protocol (belonging to a certain, arguably quite broad, class), when run with a key obtained this way, retains a level of security similar to that of the original protocol. Additionally, we guarantee privacy of the data the actual keys are derived from. That is, an adversary can hardly gain any knowledge about the private data beyond what he could otherwise obtain via leakage. Our reduction works in the Random Oracle model.
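To make the on-the-fly idea concrete, below is a minimal Python sketch of one natural instantiation (our illustration, not the construction from the paper): key blocks are derived on demand by hashing chunks of files the user already stores, with SHA-256 standing in for the random oracle. The function name, block size, and indexing scheme are assumptions made for exposition.

    import hashlib
    from pathlib import Path

    # Illustrative parameter -- the block size is our assumption,
    # not a value taken from the paper.
    BLOCK_SIZE = 4096

    def derive_key_block(files: list[Path], index: int, salt: bytes) -> bytes:
        """Derive the index-th block of a BRM-style key on demand from the
        user's existing files; SHA-256 stands in for the random oracle."""
        offset = index * BLOCK_SIZE
        for path in files:
            size = path.stat().st_size
            if offset < size:
                with path.open("rb") as fh:
                    fh.seek(offset)
                    chunk = fh.read(BLOCK_SIZE)
                # Key block = H(salt, index, private-data chunk); nothing
                # beyond the files themselves is ever stored on disk.
                h = hashlib.sha256()
                h.update(salt)
                h.update(index.to_bytes(8, "big"))
                h.update(chunk)
                return h.digest()
            offset -= size
        raise IndexError("not enough private data for this key block")

A BRM protocol that probes only a few key blocks per execution would call derive_key_block for exactly those indices, so the large key never needs to materialize on disk.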

As an important tool in the proof we use a newly established bound for min-entropy, which may be of independent interest. It can be viewed as an analogue of the chain rule -- a weaker form of the well-known formula $\mathbf{H}(X \mid Y) = \mathbf{H}(X, Y) - \mathbf{H}(Y)$ for the Shannon entropy of random variables $X$ and $Y$, from which our result originates. For min-entropy, only a much more limited version of this relation is known to hold: the min-entropy of $X$ may decrease by up to the bitlength of $Y$ when $X$ is conditioned on $Y$, in short $\widetilde{\mathbf{H}}_\infty(X \mid Y) \geq \mathbf{H}_\infty(X) - \lvert Y \rvert$.
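To see how loose this generic bound can be, consider the following example (ours, added for exposition): if $Y$ is independent of $X$, conditioning on $Y$ costs no min-entropy at all, yet the bound still charges the full bitlength of $Y$:

\[
Y \text{ independent of } X \;\Longrightarrow\; \widetilde{\mathbf{H}}_\infty(X \mid Y) = \mathbf{H}_\infty(X), \qquad \text{whereas the bound only guarantees } \mathbf{H}_\infty(X) - \lvert Y \rvert .
\]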

In many cases this inequality is far from tight, and such a significant entropy loss makes it inadequate for our particular application. In the quasi chain rule we propose, we inject some carefully crafted side information (spoiling knowledge) to show that, with large probability, the average min-entropy of $X$ conditioned on both $Y$ and this side information is almost lower bounded by the min-entropy of $(X, Y)$ decreased by the min-entropy of $Y$ conditioned on the side information.
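Suppressing quantifiers and error terms, the statement sketched above may be rendered as follows, where $Z$ denotes the injected spoiling knowledge (this informal rendering is ours; see the paper for the precise formulation):

\[
\widetilde{\mathbf{H}}_\infty(X \mid Y, Z) \;\gtrsim\; \mathbf{H}_\infty(X, Y) - \widetilde{\mathbf{H}}_\infty(Y \mid Z) \qquad \text{except with small probability.}
\]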

