International Association for Cryptologic Research


IACR News item: 18 August 2012

Benjamin Fuller, Leonid Reyzin
ePrint Report
We investigate how information leakage reduces the computational entropy of a random variable X. Recall that HILL and metric computational entropy are parameterized by quality (how distinguishable X is from a variable Z that has true entropy) and quantity (how much true entropy Z has).

We prove an intuitively natural result: conditioning on an event of probability p reduces the quality of metric entropy by a factor of p and the quantity of metric entropy by log 1/p (note that the reductions in quantity and quality are therefore the same, because the quantity of entropy is measured on a logarithmic scale). Our result improves previous bounds of Dziembowski and Pietrzak (FOCS 2008), where the loss in the quantity of entropy was related to its original quality. The use of metric entropy also simplifies the analogous result of Reingold et al. (FOCS 2008) for HILL entropy.
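The statement above can be written schematically as follows. This is a sketch in standard metric-entropy notation (entropy parameterized by distinguisher size s and advantage ε); the exact parameterization is my paraphrase of the abstract, not a quote from the paper:

```latex
% Conditioning on an event E with \Pr[E] = p:
% the distinguishing advantage degrades from \epsilon to \epsilon/p
% (a factor-p loss in quality), and the entropy quantity drops by \log(1/p).
H^{\mathrm{metric}}_{\epsilon/p,\, s}\!\left(X \mid E\right)
  \;\ge\;
H^{\mathrm{metric}}_{\epsilon,\, s}(X) \;-\; \log\frac{1}{p}
```

Because quantity lives on a log scale, the log(1/p) additive loss in quantity and the factor-p multiplicative loss in quality are the "same" loss viewed on two scales.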

Further, we simplify dealing with information leakage by investigating conditional metric entropy. We show that, conditioned on leakage of λ bits, metric entropy is reduced by a factor of 2^λ in quality and by λ bits in quantity.
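Taking the leakage variable Y to range over {0,1}^λ, this can be sketched in the same notation as before (again my paraphrase of the abstract, with ε the distinguishing advantage and s the distinguisher size):

```latex
% Leakage chain rule sketch: Y ranges over \{0,1\}^{\lambda}.
% Averaging over the 2^{\lambda} possible leakage values costs a factor
% 2^{\lambda} in quality (advantage \epsilon becomes \epsilon \cdot 2^{\lambda})
% and \lambda bits in quantity.
H^{\mathrm{metric}}_{\epsilon \cdot 2^{\lambda},\, s}\!\left(X \mid Y\right)
  \;\ge\;
H^{\mathrm{metric}}_{\epsilon,\, s}(X) \;-\; \lambda
```

Note this is consistent with the single-event bound above: each fixed leakage value y corresponds to conditioning on an event of probability at least roughly 2^{-λ}, i.e. p = 2^{-λ} gives a factor 1/p = 2^λ in quality and log(1/p) = λ bits in quantity.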

