International Association for Cryptologic Research

IACR News Central

Get updates on changes to the IACR web page here. For questions, contact newsletter (at) iacr.org.


You can also access the full news archive.

Further sources for finding out about changes are CryptoDB, ePrint RSS, ePrint Web, and the event calendar (iCal).

2015-04-24
03:17 [Pub][ePrint] Security Analysis of PRINCE, by Jeremy Jean and Ivica Nikolic and Thomas Peyrin and Lei Wang and Shuang Wu

  In this article, we provide the first third-party security analysis of the PRINCE lightweight block cipher and of the underlying PRINCE_core. First, while no claim was made by the authors regarding related-key attacks, we show that one can attack the full cipher with only a single pair of related keys, and then reuse the same idea to derive an attack in the single-key model on the full PRINCE_core for several instances of the $\alpha$ parameter (yet not the one randomly chosen by the designers). We also show how to exploit the structural linear relations that exist for PRINCE in order to obtain a key-recovery attack that slightly breaks the security claims for the full cipher. We analyze the application of integral attacks to obtain the best known key-recovery attack on a reduced version of the PRINCE cipher. Finally, we provide time-memory-data tradeoffs that require only known plaintext-ciphertext data and that can be applied to full PRINCE.
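
For context on the $\alpha$ parameter mentioned above, the following recalls the $\alpha$-reflection property from the original PRINCE design; this is well-known background stated in our own notation, not a result of this analysis:

$$ D_{(k_0 \| k_0' \| k_1)}(\cdot) \;=\; E_{(k_0' \| k_0 \| k_1 \oplus \alpha)}(\cdot), $$

where $E$ and $D$ denote full PRINCE encryption and decryption, $k_0'$ is a fixed linear function of the whitening key $k_0$, and $\alpha$ is a public constant. For PRINCE_core, which is keyed by $k_1$ alone, this reads $D_{k_1} = E_{k_1 \oplus \alpha}$, which is why the concrete choice of $\alpha$ matters for the single-key attacks discussed above.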



03:17 [Pub][ePrint] Publicly Verifiable Software Watermarking, by Aloni Cohen and Justin Holmgren and Vinod Vaikuntanathan

  Software Watermarking is the process of transforming a program into a functionally equivalent "marked" program in such a way that it is computationally hard to remove the mark without destroying functionality. Barak, Goldreich, Impagliazzo, Rudich, Sahai, Vadhan and Yang (CRYPTO 2001) defined software watermarking and showed that the existence of indistinguishability obfuscation implies that software watermarking is impossible. Given the recent candidate constructions of indistinguishability obfuscation, this result paints a bleak picture for the possibility of meaningful watermarking.

We show that slightly relaxing the functionality requirement gives us strong positive results for watermarking. Namely, instead of requiring the marked program to agree with the original unmarked program on all inputs, we require only that they agree on a large fraction of inputs. With this relaxation in mind, our contributions are as follows.

1. We define publicly verifiable watermarking, where marking a program requires a secret key but anyone can verify that a program is marked. The handful of existing watermarking schemes are secretly verifiable, and moreover satisfy only a weak definition where the adversary is restricted in the type of unmarked programs it is allowed to produce (Naccache, Shamir and Stern, PKC 1999; Nishimaki, EUROCRYPT 2013). In addition, our definition requires security against chosen program attacks, where an adversary has access to an oracle that marks programs of her choice.

2. We construct a publicly verifiable watermarking scheme for any family of puncturable pseudo-random functions (PPRF), assuming indistinguishability obfuscation and injective one-way functions.

We also give an indication of the limits of watermarking by showing that the existence of robust totally unobfuscatable families of functions rules out a general watermarking scheme for cryptographic functionalities such as signatures and MACs.



03:17 [Pub][ePrint] On the Impossibility of Tight Cryptographic Reductions, by Christoph Bader and Tibor Jager and Yong Li and Sven Schäge

  The existence of tight reductions in cryptographic security proofs is an important question, motivated by the theoretical search for cryptosystems whose security guarantees are truly independent of adversarial behavior and the practical necessity of concrete security bounds for the theoretically-sound selection of cryptographic parameters.

At Eurocrypt 2002, Coron described a meta-reduction technique that allows one to prove the impossibility of tight reductions for certain digital signature schemes.

This seminal result has found many further interesting applications.

However, due to a technical subtlety in the argument, the applicability of this technique beyond digital signatures in the single-user setting has turned out to be rather limited.

We describe a new meta-reduction technique for proving such impossibility results, which improves on known ones in several ways.

First, it enables interesting novel applications. These include a formal proof that for certain cryptographic primitives (including public-key encryption/key encapsulation mechanisms and digital signatures), the security loss incurred when the primitive is transferred from an idealized single-user setting to the more realistic multi-user setting is impossible to avoid, as well as a lower tightness bound for non-interactive key exchange protocols. Second, the technique allows one to rule out tight reductions from a very general class of non-interactive complexity assumptions. Third, the provided bounds are quantitatively and qualitatively better, yet simpler, than the bounds derived from Coron's technique and its extensions.
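
As a concrete illustration of the single-user to multi-user loss discussed above (a standard hybrid/guessing argument, stated here as background in our own notation, not as a result of the paper): the generic reduction that guesses which of the $N$ users the adversary attacks gives

$$ \mathrm{Adv}^{N\text{-user}}(\mathcal{A}) \;\le\; N \cdot \mathrm{Adv}^{\text{single-user}}(\mathcal{B}), $$

i.e., a security loss linear in the number of users; the paper's impossibility results show that for the primitives listed above this kind of loss cannot be avoided by any tight reduction.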





2015-04-23
15:17 [Pub][ePrint] Succinct Randomized Encodings and their Applications, by Nir Bitansky and Sanjam Garg and Huijia Lin and Rafael Pass and Sidharth Telang

  A {\em randomized encoding} allows one to express a ``complex'' computation, given by a function $f$ and input $x$, by a ``simple to compute'' randomized representation $\hat{f}(x)$ whose distribution encodes $f(x)$ while revealing nothing else regarding $f$ and $x$. Existing randomized encodings, geared mostly toward encoding with low parallel complexity, have proven instrumental in various strong applications such as multiparty computation and parallel cryptography.
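
Spelling out the informal requirement above (a standard formalization in our own notation, not quoted from the paper): a randomized encoding of $f$ comes with a decoder $\mathrm{Dec}$ and a simulator $\mathrm{Sim}$ such that

$$ \mathrm{Dec}\big(\hat{f}(x;r)\big) = f(x) \ \text{for all } x, r \quad \text{(correctness)}, \qquad \mathrm{Sim}\big(f(x)\big) \approx \hat{f}(x;r) \quad \text{(privacy)}, $$

so the encoding determines $f(x)$, yet its distribution can be reproduced from $f(x)$ alone.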

This work focuses on another natural complexity measure: {\em the time required to encode}. We construct {\em succinct randomized encodings} where the time to encode a computation, given by a program $\Pi$ and input $x$, is essentially independent of $\Pi$'s time complexity, and only depends on its space complexity, as well as the size of its input, output, and description. The scheme guarantees computational privacy of $(\Pi,x)$, and is based on indistinguishability obfuscation for a relatively simple circuit class, for which there exist instantiations based on polynomial hardness assumptions on multi-linear maps.

We then invoke succinct randomized encodings to obtain several strong applications, including:

\begin{itemize}

\item Succinct indistinguishability obfuscation, where the obfuscated program $iO({\Pi})$ computes the same function as $\Pi$ for inputs $x$ of a priori bounded size. Obfuscating $\Pi$ is roughly as fast as encoding the computation of $\Pi$ on any such input $x$. Here we also require subexponentially-secure indistinguishability obfuscation for circuits.

\item Succinct functional encryption, where a functional decryption key corresponding to $\Pi$ allows decrypting $\Pi(x)$ from encryptions of any plaintext $x$ of a priori bounded size. Key derivation is as fast as encoding the corresponding computation.

\item Succinct reusable garbling, a stronger form of randomized encodings where any number of inputs $x$ can be encoded separately from $\Pi$, independently of $\Pi$'s time and space complexity.

\item Publicly-verifiable 2-message delegation, where verifying the result of a long computation given by $\Pi$ and input $x$ is as fast as encoding the corresponding computation. We also show how to transform any 2-message delegation scheme into an essentially non-interactive system where the verifier's message is reusable.

\end{itemize}

Previously, succinct randomized encodings or any of the above applications were only known based on various non-standard knowledge assumptions.

At the heart of our techniques is a generic method of compressing a piecemeal garbled computation, without revealing anything about the secret randomness utilized for garbling.



15:17 [Pub][ePrint] A Group-theory Method to The Cycle Structures of Feedback Shift Registers, by Ming Li, Yupeng Jiang and Dongdai Lin

  In this paper, we consider the cycle structures of feedback shift registers (FSRs). First, the cycle structures of two special classes of FSRs, pure circulating registers (PCRs) and pure summing registers (PSRs), are studied, and it is proved that no other FSR has the same cycle structure as a PCR (or PSR). Then, we regard $n$-stage FSRs as permutations over $2^n$ elements. By group theory, two permutations have the same cycle structure if and only if they are conjugate to each other. Since a conjugate of an FSR may no longer be an FSR, it is interesting to consider the permutations that always map an FSR to an FSR. It is proved that there are exactly two such permutations: the identity mapping and the mapping that sends every state to its dual. Furthermore, we prove that these are exactly the two permutations that map any maximum-length FSR to a maximum-length FSR.
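
As a small illustration of viewing an FSR as a permutation and reading off its cycle structure, here is a minimal Python sketch; the function names and the PCR example are ours, not from the paper, and the nonsingular feedback form $x_1 \oplus g(x_2,\dots,x_n)$ is assumed so that the update is indeed a permutation.

    # Minimal sketch (ours, not the paper's code): an n-stage FSR with nonsingular
    # feedback x1 XOR g(x2,...,xn), viewed as a permutation of {0,1}^n, together
    # with its cycle structure.  The pure circulating register (PCR) is g = 0.
    from itertools import product

    def fsr_step(state, g):
        """One update: (x1,...,xn) -> (x2,...,xn, x1 XOR g(x2,...,xn))."""
        x1, rest = state[0], state[1:]
        return rest + (x1 ^ g(rest),)

    def cycle_structure(n, g):
        """Cycle lengths of the FSR permutation over all 2^n states."""
        seen, lengths = set(), []
        for start in product((0, 1), repeat=n):
            if start in seen:
                continue
            s, length = start, 0
            while s not in seen:
                seen.add(s)
                s = fsr_step(s, g)
                length += 1
            lengths.append(length)
        return sorted(lengths)

    # Cycle structure of the 4-stage PCR; the dual map below (complementing every
    # bit of a state) is one of the two permutations the paper singles out.
    print(cycle_structure(4, lambda rest: 0))
    dual = lambda state: tuple(1 - b for b in state)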



15:17 [Pub][ePrint] On Generalized First Fall Degree Assumptions, by Yun-Ju Huang and Christophe Petit and Naoyuki Shinohara and Tsuyoshi Takagi

  The first fall degree assumption provides a complexity approximation of Gröbner basis algorithms when the degree of regularity of a polynomial system cannot be precisely evaluated. Most importantly, this assumption was recently used by Petit and Quisquater to conjecture that the elliptic curve discrete logarithm problem can be solved in subexponential time for binary fields (binary ECDLP). The validity of the assumption may however depend on the systems in play.

In this paper, we theoretically and experimentally study the first fall degree assumption for a class of polynomial systems including those considered in Petit and Quisquater's analysis. In some cases, we show that the first fall degree assumption seems to hold and we deduce complexity improvements on previous binary ECDLP algorithms. On the other hand, we also show that the assumption is unlikely to hold in other cases where it would have very unexpected consequences.

Our results shed light on a Gröbner basis assumption with major consequences on several cryptanalysis problems, including binary ECDLP.



15:17 [Pub][ePrint] Higher-Order Side Channel Security and Mask Refreshing, by Jean-Sebastien Coron and Emmanuel Prouff and Matthieu Rivain and Thomas Roche

  Masking is a widely used countermeasure to protect block cipher implementations against side-channel attacks. The principle is to split every sensitive intermediate variable occurring in the computation into d + 1 shares, where d is called the masking order and plays the role of a security parameter. A masked implementation is then said to achieve dth-order security if any set of d (or fewer) intermediate variables does not reveal key-dependent information. At CHES 2010, Rivain and Prouff proposed a higher-order masking scheme for AES that works for any order d. This scheme, and its subsequent extensions, are based on an improved version of the shared multiplication processing published by Ishai et al. at CRYPTO 2003. This improvement enables better memory/timing performance, but its security relies on the refreshing of the masks at some points in the algorithm. In this paper, we show that the method proposed at CHES 2010 to perform such mask refreshing introduces a security flaw in the overall masking scheme. Specifically, we show that it is vulnerable to an attack of order d/2 + 1, whereas the scheme is supposed to achieve dth-order security. After exhibiting and analyzing the flaw, we propose a new solution which avoids the use of mask refreshing, and we prove its security. We also provide an implementation trick that makes our proposed solution not only secure but also faster than the original scheme.
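
To make the sharing and the shared multiplication concrete, here is a minimal Python sketch of Boolean masking with d + 1 shares and of the textbook ISW AND gadget that the schemes above build on; this is our illustration of the underlying principle, not the Rivain-Prouff AES scheme nor the flawed mask refreshing analyzed in the paper.

    # Minimal sketch (ours): Boolean masking with d+1 shares and the textbook
    # ISW shared AND (Ishai-Sahai-Wagner, CRYPTO 2003).  Not the paper's scheme.
    import secrets

    def reduce_xor(bits):
        acc = 0
        for b in bits:
            acc ^= b
        return acc

    def share(x, d):
        """Split bit x into d+1 shares whose XOR equals x."""
        shares = [secrets.randbits(1) for _ in range(d)]
        shares.append(x ^ reduce_xor(shares))
        return shares

    def isw_and(a, b):
        """Given (d+1)-sharings a and b, return a (d+1)-sharing of their AND."""
        n = len(a)
        r = [[0] * n for _ in range(n)]
        for i in range(n):
            for j in range(i + 1, n):
                r[i][j] = secrets.randbits(1)
                r[j][i] = (r[i][j] ^ (a[i] & b[j])) ^ (a[j] & b[i])
        c = [a[i] & b[i] for i in range(n)]
        for i in range(n):
            for j in range(n):
                if i != j:
                    c[i] ^= r[i][j]
        return c

    # Usage: second-order masking (d = 2, i.e. 3 shares per variable).
    a, b = share(1, 2), share(1, 2)
    assert reduce_xor(isw_and(a, b)) == 1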



15:17 [Pub][ePrint] Achieving Differential Privacy with New Imperfect Randomness, by Yanqing Yao and Zhoujun Li

  We revisit the question of achieving differential privacy with realistic imperfect randomness. In the design of differentially private mechanisms, it is usually assumed that a uniformly random source is available. However, in many situations this seems unrealistic, and one must deal with various imperfect random sources. Dodis et al. (CRYPTO'12) proposed that differential privacy can be achieved with a Santha-Vazirani (SV) source by adding a stronger property called SV-consistent sampling, and left open the question of whether differential privacy is possible with more realistic (i.e., less structured) sources than the SV source. The Bias-Control Limited (BCL) source, introduced by Dodis (ICALP'01) as a generalization of the SV source and the sequential bit-fixing source, is more realistic. Unfortunately, if we naturally extend SV-consistent sampling to the BCL source, the extension fails to achieve differential privacy. One main reason is that SV-consistent sampling requires "consecutive" strings, while some strings cannot be generated from a "non-trivial" BCL source.

Motivated by this question, we introduce a new appealing property, called compact BCL-consistent sampling, whose degeneration differs from the SV-consistent sampling proposed by Dodis et al. We prove that if a mechanism based on the BCL source satisfies this property, then it is differentially private. Even when the BCL source degenerates into the SV source, our proof is much more intuitive and simpler than that of Dodis et al. Further, we construct explicit mechanisms using a new truncation technique as well as arithmetic coding, and we give concrete results on their differential privacy and accuracy. While the results of [DY14] imply that if differentially private mechanisms for imperfect randomness exist then some parameters must obey certain constraints, we give explicit constructions of such mechanisms whose parameters match the prior constraints.
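
For readers unfamiliar with the baseline imperfect source discussed above, here is a minimal Python sketch of a delta-SV source; this is standard background in our own notation, not the paper's mechanism, and the default bias strategy is purely illustrative.

    # Minimal sketch (background, ours): a delta-Santha-Vazirani source.  Each bit
    # equals 1 with probability in [1/2 - delta, 1/2 + delta], where the bias may
    # be chosen adversarially as a function of all previously emitted bits.
    import random

    def sv_source(n, delta, bias_strategy=None):
        bits = []
        for _ in range(n):
            # Adversarial bias for this position, clipped to [-delta, +delta];
            # the default strategy simply pushes every bit toward 1.
            raw = bias_strategy(bits) if bias_strategy else delta
            p = 0.5 + max(-delta, min(delta, raw))
            bits.append(1 if random.random() < p else 0)
        return bits

    print(sv_source(16, 0.1))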



15:17 [Pub][ePrint] Computationally binding quantum commitments, by Dominique Unruh

  We present a new definition of computationally binding commitment schemes in the quantum setting, which we call "collapse-binding". The definition applies to string commitments, composes in parallel, and works well with rewinding-based proofs. We give simple constructions of collapse-binding commitments in the random oracle model, giving evidence that they can be realized from hash functions like SHA-3. We demonstrate the usefulness of our definition by constructing three-round statistical zero-knowledge quantum arguments of knowledge for all NP languages.
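
As a minimal classical sketch of the kind of hash-based commitment alluded to above (the shape com = H(r || m) is our illustrative assumption of what "realized from hash functions like SHA-3" looks like; the paper's contribution, the collapse-binding analysis against quantum adversaries, is not reproduced here):

    # Minimal sketch (ours): a hash-based string commitment in the random-oracle
    # style mentioned above, instantiated with SHA3-256.  Hiding comes from the
    # fresh randomness r; binding (classically) from collision resistance.
    import hashlib, secrets

    def commit(message: bytes):
        r = secrets.token_bytes(32)                    # commitment randomness
        c = hashlib.sha3_256(r + message).digest()     # com = H(r || m)
        return c, r                                    # publish c, keep (message, r)

    def verify(c: bytes, message: bytes, r: bytes) -> bool:
        return hashlib.sha3_256(r + message).digest() == c

    c, r = commit(b"attack at dawn")
    assert verify(c, b"attack at dawn", r)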



15:17 [Pub][ePrint] Oblivious Transfer from weakly Random Self-Reducible Public-Key Cryptosystem, by Claude Crepeau and Raza Ali Kazmi

  In this work, we define a new notion of weakly Random-Self-Reducible cryptosystems and show how it can be used to implement secure Oblivious Transfer. We also show that two recent (post-quantum) cryptosystems (based on Learning With Errors and Approximate Integer GCD) can be considered weakly Random-Self-Reducible.



15:17 [Pub][ePrint] Optimally Secure Tweakable Blockciphers, by Bart Mennink

  We consider the generic design of a tweakable blockcipher from one or more evaluations of a classical blockcipher, in such a way that all input and output wires are of size n bits. As a first contribution, we show that any tweakable blockcipher with one primitive call and arbitrary linear pre- and postprocessing functions can be distinguished from an ideal one with an attack complexity of about $2^{n/2}$. Next, we introduce the tweakable blockcipher $\tilde{F}[1]$. It consists of one multiplication and one blockcipher call with a tweak-dependent key, and achieves $2^{2n/3}$ security. Finally, we introduce $\tilde{F}[2]$, which makes two blockcipher calls, one of which uses a tweak-dependent key, and achieves optimal $2^n$ security. Both schemes are more efficient than all existing beyond-birthday-bound tweakable blockciphers known to date, as long as one blockcipher key renewal is cheaper than one blockcipher evaluation plus one universal hash evaluation.
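
To make the first result concrete, one natural way to write a one-call construction with linear pre- and postprocessing is

$$ \tilde{E}_k(t, m) \;=\; L_3(k, t) \oplus E_{L_1(k, t)}\big(m \oplus L_2(k, t)\big), $$

with $L_1, L_2, L_3$ linear functions of the key and tweak; this parameterization is our reading of the class described above, not necessarily the paper's exact model, and the result above says that any such construction can be distinguished from an ideal tweakable blockcipher with roughly $2^{n/2}$ queries.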